The Many Ways We Speak: From ASL to Advanced Technology and the Evolution of Accessibility

On August 11, 2025, consumers in CPWD’s Beyond Vision program visited the Denver Art Museum for a unique, inclusive experience: an Audio Described and Tactile Tour that brought exhibits to life through touch, narration, and shared storytelling. What made the visit special is that the participants were blind. Given alternative sensory input, touch and narration in place of sight, they were able to explore and perceive the artwork and the facility.

Experiences like this are becoming more common. Technology is evolving; new applications that increase accessibility are being developed and are more available and affordable than ever. Over the past year, the Beyond Vision group also attended an audio-described ballet and an audio-described movie. New opportunities to engage in community, entertainment, and life in general are available as a result of technological innovation. In the past, a blind person could “watch” a movie only by listening to the audio or by having someone there to describe the visual elements. Today, a headset delivering accurate audio description allows groups of blind people to go to the movies together. Beyond accessibility, this fosters friendship and belonging, and reduces isolation and related challenges such as anxiety, depression, and injury.

For decades, we’ve been retrofitting or making accommodations to create accessibility where previously there was none. Think of adding a ramp where once there were only stairs, or hiring an interpreter to sign where previously deaf individuals either had a transcript, read lips, or simply watched a talking head in silence. Today, our collective understanding of accessibility is evolving. We’re shifting from a model of retrofitting access to one that builds it in from the beginning. Most new buildings include accessibility features; smartphones ship with a slew of accessibility options such as voice-over screen reading and magnification. Transportation, communications, websites, sidewalks, and more are all being designed with accessibility features. This is a fantastic shift in design approach, and it has come as a result of years of hard work and advocacy by people with disabilities.

If we look back at the origins of accessibility, we will see that many accessible tools were first developed out of necessity. During the Civil War, we recognized for the first time that soldiers with limb loss and other war-acquired disabilities needed adaptive equipment: crutches, wheelchairs, and ramps, for example. Similarly, one of the oldest and longest-lasting accessible tools is American Sign Language (ASL), developed so that people who are deaf could communicate with each other and with people who are not deaf.

ASL: A Living Language and a Legacy of Innovation

Image: A person makes the sign for “Interpreter” in ASL.

ASL is one of the oldest alternative languages for people who are deaf. It originated as a response to exclusion from spoken systems and began to formally take shape in the early 19th century, when Deaf educator Thomas Hopkins Gallaudet partnered with French Deaf teacher Laurent Clerc to establish the first American school for the Deaf in 1817. Drawing from French Sign Language, regional sign systems, and home signs used across the U.S., ASL emerged as a distinct, living language. It has since grown into a fully developed system with its own grammar, rhythm, visual syntax, and cultural richness.

It’s estimated that 250,000 to 500,000 people use ASL fluently in the United States, with millions more having some degree of familiarity or interest in learning. And thanks to a growing emphasis on inclusive education, digital tools, and Deaf advocacy, that number continues to rise. Its very existence proves that language doesn’t have to be spoken or written to be powerful. In fact, ASL’s visual-spatial structure has inspired a broader movement in accessibility design, reminding us that human expression comes in many forms, and all of them are valid.

Historically, communication was narrowly defined. If you couldn’t speak or read the way others did, you were left out. But that mindset is changing. Today, communication is increasingly seen as multimodal and collaborative: visual, auditory, tactile, written, signed, and beyond. ASL set the example for this shift.

It’s not just a translation of English into hand motions; it’s a completely different way of structuring and expressing ideas, full of nuance, emotion, humor, and poetry. And because of ASL, more people are beginning to understand that there’s no single “correct” way to communicate, only more ways to be understood.

We see this evolution reflected in everything from education to public signage to social media. More schools are offering ASL as a second language. More events are being interpreted. More museums and public institutions are designing programs with accessibility in mind. These changes are happening because of decades of advocacy, creativity, and persistence from Deaf leaders and allies.


From Hands to Hardware: Accessible Technology


As technology continues to evolve, we’re witnessing a collective shift. The core principles of ASL are not only being acknowledged but actively woven into the fabric of how we design for connection. Communication is becoming more visual, embodied, adaptive, and inclusive.

Across the tech landscape, real-time gesture recognition is quickly advancing. New systems are being trained to interpret sign language through motion tracking and machine learning, offering potential breakthroughs for non-verbal individuals and expanding accessibility in digital spaces. At the same time, mainstream platforms like Zoom, YouTube, and Meta are incorporating features like sign language interpreter overlays, customizable captions, and user-controlled visual display options. These tools make participation easier and more intuitive for a wider range of users. Even for individuals without disabilities, symbols such as emojis and slang acronyms now convey not just language but also relationships, feelings, and whole swaths of meaning that would otherwise take sentences to express.
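
For the technically curious, here is a deliberately simplified sketch of how landmark-based gesture recognition can work: a hand-tracking library reports a set of (x, y) landmark points for each video frame, and the frame is classified by comparing it to stored, labeled templates. The function names and the nearest-template approach are illustrative assumptions, not the method of any particular product.

```python
# A toy nearest-template gesture classifier. Real sign-recognition systems
# use far richer models (sequence models over many frames); this only
# illustrates the core idea of matching tracked hand landmarks to labels.
import math

Landmarks = list[tuple[float, float]]  # e.g., 21 (x, y) points per hand

def landmark_distance(a: Landmarks, b: Landmarks) -> float:
    """Sum of point-to-point distances between two landmark sets."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def classify(frame: Landmarks, templates: dict[str, Landmarks]) -> str:
    """Label the frame with its closest stored template."""
    return min(templates, key=lambda label: landmark_distance(frame, templates[label]))
```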

And while spoken and visual forms of accessible communication continue to evolve, there's growing interest in a third way: communication through touch. Known as haptic technology, this field is opening powerful new possibilities for people who are DeafBlind, have low vision, or benefit from sensory-based interaction. Haptic technologies communicate feedback or messages through felt experiences, such as vibrations of varying patterns or intensities, or changes in the feel of an object. A smart cane, for example, has ultrasonic sensors that scan the environment and interpret distance and directional information, sending feedback to the user through vibrations or other touch stimuli. If a blind person using a smart cane gets too close to a curb, the cane might vibrate strongly to signal the risk. Other examples include refreshable braille displays that change in real time as the person reads, haptic vests and other garments that assist with navigation, robotic devices that stabilize a person’s tremors, and much more.
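
To make the smart-cane idea concrete, here is a minimal sketch of the kind of distance-to-vibration loop such a device might run. Every name, threshold, and rate below is a hypothetical assumption for illustration, not any real cane’s firmware.

```python
import time

ALERT_RANGE_CM = 150.0  # assumed: start alerting within 1.5 meters

def read_distance_cm() -> float:
    """Placeholder for reading the ultrasonic distance sensor."""
    ...

def set_vibration(intensity: float) -> None:
    """Placeholder for driving the vibration motor (0.0 = off, 1.0 = max)."""
    ...

def feedback_loop() -> None:
    while True:
        distance = read_distance_cm()
        if distance < ALERT_RANGE_CM:
            # Closer obstacle -> stronger vibration, scaled linearly.
            set_vibration(1.0 - distance / ALERT_RANGE_CM)
        else:
            set_vibration(0.0)
        time.sleep(0.05)  # roughly 20 sensor readings per second
```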

Early haptic devices such as the BuzzClip and Sunu Band helped pave the way. While both are now discontinued, they showed the world what was possible: compact, wearable tools that used ultrasonic sensors and haptic feedback to alert users to nearby obstacles. Their impact lives on in the generation of innovation they inspired.

One of the most promising examples available today is the feelSpace naviBelt, a tactile navigation belt that vibrates around the wearer’s waist to indicate direction. It acts like a “compass for the body,” guiding users intuitively through space. Though it is primarily distributed in Europe, individuals in the U.S. can request a trial unit or purchase one directly. The device connects via Bluetooth to a smartphone app and has been used in both everyday mobility and research settings.
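
The “compass for the body” idea reduces to a simple mapping: given the direction you need to go and the direction you are facing, pick which of the motors spaced around the belt should vibrate. The sketch below assumes 16 evenly spaced motors; the count and function name are illustrative, not feelSpace’s actual design.

```python
def motor_for_heading(target_deg: float, facing_deg: float, n_motors: int = 16) -> int:
    """Index of the belt motor closest to the target direction,
    relative to the direction the wearer is currently facing."""
    relative = (target_deg - facing_deg) % 360.0
    return round(relative / (360.0 / n_motors)) % n_motors

# Example: heading due east (90 degrees) while facing north (0 degrees)
# fires the motor a quarter of the way around the belt.
assert motor_for_heading(90.0, 0.0) == 4
```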

Meanwhile, U.S. researchers are pushing forward with cutting-edge prototypes. At NYU, a vibration-based belt is being tested to help users avoid collisions and move smoothly through environments. 

In another project, SmartBelt, launched in 2022, researchers developed a wearable system that detects where a sound is coming from and vibrates the corresponding section of the belt, offering a new way to localize sound through touch. While SmartBelt translates sound direction into vibrations on a wearable belt, the NYU haptic belt is designed specifically for navigation and obstacle awareness for people with vision loss, combining audio cues and tactile feedback to guide movement. Both are part of a growing ecosystem of wearable haptic devices that push the boundaries of how touch can serve as communication.
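
One common way to estimate where a sound comes from, which a system like SmartBelt could plausibly use, is to compare arrival times at two microphones (time-difference-of-arrival). The sketch below is a textbook illustration with assumed spacing and delay values, not SmartBelt’s published method.

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second, at room temperature

def bearing_from_tdoa(delay_s: float, mic_spacing_m: float) -> float:
    """Angle of a sound source in degrees, relative to straight ahead.
    A positive delay means the sound reached the right microphone first."""
    # Path difference = delay * speed of sound; clamp to asin's valid domain.
    ratio = max(-1.0, min(1.0, delay_s * SPEED_OF_SOUND / mic_spacing_m))
    return math.degrees(math.asin(ratio))

# A sound arriving 0.3 ms earlier at the right mic of a pair 20 cm apart
# works out to roughly 31 degrees to the wearer's right.
print(round(bearing_from_tdoa(0.0003, 0.20)))  # prints 31
```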

Image: A diagram of the Virtual Whiskers system and its main components: sensors (a camera, GNSS receiver, IMU, and microphone), an Nvidia Jetson processing unit, and feedback components (a haptic feedback belt and a bone-conduction headset).

The Virtual Whiskers project goes even further, using modular vibration units to guide users toward open paths or alert them to obstacles, replicating some of the spatial awareness that a white cane or guide dog might provide. The system is worn as a belt, backpack, and headset.

Across the globe, other innovations are taking shape: smart vests with embedded vibration actuators (like those from the SUITCEYES project), gloves that communicate through finger-based vibration, and smart glasses with AI-powered haptic cues at the temples.

Most of these systems are still in development, but they represent a rapidly expanding future where touch becomes a fully recognized channel of communication and connection. 


What unites these innovations is a design philosophy rooted in the same kinetic, spatial, and embodied awareness that has always defined ASL. Here, communication is not only spoken or seen, but also felt. These tools are part of a growing movement to center access through sensation, interaction, and human connection.

Where We’re Headed: From Compliance to Culture

That August outing to the Denver Art Museum demonstrated inclusive access, while also modeling inclusive participation. Participants gave real-time feedback to museum staff, offered insights into what worked and what could be improved, and reflected on how these outings inspired them to get back into the world, with confidence and joy.

It’s a reminder that access is a conversation, not a checklist. When people with disabilities are part of designing the experience, the outcome is richer for everyone. Through this lens, we see how tactile tours, audio description, braille menus, and visual alerts aren’t just “accommodations.” They’re expressions of a broader truth: that everyone deserves to feel seen, heard, and invited in.

What’s most exciting is how far we’ve come, and where we’re continuing to grow:

  • More employers are offering ASL training and accessible communication protocols. In Colorado, the Department of Labor and Employment (CDLE) now offers on‑demand ASL interpreting via an app at workforce centers, demonstrating the state’s commitment to accessible employment services. At the national level, organizations like Walmart are providing paid ASL training at distribution sites, and DLA Aviation, part of the federal Defense Logistics Agency, recently implemented a 9‑week introductory ASL program for staff. Even the NYPD now includes ASL in its recruit training to better serve Deaf community members in critical situations. Meanwhile, grassroots efforts like ‘ASL Lunch & Learn’ at U.S. shipyards offer peer-based learning opportunities, contributing to stronger Deaf–hearing workplace cohesion.

Image: A group of consumers participates in a white cane training at CPWD.

  • More platforms and public services are centering accessibility as part of their design philosophy instead of as an afterthought.

  • More community spaces, like CPWD, are modeling what inclusive access looks like in action, from the Beyond Vision program to tech trainings to peer support groups.

Even research supports this shift. Studies show that early exposure to ASL boosts cognitive development in Deaf and hearing children alike. Brain imaging reveals that sign language activates the same linguistic centers as spoken language. And growing data shows that accessibility is not just good practice, it’s good policy, improving outcomes across education, employment, and wellbeing. While challenges still exist, we’re choosing here to recognize the energy, creativity, and collaboration that’s transforming communication across all sectors.

From the fingertips of early Deaf storytellers to the touchscreen apps of today, ASL has shaped how we think about communication. And it continues to lead the way, both as a language and as a philosophy of equity, community, and shared humanity.


As we move forward, let’s take inspiration from ASL and the communities that use it. Let’s keep creating spaces that celebrate differences, support participation, and design for real inclusion. Because when everyone is included, everyone benefits.
