As a software engineer in AI, it’s rewarding to work on projects that apply artificial intelligence in ways that clearly and directly help people. Vocable — a free augmentative and alternative communication (AAC) app for iPhone and iPad that helps speech-impaired patients communicate with their caregivers — is that kind of project.
The WillowTree team developed Vocable to help one of our own: former WillowTree designer Matt Kubota’s partner Ana was diagnosed with Guillain-Barré Syndrome. The rare autoimmune disorder attacked her peripheral nervous system, causing paralysis in the arms, legs, and face, leaving Ana unable to speak.
Ana and her caregivers initially looked to existing AAC devices, but the available options were either expensive and bulky or rudimentary and outdated (at first, she could communicate only by painstakingly blinking as caregivers pointed to individual letters on an alphabet poster).
Through head and face tracking combined with integrated conversational AI, we developed the Vocable iOS app to help nonverbal and nonspeaking individuals like Ana communicate with those around them. It’s been a powerful experience watching the app change people’s lives as they interact with the world in new ways.
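To give a sense of the mechanics, here is a minimal sketch of how head tracking can drive an on-screen cursor using ARKit’s face tracking. The type names and the simple direction-to-screen mapping below (`HeadTrackingController`, `moveCursor(toNormalized:)`) are illustrative assumptions, not Vocable’s actual implementation.

```swift
import ARKit
import UIKit

/// A minimal sketch: use ARKit face tracking to estimate where the user's
/// head is pointing and move a selection cursor accordingly. All names and
/// the crude yaw/pitch mapping are illustrative, not Vocable's real code.
final class HeadTrackingController: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // The anchor's transform encodes head pose relative to the camera.
        // Its third column is the head's local Z axis (pointing backward),
        // so its negation approximates the direction the face is pointing.
        let z = face.transform.columns.2
        let forward = SIMD3<Float>(-z.x, -z.y, -z.z)
        // Crude mapping of head direction to normalized screen coordinates.
        let point = CGPoint(x: CGFloat(forward.x * 0.5 + 0.5),
                            y: CGFloat(-forward.y * 0.5 + 0.5))
        moveCursor(toNormalized: point)
    }

    private func moveCursor(toNormalized point: CGPoint) {
        // Hypothetical hook: hop to the main thread, position a cursor view
        // over the phrase grid, and select when the cursor dwells on a button.
    }
}
```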
Now, on the Vision Pro, Vocable has evolved into something even more powerful. Spatial computing, combined with the many accessibility features Apple gives us, brings disabled users’ virtual and physical worlds together into a single layer. At the same time, it gives users new and familiar options for interacting with both of those worlds.
After the first iPhone was launched in 2007, WillowTree was proud to be one of the first developers to release digital products on the App Store. We’re equally proud that Vocable is one of the first apps optimized and available on the Vision Pro App Store.
The Vision Pro has an impressive range of accessibility options built in, starting with eye tracking as the default for navigation.
By default, users click by tapping their index finger and thumb together. But Vocable also leverages Apple’s advanced accessibility features like Dwell Control, which lets paralyzed and immobilized users (or anyone who can’t use their hands) navigate and click by blinking or by dwelling on a portion of the interface. Head and hand tracking add further options, such as the Pointer Control feature, which lets a finger or wrist be used like a laser pointer.
This means on the Vision Pro, speech-impaired individuals and users with other disabilities have greater agency over how they use Vocable and, therefore, greater ease in communicating with their caregivers.
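On the app side, supporting these input modes mostly means giving the system clear, generously sized targets, since Dwell Control and eye tracking are handled at the OS level. As a rough sketch (assuming a hypothetical `PhraseButton` view and `speakAloud` callback, not Vocable’s actual code), a visionOS phrase button might look like this:

```swift
import SwiftUI
import AVFoundation

// A sketch of a soundboard button tuned for gaze, pinch, and Dwell Control:
// a large hit target, a hover highlight for eye-tracking feedback, and
// accessibility metadata for assistive technologies. Illustrative only.
struct PhraseButton: View {
    let phrase: String
    let speak: (String) -> Void

    var body: some View {
        Button {
            speak(phrase)
        } label: {
            Text(phrase)
                .font(.title2)
                .frame(minWidth: 220, minHeight: 88) // generous dwell target
        }
        .buttonStyle(.borderedProminent)
        .hoverEffect(.highlight)    // visible feedback while the eyes rest here
        .accessibilityLabel(phrase)
        .accessibilityHint("Speaks this phrase aloud")
    }
}

// A simple text-to-speech callback the button could be handed.
let synthesizer = AVSpeechSynthesizer()
func speakAloud(_ text: String) {
    synthesizer.speak(AVSpeechUtterance(string: text))
}
```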
The Vision Pro also layers Vocable’s interface — its custom menus of words and phrases — directly into the user’s field of vision. Rather than focusing on a separate screen or device, the user sees Vocable superimposed onto their first-person view.
This makes viewing and selecting responses smoother than on any separate AAC device, where constraints like smaller screens and fewer accessibility options (e.g., head and face tracking alone) create bottlenecks. It also allows communication-vulnerable users to remain visually engaged with the world around them.
Thanks to Vision Pro’s enhanced accessibility features, Vocable now brings speech-impaired individuals and their caregivers closer to the experience of ordinary conversation.
Just as Vocable makes communication flow more smoothly between speech-impaired people and their caregivers, the Vision Pro lets patients flow between their physical and digital environments in a single space.
Vision Pro essentially expands the typical laptop or desktop monitor to near-infinite real estate. That means Vocable’s semi-transparent Soundboard interface can be placed alongside other apps and windows, letting speech-impaired users communicate with others while they browse Safari, listen to podcasts, or stream videos. Wearers simply place Vocable in their virtual space, shifting their core focus to various apps or the physical world on the other side of the Vision Pro glass.
When a caregiver enters the room to ask, “What would you like for dinner?” the user can respond and then return to whatever they were doing before. This gives disabled users the same kind of spatial flexibility that an executive might enjoy while using the Vision Pro as a wearable workspace.
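From an engineering standpoint, much of this placement behavior comes for free from visionOS windowing. A minimal sketch, assuming hypothetical `SoundboardApp` and `SoundboardView` types rather than Vocable’s actual code, might declare the semi-transparent soundboard like this:

```swift
import SwiftUI

// A sketch of a floating, semi-transparent soundboard window on visionOS,
// so it can sit alongside Safari, video, and other apps. The scene and view
// names here are illustrative assumptions, not Vocable's actual code.
@main
struct SoundboardApp: App {
    var body: some Scene {
        WindowGroup(id: "soundboard") {
            SoundboardView()
                .glassBackgroundEffect() // translucent glass keeps the room visible
        }
        .windowStyle(.plain)             // we supply the glass ourselves above
        .defaultSize(width: 900, height: 500)
    }
}

struct SoundboardView: View {
    // A hypothetical set of quick responses the user can glance at and select.
    private let phrases = ["Yes", "No", "Thank you", "I need help"]

    var body: some View {
        LazyVGrid(columns: [GridItem(.adaptive(minimum: 220))], spacing: 16) {
            ForEach(phrases, id: \.self) { phrase in
                Button(phrase) { /* hand off to text-to-speech */ }
                    .buttonStyle(.borderedProminent)
                    .hoverEffect(.highlight)
            }
        }
        .padding(32)
    }
}
```

Because the soundboard is just another SwiftUI scene, the system handles repositioning and resizing it in the user’s space, alongside whatever else they have open.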
Skepticism about the Apple Vision Pro is understandable. It represents a convergence of new technologies unlike anything the world has seen in recent years, and the price tag is certainly a consideration. But from a software engineering perspective, there’s so much to be excited about — and lots of reasons to be optimistic about the Vision Pro’s staying power.
To start, the technology Apple has brought to market is so far ahead of existing AR/VR headsets that the Vision Pro looks likely to reset the space. Not to mention, Apple has been seeding the technologies built into the Vision Pro for years now, a sign of the platform’s deep and wide-ranging potential for future applications.
That said, how the Vision Pro will evolve as Apple better understands the needs of enterprise versus consumer users probably won’t be as linear as, say, the iPhone. Businesses need to be ready with versions of their apps optimized to work across Apple’s evolving ecosystem.
If you’re ready to optimize your existing app for Vision Pro, or develop a new digital experience specifically for spatial computing use cases, WillowTree’s Vision Pro Accelerator can help. Get started by exploring our digital product and delivery services and reach out to learn more about how we can migrate and customize your mobile app to the Vision Pro.