Building AlEye Together: Our Conversation on Blind Table Talk

AlEye on Blind Table Talk and why inclusive innovation means building with the community, not just for it

We recently had the incredible honor of being featured on Blind Table Talk, a podcast that amplifies voices from the blind and low vision community while diving into the innovations shaping the future of accessibility. Hosted by the thoughtful and sharp Christina Ford and Sayyida Hillard, the show gave us the chance to share the story behind AlEye and to explain why we believe inclusive technology must be co-created with the communities it’s meant to serve.

From the moment we launched HapWare, our mission has been clear: to bridge the communication gap by making nonverbal cues (facial expressions, emotions, gestures, and body language) accessible through wearable haptic feedback.

On the podcast, we dug into how this works and, more importantly, why it matters. One of the questions that came up was, “How do you make sure this evolves with the blind community?”

Our answer was simple: we’re building AlEye alongside our users, not for them in a vacuum. Every iteration of AlEye has been shaped by feedback from real people—beta testers, advocates, educators, and accessibility experts—who help us make it more useful, intuitive, and empowering with every step.

One of the first use cases we designed AlEye around was job interviews. Interviews are already high-stakes, and our co-founder, Bryan Duarte, recalled attending a career fair after graduating from college and wishing he knew when someone was walking toward him, nodding in agreement, smiling, or disengaging. Those subtle cues can dramatically change how confident someone feels in the moment, and AlEye was born in part from wanting to bridge that gap.

We’ve since seen that original vision grow through a wide range of real-world experiences. In one pilot, a user shared how AlEye made a major difference during a Zoom meeting, helping them stay connected to the emotional energy of a virtual room where nonverbal signals are often missed. Working with high school students, we witnessed firsthand how empowering it can be to have access to emotional context during everyday classroom interactions with peers. We’ve placed a strong focus on supporting the transition-age population, and the results have been encouraging: in testing, students were able to recognize and interpret nonverbal cues with 99% accuracy after just two minutes of training.

This feedback has driven not only which cues we prioritize (like handshakes, head nods, and expressions of confusion), but also how we deliver them: through haptic patterns that are distinguishable, subtle, and customizable. It’s why we’ve introduced Presets, giving users the power to choose which cues they want depending on the setting: a classroom, a date, a staff meeting, or a networking event. For the technically curious, there’s a small sketch of the idea below.
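Here’s a minimal sketch of what a preset could look like under the hood, written in Python. Everything in it (the cue names, the Preset class, the example presets) is a hypothetical illustration of the concept, not AlEye’s actual software.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Cue(Enum):
    """Nonverbal cues that could be translated into haptic patterns."""
    HANDSHAKE = auto()
    HEAD_NOD = auto()
    SMILE = auto()
    CONFUSION = auto()
    APPROACHING = auto()


@dataclass
class Preset:
    """A named bundle of cues a user enables for a given setting."""
    name: str
    enabled_cues: set[Cue] = field(default_factory=set)

    def should_signal(self, cue: Cue) -> bool:
        # Only deliver haptic feedback for cues the user opted into.
        return cue in self.enabled_cues


# Hypothetical presets for two different settings.
classroom = Preset("classroom", {Cue.HEAD_NOD, Cue.CONFUSION})
networking = Preset("networking event",
                    {Cue.HANDSHAKE, Cue.SMILE, Cue.APPROACHING})

active = networking
if active.should_signal(Cue.HANDSHAKE):
    print("fire the handshake haptic pattern")
```

The point of the sketch is the design choice it encodes: the user, not the device, decides which cues are worth signaling in any given room.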

We also talked about the bigger picture: what does it mean to create tech that isn’t just smart, but explainable? What does autonomy look like when someone has full control over which cues they receive, and when? These are the questions that guide our work: not just making things “accessible,” but making people feel seen, respected, and in control.

Being invited onto Blind Table Talk was more than a media moment. It was a chance to reflect on how far we’ve come, how much further we can go, and how powerful it is to be in conversation with people who deeply understand the importance of accessibility and innovation working hand in hand.

🔗 Watch the full episode on YouTube:
https://youtu.be/3XtRyTovTJo?si=lyIpbcaKMq7vdAqu