The Blind Technologist Behind AlEye and a Community-First Approach

Real-Time Access, Grounded in Lived Experience and Purpose

Hello Everyone,

This week, we're diving into how AlEye was co-developed alongside the Blind, Low Vision, Deaf-blind, and Neurodiverse communities—and we’re excited to introduce someone who’s been central to that mission: our CTO and NFB leader, Dr. Bryan Duarte.

👨‍💻 Meet Our CTO: Bryan Duarte, Ph.D.

At the heart of HapWare is someone who understands communication barriers firsthand: our co-founder and Chief Technology Officer, Dr. Bryan Duarte.

After losing his sight in a motorcycle accident at 18, Bryan dedicated his life to building technology that restores access to the world around us. With a Ph.D. in Computer Science and deep expertise in haptics, sensory substitution, and assistive technology, Bryan is leading the charge to make AlEye not just smart, but intuitive and human-centered.

He’s not just a technologist—he’s a leader in the community. As President of the NFB Loudoun County Chapter, Bryan ensures that AlEye is built with direct input from the Blind and Low Vision (BLV) community. His personal journey continues to shape how we design, test, and deliver meaningful tech.

🤝 Built With the Community, For the Community

Before the first haptic pattern was mapped or the first prototype assembled, we made a commitment: The people we serve will shape the product we build.

That commitment started with Bryan, who has lived these communication challenges firsthand. As a blind technologist and advocate, Bryan understood from the beginning that no solution should be built in isolation.

That’s why AlEye’s journey began by working directly with the NFB Colorado and Virginia Chapters, as well as the Colorado Center for the Blind. These early partners weren’t just testers—they were co-designers.

With Bryan leading many of these sessions, we showed up not to pitch, but to listen:

  • What’s missing from other wearables?

  • What moments in daily life feel inaccessible?

  • How should AlEye make you feel when you use it?

This direct, personal engagement—shaped by Bryan’s advocacy and engineering perspective—guided our earliest decisions and continues to influence every update we make.

⚡ Real-Time Communication: A Core Feature

From day one, users told us:

“I need to know the moment someone goes for a handshake—not five seconds later.”

That’s why real-time feedback is a cornerstone of AlEye.

When a nonverbal cue occurs—like a smile, a nod, or someone turning away—AlEye communicates it in just 0.2 seconds. That’s faster than any smart glasses or wearable device we’ve evaluated to date.

Real-time also means real autonomy. In our user testing, people were able to:

  • Hold conversations while identifying nonverbal cues.

  • Interpret haptic patterns without distraction.

  • Avoid the sensory overload that comes with audio-based feedback.

🧠 Ease of Use, Proven by Data

  • 46% of users recognized haptic cues on their first try—with no training.

  • After just 2 minutes, neurodiverse users doubled their ability to recognize nonverbal cues.

  • The experience was described as “restoring a once long-lost depth of communication.”

This is what accessibility should feel like: natural, fast, empowering.

Want to see AlEye in action? Join us June 26th at 3:00 PM MT for a live webinar where we’ll demo the technology, share real user stories, and answer your questions.