Behind the Build: How We're Engineering AlEye
Built for clarity, connection, and autonomy—see it live at our June 26th webinar.
At HapWare, We Believe Access to Information Is a Right—Not a Luxury
Communication is more than just words. In fact, studies show that up to 90% of communication is nonverbal, including facial expressions, body language, and subtle gestures.
We built AlEye to be the ultimate tool for communication, clarity, and connection.
Whether it’s a furrowed brow, a smile, a gesture of disinterest, or someone simply walking away, AlEye gives users the autonomy to detect, decode, and respond in real time. To see AlEye in action, attend our virtual webinar on June 26th at 3:00 PM MT by registering here. Spots are filling up fast!
Detect. Decode. Deliver.
Detect
The AlEye smart glasses are equipped with an integrated high-resolution camera (66° field of view, 5 MP, with a 12 MP upgrade planned). The camera scans the environment to detect visual nonverbal cues such as:
Facial expressions (e.g., smiles, frowns, raised eyebrows)
Gestures (e.g., waves, thumbs-up, crossing arms)
Movement and context (e.g., someone leaving the conversation, nodding in agreement)
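For the technically curious, here is a rough idea of what a frame-by-frame detection loop looks like. AlEye's own models are proprietary, so this sketch uses OpenCV's stock Haar cascades purely to illustrate the camera-to-cue flow; the webcam source, thresholds, and the "smile" cue printed here are stand-ins, not our production pipeline.

```python
# Illustrative sketch only: a simple camera -> face -> cue loop using OpenCV's
# bundled Haar cascades. AlEye's actual detection models are proprietary.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")

cap = cv2.VideoCapture(0)  # stand-in for the glasses' 66-degree field-of-view camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        roi = gray[y:y + h, x:x + w]  # look for a smile only inside the detected face
        smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20)
        if len(smiles) > 0:
            print("cue detected: smile")  # downstream, this cue would be decoded and delivered
    cv2.imshow("detect sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```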
Decode
Our proprietary computer vision models analyze hundreds of facial and body landmarks in real time to determine the most relevant nonverbal cues.
Works in everyday environments—indoors, outdoors, low light, crowds, and even on video calls
Accurate across races, genders, and behaviors to reduce bias and perform consistently for all users
Continuously improving through real-world testing and user feedback. AlEye evolves over time, and we will be adding new nonverbal cues every month!
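As an illustration of the decoding step, the toy function below reduces a set of normalized facial landmarks to a coarse cue label. The landmark indices (borrowed from the common MediaPipe Face Mesh convention) and the thresholds are illustrative assumptions only; our production models are learned from data, not hand-tuned rules like these.

```python
# Hypothetical decoding sketch: map facial landmarks to a coarse nonverbal cue.
# Indices and thresholds are assumptions for illustration, not AlEye's actual logic.
from typing import Sequence, Tuple

Landmark = Tuple[float, float]  # (x, y) in normalized image coordinates (y grows downward)

def decode_cue(landmarks: Sequence[Landmark],
               mouth_left: int = 61, mouth_right: int = 291,
               upper_lip: int = 13, lower_lip: int = 14) -> str:
    """Return a coarse cue label ("smile", "frown", "surprise", or "neutral")."""
    lx, ly = landmarks[mouth_left]
    rx, ry = landmarks[mouth_right]
    _, uy = landmarks[upper_lip]
    _, loy = landmarks[lower_lip]
    mouth_open = loy - uy                              # vertical gap between the lips
    corner_lift = ((uy + loy) / 2) - ((ly + ry) / 2)   # corners above lip center -> smile-like
    if corner_lift > 0.01:
        return "smile"
    if corner_lift < -0.01:
        return "frown"
    if mouth_open > 0.05:
        return "surprise"
    return "neutral"
```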
Deliver
Once decoded, AlEye translates those cues into intuitive haptic signals on a custom wristband. With over 142 million unique vibration combinations, users receive the most natural and distinguishable feedback possible.
95% of users learn 7 cues in under 2 minutes
46% of users recognize patterns on first use—no training required
The underlying haptic design is backed by peer-reviewed research conducted by our CTO, Dr. Bryan Duarte
AlEye creates a new language of touch—one that’s fast, accessible, and doesn’t interfere with hearing or sight.
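To give a feel for how decoded cues become touch, here is a hypothetical cue-to-pattern table. The wristband's real actuator layout and protocol are not public, so the Pulse structure, pattern values, and cue names below are assumptions meant only to show how distinct vibration patterns can encode distinct cues.

```python
# Hypothetical delivery sketch: encode decoded cues as short vibration patterns.
# The wristband's actual hardware interface is not public; values are illustrative.
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Pulse:
    intensity: float  # motor strength, 0.0 to 1.0
    duration_ms: int  # how long the pulse lasts
    gap_ms: int       # silence before the next pulse

# Illustrative cue-to-pattern table (not AlEye's real haptic vocabulary).
CUE_PATTERNS = {
    "smile":        [Pulse(0.8, 120, 60), Pulse(0.8, 120, 0)],   # two quick, strong taps
    "frown":        [Pulse(0.4, 400, 0)],                        # one long, soft buzz
    "wave":         [Pulse(0.6, 80, 40)] * 3,                    # three light taps
    "walking_away": [Pulse(1.0, 150, 80), Pulse(0.5, 150, 0)],   # strong-then-soft pair
}

def deliver(cue: str) -> List[Pulse]:
    """Return the vibration pattern for a decoded cue, or a neutral fallback."""
    return CUE_PATTERNS.get(cue, [Pulse(0.3, 100, 0)])
```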
Frequently Asked Questions
How can I become a beta tester?
The best way to become a beta tester is by filling out the AlEye Beta Testing Intent Form. As a beta tester, you’ll receive a direct feedback line to the founders, be recognized as a Founding AlEye, and play a key role in shaping the next generation of communication technology. Spots are limited and filling fast—apply today!
How does AlEye perform in group settings?
We are working to validate AlEye in group settings. Our goal is to build it for dynamic, multi-person environments, detecting and distinguishing between different people in a group so you can stay present and informed.
How long does it take to learn?
Most users are able to understand 7 new cues within just 90 seconds. A built-in Training Mode on the phone allows users to practice each cue at their own pace before switching to real-time use.
Why haptics?
Haptics offer autonomy and accessibility.
Autonomy – Haptics empower users to perceive nonverbal cues passively and in real time. Just as sighted individuals' eyes are constantly collecting and analyzing information, AlEye does the same and works for you. Haptic feedback is unobtrusive and doesn’t interrupt conversations or interactions the way audio cues often do.
Accessibility – Backed by peer-reviewed research, haptic feedback forms an intuitive, low-effort communication channel that can be quickly learned and subconsciously processed, reducing cognitive load and enabling fast, reliable interpretation of information in social, professional, or educational environments. Each nonverbal cue is communicated to you as it happens.
What is the battery life?
Battery life is currently 3.5 hours of continuous use. We are actively working on enhancements to extend runtime for all-day wear.
📢 HapWare AlEye Webinar 🚀 | Seats Are Filling Up Fast!
Don’t miss your chance to see AlEye in action—join our virtual webinar on June 26th at 3:00 PM MT. Register here to secure your spot!