Growing up watching Iron Man, we were always captivated by J.A.R.V.I.S.—not just because he was smart, but because he gave Tony Stark complete agency over his world. That movie magic made us ask a difficult question: What about the people who need that agency the most? For the deafblind community—people who cannot see, hear, or speak—the digital world is often a closed door. We wanted to build a key.

We set out to create a "Tactile Jarvis," an AI assistant that doesn't rely on screens or speakers, but on touch. The idea was to replace missing senses with technology: a camera that acts as eyes, a microphone that acts as ears, and a haptic engine that acts as a voice. We built a system where the AI observes the world—reading signs, identifying faces, or listening for alarms—and translates that chaotic information into precise vibrations and Braille patterns on the user's hand. It allows a user to "feel" a conversation or "touch" a visual scene.
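To make the text-to-touch path concrete, here is a minimal sketch of how recognized text could be rendered as pin patterns on a six-dot Braille cell. The `BRAILLE_DOTS` table follows the standard six-dot alphabet, but `MockHapticCell` and `feel_text` are hypothetical stand-ins for an actual actuator driver, shown only to illustrate the mapping:

```python
# A minimal sketch of the text-to-touch path, assuming a hypothetical
# six-pin refreshable Braille actuator. MockHapticCell and feel_text
# are illustrative stand-ins, not a real driver API.

BRAILLE_DOTS = {  # standard six-dot Braille: letter -> raised dot numbers (1-6)
    "h": (1, 2, 5), "e": (1, 5), "l": (1, 2, 3), "o": (1, 3, 5),
}

def to_pin_pattern(letter: str) -> list[bool]:
    """Convert a letter into six booleans, one per actuator pin."""
    dots = BRAILLE_DOTS.get(letter.lower(), ())
    return [d in dots for d in range(1, 7)]

class MockHapticCell:
    """Placeholder for the real haptic hardware; prints instead of vibrating."""
    def render(self, pins: list[bool]) -> None:
        print("".join("*" if p else "." for p in pins))

def feel_text(text: str, cell: MockHapticCell) -> None:
    """Stream each letter of recognized text to the cell, one pattern at a time."""
    for ch in text:
        if ch.lower() in BRAILLE_DOTS:
            cell.render(to_pin_pattern(ch))

feel_text("hello", MockHapticCell())
```

In the real device the six booleans would drive pin actuators arranged in the usual two-by-three Braille layout; the mock simply flattens them for display.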

It wasn't an easy build. We spent sleepless nights fighting latency and figuring out how to make an AI describe a sunset or a smile using only vibrations. We had to teach the computer that "seeing" isn't just about identifying objects; it's about context. The breakthrough came when we translated a spoken "Hello" into a gentle pulse on the finger, and a hand gesture back into spoken words.
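For a sense of how we chased latency, here is a simplified sketch of the kind of harness such a pipeline needs: timestamp the audio at capture, fire the pulse, and measure the gap end to end. `SpeechEvent`, `recognize_stub`, and `pulse_for` are illustrative placeholders under those assumptions, not production code:

```python
import time
from dataclasses import dataclass
from queue import Queue

@dataclass
class SpeechEvent:
    text: str
    captured_at: float  # monotonic timestamp taken at microphone capture

def recognize_stub() -> SpeechEvent:
    """Stand-in for the real speech recognizer; returns a canned result."""
    return SpeechEvent(text="hello", captured_at=time.monotonic())

def pulse_for(text: str) -> None:
    """Stand-in for the haptic driver: one gentle pulse per recognized word."""
    print(f"[pulse] {text}")

events: Queue[SpeechEvent] = Queue()
events.put(recognize_stub())

while not events.empty():
    ev = events.get()
    pulse_for(ev.text)
    # Measuring capture-to-actuation time is what exposes where the lag hides.
    latency_ms = (time.monotonic() - ev.captured_at) * 1000
    print(f"end-to-end latency: {latency_ms:.1f} ms")
```

Carrying the capture timestamp through the queue, rather than timing each stage in isolation, is what lets you see the full capture-to-finger delay the user actually feels.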

This project isn't just about code or hardware; it's about connection. We realized that while Tony Stark used J.A.R.V.I.S. to fly, our users could use this technology to do something even more powerful: simply connect with the person standing next to them. We are just getting started, but we believe this is the first step toward a future where no one is isolated by their biology.
