1. The Inspiration: Breaking the Wall of Silence

We built a wearable assistive system designed to improve both navigation and communication for deafblind individuals. Our goal was to create a low-cost, portable solution that enables users to better understand their surroundings and interact with others without relying on vision or hearing.

The system uses a Raspberry Pi connected to a camera, microphone, speaker, vibration motors, and a single input button. Using OpenCV, the system continuously analyzes the camera feed in real time to detect obstacles and to distinguish objects from people. This information is translated into directional haptic feedback: a vibration motor placed just above each shoulder signals obstacles on that side, with the duration of each pulse encoding the obstacle type. As an obstacle gets closer, the pulse frequency rises rapidly.
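The encoding above can be sketched as a small pure function. The specific durations and the sensing range here are illustrative values of our own, not the tuned thresholds from the device:

```python
# Sketch of the haptic encoding described above (hypothetical values):
# pulse duration encodes the obstacle type, and the pause between pulses
# shrinks as the obstacle gets closer, raising the pulse frequency.

OBJECT, PERSON = "object", "person"

# Hypothetical pulse lengths in seconds: short for objects, long for people.
PULSE_DURATION = {OBJECT: 0.1, PERSON: 0.4}

def haptic_pattern(obstacle_type, distance_m, max_range_m=3.0):
    """Return (pulse_duration_s, pause_s) for one vibration cycle."""
    duration = PULSE_DURATION[obstacle_type]
    # Clamp distance to the sensing range, then scale the pause linearly:
    # a far obstacle pulses slowly, a near one pulses rapidly.
    closeness = max(0.0, min(distance_m, max_range_m)) / max_range_m
    pause = 0.05 + 0.95 * closeness  # 50 ms at point-blank range
    return duration, pause
```

The returned pair would drive a GPIO pin in a loop: vibrate for `duration`, rest for `pause`, repeat while the obstacle remains in view.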

For communication, we implemented a bidirectional Morse-based interface that works entirely through touch and audio. Incoming speech is captured by a USB microphone, transcribed to text with the SpeechRecognition library, translated into Morse code, and delivered to the user as structured vibration patterns above one shoulder. To respond, the user taps the single button, entering Morse code as short and long presses; the system decodes the input into text and speaks it aloud through text-to-speech, enabling real-time conversation.
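The Morse layer in both directions boils down to a lookup table and a timing threshold. A minimal sketch, with the table truncated to a few letters and a dot/dash threshold of our own choosing:

```python
# Truncated Morse table; the real system covers the full alphabet and digits.
MORSE = {"s": "...", "o": "---", "e": ".", "h": "....", "i": ".."}
REVERSE = {code: char for char, code in MORSE.items()}

def text_to_morse(text):
    """Encode text as Morse: letters separated by spaces, words by ' / '."""
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE)
        for word in text.lower().split()
    )

def presses_to_letter(press_durations, dot_threshold=0.3):
    """Decode one letter from a list of button-press durations in seconds.
    Presses shorter than dot_threshold count as dots, longer as dashes."""
    code = "".join("." if p < dot_threshold else "-" for p in press_durations)
    return REVERSE.get(code, "?")
```

On the outgoing side, each dot or dash maps to a short or long vibration pulse; on the incoming side, a pause longer than the inter-letter gap flushes the accumulated presses through `presses_to_letter`.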

The software is built in Python using a modular architecture, separating computer vision, motor control, speech processing, and input handling into distinct components. The system operates in two primary states—navigation mode and conversation mode—allowing seamless switching between environmental awareness and communication. We focused heavily on real-time performance, input reliability, and usability, tuning thresholds for speech detection, Morse timing, and object sensitivity to create a smooth user experience.
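The two-state design can be illustrated with a small controller. The class and the long-press threshold here are our own naming for illustration, not the project's actual modules; we assume a long button hold toggles modes while shorter presses are routed to whichever mode is active:

```python
# Illustrative sketch of the navigation/conversation mode switch.
from enum import Enum, auto

class Mode(Enum):
    NAVIGATION = auto()
    CONVERSATION = auto()

class ModeController:
    MODE_TOGGLE_HOLD_S = 2.0  # hypothetical long-press threshold (seconds)

    def __init__(self):
        self.mode = Mode.NAVIGATION  # start in environmental-awareness mode

    def on_button(self, hold_s):
        """Handle a button press of hold_s seconds; return the active mode.
        A long hold toggles modes; shorter presses are left to the active
        mode (e.g. Morse input while in conversation mode)."""
        if hold_s >= self.MODE_TOGGLE_HOLD_S:
            self.mode = (Mode.CONVERSATION
                         if self.mode is Mode.NAVIGATION
                         else Mode.NAVIGATION)
        return self.mode
```

Keeping mode selection in one place like this lets the vision, motor, and speech components stay independent: each simply checks the current mode before acting.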

Overall, this project combines hardware and software into a unified assistive device that addresses two major challenges faced by the deafblind community: safe navigation and independent communication. Our approach emphasizes accessibility, simplicity, and scalability, with potential future improvements including more advanced detection models, offline speech processing, and a more refined wearable form factor.
