EchoBand — Silent Gesture Translator for the Deaf Community
Inspiration
American Sign Language (ASL) was created to assist those who can't communicate verbally, yet few hearing people know it. More than 430 million people worldwide live with disabling hearing loss. Many rely on sign language as their primary communication method, but most people don't understand it, leading to daily communication barriers.
The idea for EchoBand came from the realization that ASL only works between people who already know ASL. We wanted to bridge that gap so that anyone could communicate common phrases and words with simple, video-game-like combinations: a wearable gesture translator that reads wrist and hand movements and converts them into text or spoken words in real time.
What It Does
The EchoBand uses sensors, a microprocessor, and other components to detect and track the movement of the wearer's hand. An MPU9250 motion sensor translates hand gestures into words.
When the wearer performs a specific movement sequence (like "up–up–up"), the OLED display shows the movement as text. When a recognized sequence is completed, the display shows a preset message such as "HELLO" or "BYE", drawn from a collection of movement sequences and corresponding messages that the user can change and customize.
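The sequence-collecting behavior described above can be sketched in plain C++. This is an illustrative model only, not the actual firmware: the class name `SequenceWindow` and the fixed window of three gestures are assumptions based on the example patterns shown on this page.

```cpp
#include <string>
#include <vector>

// Illustrative sketch: gestures accumulate until a full three-gesture
// sequence is ready, then the window resets for the next sequence.
class SequenceWindow {
public:
    // Returns true when a full sequence has been collected into `out`.
    bool push(const std::string& gesture, std::vector<std::string>& out) {
        buf_.push_back(gesture);
        if (buf_.size() < 3) return false;  // sequence still incomplete
        out = buf_;
        buf_.clear();                        // start the next sequence
        return true;
    }
private:
    std::vector<std::string> buf_;
};
```

On the device, a completed sequence would then be looked up in the preset table and the matching message drawn on the OLED.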
Core Features
- Detects 3D motion patterns (accelerometer + gyroscope)
- Translates gesture sequences into meaningful words
- Displays translations instantly on an OLED screen
- Video-game-like movement combinations for easy learning
- Customizable gesture sequences and messages
- (Future) Speaks phrases via text-to-speech module
- Built with rented parts
Hardware
The EchoBand was designed to be low-cost and easy to manufacture. This prototype is made from relatively few main components:
- ESP32 Processor — Main microcontroller
- MPU9250 — 9-axis IMU sensor (Accel + Gyro + Mag) for motion tracking
- OLED Display (128x64 pixels) — For displaying text and messages
- LiPo battery (3.3 V) — For portable power (not used in the demonstration, but compatible)
- Breadboard — For prototyping connections
- Jumper wires — For component connections
Software
Once the board was assembled, we used the Arduino IDE to write the code and instructions for the ESP32, then uploaded them to the system. Once the upload was complete, the user could begin moving the band while the system tracked their movements and displayed the corresponding messages.
- Language: C++ (Arduino Framework)
- Libraries:
- MPU9250_asukiaaa
- Adafruit_SSD1306
- Algorithms for:
- Motion filtering and calibration
- Gesture detection & debouncing
- Sequence matching for word recognition
- User customization of gesture patterns
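The gesture detection and debouncing step can be sketched as follows. This is a minimal standalone model, not the firmware itself: the 1.5 g threshold and 300 ms cooldown are placeholder assumptions, not the project's tuned values.

```cpp
#include <cstdint>

// Illustrative debounce sketch: a new gesture is accepted only if the
// filtered motion magnitude crosses a threshold AND a cooldown period
// has elapsed since the last accepted gesture, so one hand movement
// can't register as several gestures.
class GestureDebouncer {
public:
    // magnitude: filtered motion magnitude (g); nowMs: time in ms.
    bool accept(float magnitude, uint32_t nowMs) {
        if (magnitude < kThreshold) return false;       // too weak
        if (nowMs - lastMs_ < kCooldownMs) return false; // still settling
        lastMs_ = nowMs;
        return true;
    }
private:
    static constexpr float kThreshold = 1.5f;     // assumed value
    static constexpr uint32_t kCooldownMs = 300;  // assumed value
    uint32_t lastMs_ = 0;
};
```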
Algorithm Overview
- Read raw accelerometer and gyroscope data
- Apply offset calibration and normalization
- Detect basic gestures: UP, DOWN, LEFT, RIGHT
- Combine 3-gesture sequences (e.g., UP → DOWN → LEFT)
- Match patterns to a stored word dictionary
- Display or speak the recognized word
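Steps 1 through 3 above can be sketched in plain C++. The offset values and the dominant-axis decision rule below are assumptions for illustration; the actual firmware's calibration constants and axis conventions may differ.

```cpp
#include <cmath>
#include <string>

// Raw accelerometer sample in g.
struct Accel { float x, y, z; };

// Offsets measured while the band is held still (assumed values;
// gravity rests on the z axis in this sketch).
static const Accel kOffset = {0.02f, -0.01f, 1.00f};

// Classify a sample as UP, DOWN, LEFT, or RIGHT.
std::string classifyGesture(Accel a) {
    // Step 2: subtract the resting offsets (calibration).
    float x = a.x - kOffset.x;
    float y = a.y - kOffset.y;
    // Step 3: the dominant remaining axis picks the gesture label.
    if (std::fabs(x) >= std::fabs(y))
        return x >= 0 ? "RIGHT" : "LEFT";
    return y >= 0 ? "UP" : "DOWN";
}
```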
Example Patterns
| Gesture Sequence | Translation |
|---|---|
| UP–UP–UP | HELLO |
| DOWN–DOWN–DOWN | BYE |
| LEFT–LEFT–LEFT | THANK YOU |
| RIGHT–RIGHT–RIGHT | SORRY |
| UP–DOWN–LEFT | HOWDY |
| DOWN–DOWN–RIGHT | PERFECT |
| LEFT–DOWN–RIGHT | YES |
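The table above amounts to a dictionary lookup, and user customization is a write into that dictionary. A minimal C++ sketch (the `PresetTable` API is illustrative, not the firmware's):

```cpp
#include <map>
#include <string>
#include <vector>

using Sequence = std::vector<std::string>;

// Sketch of a user-customizable preset table covering the example
// patterns on this page.
class PresetTable {
public:
    PresetTable() {
        presets_ = {
            {{"UP", "UP", "UP"},           "HELLO"},
            {{"DOWN", "DOWN", "DOWN"},     "BYE"},
            {{"LEFT", "LEFT", "LEFT"},     "THANK YOU"},
            {{"RIGHT", "RIGHT", "RIGHT"},  "SORRY"},
            {{"UP", "DOWN", "LEFT"},       "HOWDY"},
            {{"DOWN", "DOWN", "RIGHT"},    "PERFECT"},
            {{"LEFT", "DOWN", "RIGHT"},    "YES"},
        };
    }
    // Users can remap any sequence to their own phrase.
    void addPreset(const Sequence& s, const std::string& word) {
        presets_[s] = word;
    }
    // Empty string means the sequence is not in the dictionary.
    std::string translate(const Sequence& s) const {
        auto it = presets_.find(s);
        return it != presets_.end() ? it->second : "";
    }
private:
    std::map<Sequence, std::string> presets_;
};
```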
Challenges We Ran Into
One of the main challenges we ran into was mapping the band's sensor readings to the correct text output. Specific challenges included:
- Calibrating IMU data to reduce jitter and noise
- Accurately detecting and tracking the user's complex, rapid, and jarring hand motion with such a small, compact system
- Designing a reliable debounce system for gesture detection
- Ensuring gesture recognition consistency across users
- Recalibrating the band several times and constantly fine-tuning the processor's instructions in Arduino
- Optimizing for limited ESP32 memory and processing power
- Creating intuitive gesture patterns that are easy to remember
Accomplishments We're Proud Of
We are proud of how far we were able to progress after starting off with no knowledge. Being so new to this, we had no idea where to begin; we didn't even know how breadboarding worked. Our major accomplishments include:
- Successfully connected an MPU to an ESP32 and learned breadboarding from scratch
- Used code to communicate between the two components
- Displayed the outcome on an OLED display
- Built a fully functional gesture-to-text translator
- Created a responsive OLED display interface
- Implemented accurate motion recognition in real time
- Developed an affordable, wearable prototype with real accessibility impact
What We Learned
- The importance of data calibration in motion sensing
- How to work with breadboards and hardware components from the ground up
- Balancing performance and memory on microcontrollers
- The iterative process of calibration and fine-tuning for motion tracking systems