Inspiration
Millions of people worldwide live with visual or hearing impairments, yet most assistive navigation technology either requires expensive specialized hardware or relies on senses the user may not have. We wanted to build something wearable, affordable, and practical — a device that translates the physical world into signals the user can actually perceive, whether that's sound or touch.
What it does
Echo has two components, each designed for a different level of impairment.
For blind users, a motorized camera rotates to scan the environment, detects obstacles using computer vision, and alerts the user through audio feedback on a companion app — describing what is ahead and how close it is.
For deafblind users, a Viam robotic arm mounted on a wearable harness communicates entirely through touch. The arm taps the user's left shoulder to signal a left turn, taps the right shoulder for a right turn, and opens its claw proportionally as an obstacle approaches — acting as a physical proximity gauge. Once an obstacle is cleared, the system automatically guides the user back to their original direction by reversing the turns taken.
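The three-signal haptic language can be sketched in a few lines. This is an illustrative sketch only, assuming the arm exposes simple tap and claw commands; the function names and the 2 m sensing range are assumptions, not the project's actual Viam code.

```python
from typing import Optional


def claw_openness(distance_m: float, max_range_m: float = 2.0) -> float:
    """Map obstacle distance to claw openness as a proximity gauge:
    fully open (1.0) at contact, fully closed (0.0) at or beyond max range."""
    clamped = max(0.0, min(distance_m, max_range_m))
    return 1.0 - clamped / max_range_m


def haptic_signal(turn: Optional[str], distance_m: float) -> dict:
    """Build one haptic frame: which shoulder to tap (if any) plus how far
    to open the claw for the nearest obstacle."""
    assert turn in (None, "left", "right")
    return {"tap": turn, "claw": round(claw_openness(distance_m), 2)}
```

For example, an obstacle 0.5 m ahead during a left turn yields `haptic_signal("left", 0.5)` → `{"tap": "left", "claw": 0.75}`: the claw is three-quarters open, so the wearer feels both the direction and the urgency at once.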
How we built it
- Robotic arm: Viam robot arm controlled via a Waveshare Bus Servo Adapter (A), connected to a Raspberry Pi over USB
- Obstacle detection: Depth camera mounted on the arm, splitting the field of view into left, center, and right zones to determine the clearest path
- Navigation logic: Python script running on the Raspberry Pi using the Viam SDK, implementing a turn-debt system to track detours and return the user to their original heading without any compass or IMU
- Audio component: Companion mobile app receiving object detection alerts from the camera in real time
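The left/center/right zone split used for obstacle detection can be sketched as follows. This is a minimal illustration assuming the depth frame arrives as a 2D grid of distances in meters; the function name and frame format are assumptions, not the project's actual pipeline.

```python
def clearest_zone(depth_frame):
    """Split a depth frame into left, center, and right vertical thirds
    and return the zone whose mean depth (free space ahead) is largest."""
    width = len(depth_frame[0])
    third = width // 3
    zones = {
        "left": (0, third),
        "center": (third, 2 * third),
        "right": (2 * third, width),
    }

    def mean_depth(lo, hi):
        # Average the depth readings of every pixel in the zone's columns.
        vals = [row[i] for row in depth_frame for i in range(lo, hi)]
        return sum(vals) / len(vals)

    return max(zones, key=lambda z: mean_depth(*zones[z]))
```

Given a frame whose rightmost columns read 5 m while the rest read 1–2 m, `clearest_zone` returns `"right"`, which the navigation logic would translate into a right-shoulder tap.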
Challenges we ran into
The biggest challenge was determining user intent — specifically, how do you know which direction someone wants to go without GPS, maps, or vision? We solved this by locking the user's heading at startup as their "origin direction" and treating all subsequent movement as relative to that. We also had to replace compass-based heading tracking with a simpler turn-counting system after realizing we had no IMU available, which actually produced cleaner and more reliable results.
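The turn-counting idea can be captured in a small stack: every detour turn is recorded as it happens, and once the obstacle is cleared the turns are replayed in reverse order with opposite direction, restoring the origin heading without a compass or IMU. This is a hedged sketch with illustrative class and method names, not the project's actual implementation.

```python
OPPOSITE = {"left": "right", "right": "left"}


class TurnDebt:
    """Track detour turns so they can be undone after an obstacle is cleared."""

    def __init__(self):
        self._debt = []  # stack of detour turns, oldest first

    def record(self, direction):
        """Log a detour turn ('left' or 'right') as it is made."""
        assert direction in OPPOSITE
        self._debt.append(direction)

    def unwind(self):
        """Return the turns that undo the detour: reversed order, opposite
        direction. Clears the debt so the user is back on the origin heading."""
        undo = [OPPOSITE[d] for d in reversed(self._debt)]
        self._debt.clear()
        return undo
```

Two left turns around an obstacle unwind to `["right", "right"]`; because the only state is this list, the heading estimate cannot drift the way an integrated IMU reading can.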
Accomplishments that we're proud of
- A fully offline, wearable navigation system that works anywhere without internet or pre-mapped environments
- A haptic feedback language using only three signals (left tap, right tap, claw openness) that is intuitive enough to require no training
- A turn-debt algorithm that reliably returns users to their original path after any number of obstacle detours, with no sensors beyond the depth camera
What we learned
- Robotic haptic feedback is a surprisingly expressive communication channel — touch can convey both direction and urgency simultaneously
- The hardest problem in assistive navigation is not obstacle detection, but intent detection — knowing where the user wants to go
- Keeping the system fully offline and self-contained is essential for safety-critical wearable devices
What's next for Echo
- Adding GPS for outdoor navigation with turn-by-turn directions to a spoken destination
- Finer gripper control for more granular proximity feedback
- A lightweight SLAM implementation so the device can remember and navigate familiar indoor spaces
- Ruggedized harness design for everyday wearability