Inspiration
Roughly 285 million people worldwide live with visual impairment, yet most assistive tech focuses on outdoor GPS navigation. Indoor environments — clinics, schools, offices, even one's own home — remain stubbornly difficult to navigate independently. White canes and guide dogs help avoid obstacles, but they can't tell you where you are or how to get to the conference room three turns away. We wanted to bridge that gap with hardware light enough to wear all day and a workflow simple enough that a caretaker can set it up once.
What We Built
WAYPOINT is a wearable indoor navigation system with three pieces of hardware orchestrated by an iPhone hub:
- Smart glasses (ESP32-CAM) capture visual fiducial markers for position correction
- Waistband ultrasonic sensor detects temporary obstacles not stored in the saved map
- Two haptic wristbands (Lolin32 Lite + vibration motors) deliver directional cues — buzz left, buzz right
The iPhone runs the brain: an ARKit-generated occupancy grid of the pre-mapped space, A* pathfinding to compute routes, and ArUco fiducial markers placed at fixed reference points to correct IMU drift over extended use.
How It Works
A caretaker or user pre-walks the space once to generate the saved map. After that, WAYPOINT routes the user from their current position to any saved destination ("kitchen," "exam room 3," "conference room") via voice command.
Pathfinding uses A* with Manhattan distance as the heuristic:
$$f(n) = g(n) + h(n), \quad h(n) = |x_1 - x_2| + |y_1 - y_2|$$
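A minimal sketch of the grid search described above: A* over a 2D occupancy grid (0 = free, 1 = occupied) with the Manhattan heuristic, using 4-connected moves of unit cost. Names and the grid encoding are illustrative, not WAYPOINT's actual code.

```python
import heapq

def manhattan(a, b):
    """h(n): Manhattan distance between two (row, col) cells."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def astar(grid, start, goal):
    """Return a list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    open_heap = [(manhattan(start, goal), 0, start)]  # (f, g, node)
    came_from = {}
    best_g = {start: 0}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            path = [node]                      # walk parents back to start
            while node in came_from:
                node = came_from[node]
                path.append(node)
            return path[::-1]
        if g > best_g.get(node, float("inf")):
            continue                           # stale heap entry, skip
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1                     # unit cost per grid step
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = node
                    heapq.heappush(
                        open_heap,
                        (ng + manhattan((nr, nc), goal), ng, (nr, nc)))
    return None
```

Manhattan distance is admissible here because diagonal moves aren't allowed, so A* returns shortest paths.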
The waistband ultrasonic sensor computes obstacle distance from the round-trip pulse time:
$$d = \frac{v \cdot t}{2}, \quad v \approx 343 \text{ m/s}$$
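The distance formula above maps directly to code; a sketch assuming an HC-SR04-style sensor that reports the echo's round-trip time in seconds (constant and function names are ours, not from the project):

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at ~20 °C

def echo_distance_m(round_trip_s):
    """Obstacle distance from a round-trip ultrasonic pulse.
    The pulse travels out and back, hence the division by 2."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0
```

For example, a 10 ms round trip corresponds to an obstacle about 1.7 m away, which is roughly the range at which a haptic alert is still actionable at walking speed.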
When an unexpected object — a wet-floor sign, a misplaced chair — appears in the user's path, the waistband triggers a haptic alert, and the system reroutes around it. Crucially, transient obstacles are not written back to the saved map; they're treated as alerts so stale data never accumulates.
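One way to keep transient obstacles out of the saved map, as described above, is to route against a throwaway copy of the grid. This is a sketch under our own naming, not the project's implementation:

```python
def grid_with_transients(saved_grid, obstacle_cells):
    """Return a copy of the saved occupancy grid with transient obstacles
    marked occupied. The saved map itself is never mutated, so stale
    obstacle data cannot accumulate across reroutes."""
    overlay = [row[:] for row in saved_grid]   # deep-enough copy for a 2D grid
    for r, c in obstacle_cells:
        overlay[r][c] = 1                      # occupied in the copy only
    return overlay
```

The pathfinder is then run on the overlay; once the user passes the obstacle, the next route is computed from the untouched saved map.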
What We Learned
- Hardware is humbling. Datasheets lie, JST connectors ship in mirrored polarities, and "drop-in replacement" rarely is.
- Less hardware is more product. We started with a custom smart cane and dropped it — most blind users already trust the cane they have. Moving the obstacle sensor to the waistband cut our BOM and made WAYPOINT additive to existing tools rather than a replacement for them.
- ARKit drifts. Without ArUco anchors, position error compounds over a few minutes of walking. Fiducial markers at known coordinates fixed it cleanly.
- Haptic vocabulary matters. Three intuitive buzz patterns beat ten clever ones every time.
Challenges We Faced
- LiPo power on the Lolin32 Lite. Our AITRIP boards' JST battery connectors were wired with polarity reversed relative to the batteries we bought — a silent failure mode that took hours to isolate.
- Coordinating four wireless boards. Keeping the glasses, waistband, and two wristbands in sync over ESP-NOW without overwhelming the iPhone hub required several protocol rewrites.
- Designing for users we aren't. None of us are visually impaired, so every UX decision had to be checked against research and existing assistive-tech conventions rather than personal intuition.