Inspiration

Current digital navigation tools for visually impaired people rely on audio instructions. This may be impractical in loud cities, and the audio is a distraction, decreasing the user's situational awareness and making navigation less safe. Haps Vision is a system that turns digital directions into haptic feedback, allowing the user to navigate without relying on sight or sound.

What it does

Haps Vision is a wristband with three haptic outputs that indicate left, right, and forward. The device pairs with the user's phone over Bluetooth through a web application. In the web app, which is compatible with standard accessibility features like screen readers, the user enters a destination and begins navigation. Using live GPS, the app detects upcoming turns and sends vibration cues to the user's wrist to guide them through each one.

How we built it

  • Frontend: A web application that uses the Google Maps APIs for routing, run in Bluefy, an iOS browser that supports Web Bluetooth.
  • Hardware: An ESP32 microcontroller coupled with an MPU-6050 accelerometer/gyroscope for sensing motion; SG90 micro servos provide the haptic output (see the sketch after this list).
  • Edge AI with ESP-DL: We used the ESP-DL library to deploy a motion-recognition model directly on the ESP32, enabling real-time gesture detection with minimal latency and power consumption.
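
Our firmware isn't included in this post, but the sensing and actuation side is easy to sketch. Below is a minimal Arduino-style example of how the MPU-6050 and the servo-based haptic outputs could be wired up, assuming the Adafruit_MPU6050 and ESP32Servo libraries and placeholder GPIO pins (our actual pin mapping and libraries may differ):

```cpp
// Minimal hardware sketch (assumed libraries: Adafruit_MPU6050, ESP32Servo;
// pin numbers are placeholders for illustration).
#include <Wire.h>
#include <Adafruit_MPU6050.h>
#include <Adafruit_Sensor.h>
#include <ESP32Servo.h>

Adafruit_MPU6050 mpu;
Servo hapticLeft, hapticRight, hapticForward;

// Briefly sweep a servo to produce a tactile "tap" on the wrist.
void hapticTap(Servo &s) {
  s.write(60);
  delay(150);
  s.write(0);
}

void setup() {
  Serial.begin(115200);
  if (!mpu.begin()) {            // MPU-6050 on the default I2C pins
    Serial.println("MPU-6050 not found");
    while (true) delay(10);
  }
  hapticLeft.attach(25);         // placeholder GPIO assignments
  hapticRight.attach(26);
  hapticForward.attach(27);
}

void loop() {
  sensors_event_t a, g, temp;
  mpu.getEvent(&a, &g, &temp);   // accel (m/s^2) and gyro (rad/s) readings
  // ... motion data feeds the gesture model and the turn detection below ...
  delay(10);
}
```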

Challenges we ran into

  • Magnetometer-less Orientation: Being restricted to an MPU-6050 (no magnetometer) meant we couldn't rely on a simple compass. We had to develop a relative-yaw tracking algorithm that integrates gyroscope data to sense when a user has completed a physical 90-degree turn (see the sketch after this list).
  • Mobile Browser Constraints: Web Bluetooth is heavily restricted on iOS, so we had to route the connection through the Bluefy browser and handle asynchronous permission requests for both Bluetooth and Device Orientation.
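
As an illustration of the relative-yaw approach (not our exact code), the sketch below integrates the bias-corrected gyro z-axis rate into a running heading and reports when roughly a 90-degree turn has accumulated. Names and thresholds are placeholders, and the left/right sign depends on how the board is mounted on the wrist:

```cpp
// Relative-yaw tracking sketch: integrate the gyro's z-axis rate to estimate
// how far the wrist has rotated since the last reset (no magnetometer needed).
float yawDeg = 0.0f;            // accumulated relative heading, in degrees
float gyroBiasZ = 0.0f;         // rad/s offset estimated while standing still
unsigned long lastMicros = 0;

const float TURN_THRESHOLD_DEG = 75.0f;  // count a ~90° turn once we pass this

// Call every loop() pass with the latest gyro z reading in rad/s.
// Returns -1 when a left turn completes, +1 for a right turn, 0 otherwise.
int updateYaw(float gyroZ) {
  unsigned long now = micros();
  if (lastMicros == 0) { lastMicros = now; return 0; }  // skip the first sample
  float dt = (now - lastMicros) * 1e-6f;
  lastMicros = now;

  // Integrate the bias-corrected angular rate (rad/s -> degrees).
  yawDeg += (gyroZ - gyroBiasZ) * dt * 57.2958f;

  if (yawDeg >=  TURN_THRESHOLD_DEG) { yawDeg = 0.0f; return -1; }  // turned left
  if (yawDeg <= -TURN_THRESHOLD_DEG) { yawDeg = 0.0f; return +1; }  // turned right
  return 0;
}
```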

Accomplishments that we're proud of

  • ESP-DL Implementation: Successfully deploying an AI model using the ESP-DL library to process complex motion data at the edge.
  • Smart Calibration: We developed a "Start-of-Path" calibration that ensures the user is oriented correctly before they even take their first step.
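
The post doesn't spell out the calibration procedure, so the snippet below is only one plausible firmware-side piece of it, reusing names from the earlier sketches: while the user stands still facing the route, average the gyro output to estimate its bias and zero the relative-yaw reference, so that "straight ahead" at the start of the path becomes the zero heading.

```cpp
// Hypothetical "Start-of-Path" calibration (firmware side): sample the gyro
// while the user is stationary, take the mean as the bias estimate, and reset
// the relative-yaw reference used by updateYaw() above.
void calibrateStartOfPath() {
  const int samples = 200;
  float sum = 0.0f;
  for (int i = 0; i < samples; i++) {
    sensors_event_t a, g, temp;
    mpu.getEvent(&a, &g, &temp);
    sum += g.gyro.z;             // rad/s; should be near-constant drift at rest
    delay(5);
  }
  gyroBiasZ = sum / samples;     // average drift becomes the bias estimate
  yawDeg = 0.0f;                 // the direction the user faces now is "forward"
  lastMicros = 0;                // force updateYaw() to re-seed its timer
}
```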

What we learned

  • ESP32 Ecosystem: Gained deep experience with the ESP32-S series and optimized deep learning libraries.
  • Optimized AI at the Edge: One of our biggest takeaways was learning to use ESP-DL and how to optimize deep learning models so they can run on a microcontroller with limited RAM.
  • Web Bluetooth GATT Services: Learned how to build a bidirectional communication bridge between a browser and a custom hardware peripheral.
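
To make that bridge concrete, here is an illustrative sketch of the peripheral (ESP32) side using the Arduino BLE library, with placeholder UUIDs: the web app writes one-byte direction commands to a write characteristic, and the wristband pushes events (for example, a completed turn) back over a notify characteristic. On the browser side this pairs with the standard Web Bluetooth calls (requestDevice, writeValue, startNotifications).

```cpp
// Peripheral-side sketch of the GATT bridge (ESP32 Arduino BLE library).
// UUIDs and characteristic names are placeholders, not our exact values.
#include <BLEDevice.h>
#include <BLEServer.h>
#include <BLEUtils.h>
#include <BLE2902.h>

#define SERVICE_UUID "0000aaaa-0000-1000-8000-00805f9b34fb"  // placeholder
#define CMD_UUID     "0000aaab-0000-1000-8000-00805f9b34fb"  // browser -> wristband
#define EVT_UUID     "0000aaac-0000-1000-8000-00805f9b34fb"  // wristband -> browser

BLECharacteristic *evtChar;  // used to notify the web app (e.g. "turn complete")

// The web app writes a single command byte: 'L', 'R', or 'F'.
class CmdCallbacks : public BLECharacteristicCallbacks {
  void onWrite(BLECharacteristic *c) override {
    String cmd = c->getValue().c_str();
    if (cmd.length() == 0) return;
    if (cmd[0] == 'L')      { /* hapticTap(hapticLeft);    */ }
    else if (cmd[0] == 'R') { /* hapticTap(hapticRight);   */ }
    else if (cmd[0] == 'F') { /* hapticTap(hapticForward); */ }
  }
};

void setupBle() {
  BLEDevice::init("HapsVision");
  BLEServer *server = BLEDevice::createServer();
  BLEService *svc = server->createService(SERVICE_UUID);

  BLECharacteristic *cmdChar =
      svc->createCharacteristic(CMD_UUID, BLECharacteristic::PROPERTY_WRITE);
  cmdChar->setCallbacks(new CmdCallbacks());

  evtChar = svc->createCharacteristic(EVT_UUID, BLECharacteristic::PROPERTY_NOTIFY);
  evtChar->addDescriptor(new BLE2902());  // lets the browser subscribe to notifications

  svc->start();
  BLEDevice::getAdvertising()->addServiceUUID(SERVICE_UUID);
  BLEDevice::startAdvertising();
}

// Push an event string back to the browser over the notify characteristic.
void notifyEvent(const char *msg) {
  evtChar->setValue((uint8_t *)msg, strlen(msg));
  evtChar->notify();
}
```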

What's next for Haps Vision

  • Miniaturization: Moving from a breadboard prototype to a custom PCB and an ergonomic enclosure.
  • Sensor Fusion: Integrating a dedicated magnetometer to provide an absolute, North-referenced heading.
  • Customizable Gestures: Adding more recognized gestures and user-customizable settings.
