17% of the world population is affected by visual impairment -- that's 1.6 billion people. Many cannot travel independently, which severely limits their personal freedom. We set out to create a product that liberates them from the limits of their impairment and gives them more options in life.

What it does

Our product haptically guides the user toward their destination by continuously checking their current location and recomputing the up-to-date path from that location to the destination.

How we built it

We obtained the current location and the path to the destination using the Google Directions API, and rendered a visual of the route for demo purposes. This information was then sent to the Arduino over a Bluetooth connection. For the demonstration we attached LEDs that light up in the direction of the next turn; in the real product, vibrating motors would provide the cue.
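A minimal sketch of the phone-side step described above: translate a Directions API step into a one-byte command for the Arduino to act on. The command bytes and the fallback behavior are our illustrative assumptions, not the actual protocol.

```python
# Hypothetical phone-side logic: map a Google Directions step's
# "maneuver" field to a one-byte command the Arduino interprets.
# The command bytes (L/R/S) are illustrative assumptions.

MANEUVER_TO_COMMAND = {
    "turn-left": b"L",
    "turn-right": b"R",
    "straight": b"S",
}

def command_for_step(step: dict) -> bytes:
    """Extract the maneuver from a Directions API step and encode it.

    Steps with no maneuver (or one we don't handle) default to
    "straight" so the device always has something to display.
    """
    maneuver = step.get("maneuver", "straight")
    return MANEUVER_TO_COMMAND.get(maneuver, b"S")

# Example step, shaped like a Directions API response fragment:
step = {"maneuver": "turn-left", "html_instructions": "Turn left onto Main St"}
print(command_for_step(step))  # b'L'
```

On the Arduino side, the same byte would select which LED to light (or, in the real product, which motor to vibrate).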

Challenges we ran into

  • Unbeknownst to us at first, a companion mobile app was required to operate the Adafruit Bluefruit LE Shield
  • Implementing linear algebra (vectors and matrices) to accurately calculate direction
  • Communicating between multiple platforms, each using a different language
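The direction calculation above can be sketched with basic vector math: the sign of the 2-D cross product between the user's heading and the vector to the next waypoint tells you left versus right, and `atan2` of the cross and dot products gives the signed angle. The 15-degree dead zone is an illustrative threshold, not a value from the project.

```python
import math

def turn_direction(heading, to_waypoint):
    """Decide left/right/straight from two 2-D vectors.

    heading:     unit-ish vector of current travel direction (x, y)
    to_waypoint: vector from current position to the next waypoint
    """
    hx, hy = heading
    wx, wy = to_waypoint
    cross = hx * wy - hy * wx      # positive => waypoint lies to the left
    dot = hx * wx + hy * wy
    angle = math.degrees(math.atan2(cross, dot))  # signed angle in (-180, 180]
    if abs(angle) < 15:            # small deviation: keep going straight
        return "straight"
    return "left" if angle > 0 else "right"

# Heading due north, waypoint to the northeast => turn right.
print(turn_direction((0, 1), (1, 1)))  # right
```

In practice the heading and waypoint vectors would come from consecutive GPS fixes and the Directions API polyline, respectively.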

Accomplishments that we're proud of

Building a functional prototype with all of the core features working within the short time frame.

What we learned

  • meticulous inspection of the circuitry is needed to ensure the multitude of wires are all plugged in correctly
  • math fundamentals are crucial for accurate implementation
  • for prototyping purposes: when in doubt, duct tape!

What's next for HapticLight

  • make the product more convenient to wear by using smaller, industrial-grade components
  • improve location accuracy by implementing computer vision
  • incorporate machine intelligence to provide more detailed guidance off standard roads