According to the American Foundation for the Blind, 20.6 million Americans 18 and older report vision loss, making vision problems one of the top 10 disabilities in the United States. Worldwide, the World Health Organization estimates that 285 million people are visually impaired.

Think about the family and friends close to you with visual impairment. We want to keep them out of danger while avoiding drawing attention to their disability.

What it does

A Google Tango device knows where it is and how it is moving through space.
Tango Vision uses that awareness to guide users past obstacles within 1.5 meters, issuing turning, warning, and vibration cues.
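The core of the 1.5-meter cue can be sketched as a pure function over the depth point cloud. This is a minimal illustration, not the app's actual code: it assumes points arrive as (x, y, z) triples in meters relative to the device (the layout Tango's point-cloud buffer uses), and the class and cue names are hypothetical.

```java
public class ObstacleCue {
    public static final double WARN_DISTANCE_M = 1.5;

    /**
     * Returns "LEFT", "RIGHT", or "CLEAR" based on the nearest point within
     * the warning radius. x < 0 means the point is to the device's left,
     * so the cue steers the user the other way.
     */
    public static String cueFor(float[] points, int numPoints) {
        double nearest = Double.MAX_VALUE;
        float nearestX = 0;
        for (int i = 0; i < numPoints; i++) {
            float x = points[3 * i];
            float y = points[3 * i + 1];
            float z = points[3 * i + 2];
            double d = Math.sqrt(x * x + y * y + z * z);
            if (d < nearest) {
                nearest = d;
                nearestX = x;
            }
        }
        if (nearest > WARN_DISTANCE_M) return "CLEAR";
        // Obstacle on the left -> steer right, and vice versa.
        return nearestX < 0 ? "RIGHT" : "LEFT";
    }
}
```

In the real app this decision would run inside Tango's point-cloud callback and drive the audio and vibration output.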

How we built it

We explored the Tango API and its libraries and used them with Android Studio to build the app.

Challenges we ran into

Memory issues: the Tango stored data locally and had to be restarted roughly every 20 debug cycles.

Instabilities in the API and the Tango device itself.


We wanted to use Google's Project Tango to address a social need, so we built an app that helps the visually impaired navigate buildings and avoid obstacles using haptic and audio feedback.
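One way to make the haptic feedback informative is to scale vibration pulse length with proximity. The sketch below is a hypothetical mapping, not taken from the actual app: the 1.5 m threshold matches the warning range described above, while the 100-500 ms pulse range is an assumption for illustration.

```java
public class HapticFeedback {
    static final double WARN_DISTANCE_M = 1.5;

    /**
     * Pulse length in milliseconds for the device vibrator; closer
     * obstacles produce longer pulses. Returns 0 when nothing is in range.
     */
    public static long pulseMillis(double distanceM) {
        if (distanceM >= WARN_DISTANCE_M) return 0;
        double closeness = 1.0 - distanceM / WARN_DISTANCE_M; // 0..1
        return Math.round(100 + 400 * closeness);
    }
}
```

On Android, the returned duration would be handed to the system vibrator service; keeping the mapping in a plain function like this makes it easy to tune and test off-device.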

What we learned

What's next for Tango Vision

During the hackathon we worked with the Tango tablet, but we would like to pair a phone armband with the smaller phone-sized sensing device to make Tango Vision more portable.

Tango Vision could expand to account for many more environmental factors, acting as "Self-Driving People" technology that takes human error out of the equation.