Inspiration
There are 10 million visually impaired citizens in the United States: roughly 1 in 30 Americans lives with some degree of visual impairment, whether glaucoma, macular degeneration, cataracts, photophobia, diabetic retinopathy, retinitis pigmentosa, or total blindness. In New York City alone, 800,000 New Yorkers suffer from visual impairment. The situation is worse in a busy, bustling city because of crowded roads, frequent construction sites, damaged pavements, and heavy pedestrian flow, and this group is often forgotten in urban planning.

What it does
EchoLoco helps the visually impaired move around an urban city more easily. Nearly a million of them face difficulties navigating a bustling city like New York. A city is only sustainable and livable if we leave no one behind. EchoLoco creates connected audio maps with real-time cues so the visually impaired can live smarter, safer and stronger in urban cities. We achieve this using _Esri_ map routing services, the _Urban-X_ government data feed, _IBM Watson Speech-to-Text_ and _Text-to-Speech_, IBM Bluemix, and Harman's ambience-aware headphones.

How we built it
Esri provides the map overlay with real-time routing services → Urban-X provides daily social mapping data → IBM Watson Text-to-Speech lets us turn the text directions into audio cues → _Harman's ambience-aware technology_ automatically adjusts the user's ambience-awareness level based on surrounding noise, maximising awareness where distinct sound cues are lacking. The app is built in Swift on iOS.
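The direction-to-audio-cue step of this pipeline can be sketched as follows. This is a minimal illustration, not our production code: the `RouteStep` struct and its field names are stand-ins for whatever shape the routing response actually takes, and the resulting string would be handed to iOS text-to-speech (`AVSpeechSynthesizer`) or IBM Watson Text-to-Speech on the device.

```swift
import Foundation

// Illustrative shape of one step from a routing service
// (not the actual Esri response schema).
struct RouteStep {
    let instruction: String    // e.g. "Turn left onto Broadway"
    let distanceMeters: Double
}

// Format one routing step as a short spoken cue. On the device the
// returned string would be spoken via AVSpeechSynthesizer or the
// Watson Text-to-Speech service.
func audioCue(for step: RouteStep) -> String {
    // Close enough to act on immediately: speak the instruction as-is.
    guard step.distanceMeters >= 10 else {
        return step.instruction + " now."
    }
    // Otherwise, prefix the distance and lowercase the first letter
    // so the sentence reads naturally.
    let metres = Int(step.distanceMeters.rounded())
    let first = step.instruction.prefix(1).lowercased()
    let rest = step.instruction.dropFirst()
    return "In \(metres) meters, \(first)\(rest)."
}

print(audioCue(for: RouteStep(instruction: "Turn left onto Broadway",
                              distanceMeters: 42)))
// prints "In 42 meters, turn left onto Broadway."
```

Keeping the cue generation as a pure function like this makes it easy to test independently of the speech engine.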

Challenges we ran into
One of the biggest challenges we faced was finding the right dataset amongst the huge amount of data available; our team spent a significant amount of time judging whether the selected data matched the features we deemed most important to the visually impaired. Integrating several APIs at once was another challenge, as getting them all to work together did not go as smoothly as we expected.
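One pattern we found useful for making several independent API calls cooperate is to wait for all of them before acting on any. A minimal sketch with a `DispatchGroup` is below; the payloads are stand-ins, not real Esri or NYC Open Data responses.

```swift
import Foundation
import Dispatch

// Coordinate two independent fetches (e.g. routing directions and a
// city construction-alert feed) so the app only proceeds once both
// responses have arrived. The data here is hard-coded for illustration.
let group = DispatchGroup()
var routeSteps: [String] = []
var constructionAlerts: [String] = []

group.enter()
DispatchQueue.global().async {
    routeSteps = ["Head north on 5th Avenue"]        // would come from the routing API
    group.leave()
}

group.enter()
DispatchQueue.global().async {
    constructionAlerts = ["Scaffolding on E 23rd St"] // would come from an open-data feed
    group.leave()
}

group.wait()   // block until both fetches have completed
print(routeSteps.count + constructionAlerts.count)
```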

Accomplishments that we're proud of
We are immensely proud to have come up with a solution that can genuinely help the visually impaired solve a real problem. On the technical side, we were amazed at how much we came to understand and appreciate the beauty of mapping, thanks to Esri.

What we learned
Our team gained both a technical and a non-technical understanding of the topic. On the non-technical side, our research gave us a deeper understanding of the visually impaired and inspired us to build an app that improves their lives. On the technical side, we learnt how to sync several APIs so that they work together the way we want.

What's next for EchoLoco
As we proceed, we want to help not only the visually impaired but also people with other forms of disabilities. For example, the app could be useful to wheelchair-bound citizens going about their daily lives, since limited accessibility means they must map out their routes before heading out each day.

Built With

  • harman's-ambience-cancelling-headphones
  • ibm-watson-text-to-speech
  • ibm-watson-speech-to-text
  • esri's-map-routing-services
  • nyc-open-data
  • ios-text-to-speech