Inspiration

Promoting diversity and social change in a technological world where visually impaired people are often overlooked.

How does the app work?

Expanding on the technology of Google Maps, eyeBuddy picks up where Google leaves off: at the entrance of a building. It navigates users through audio cues such as "turn left in three steps," using the floor maps of campus buildings as its source of routing information. Users can activate the app through Siri and tap anywhere on the screen to start it.
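
As an illustration, here is a minimal sketch of how a cue like this could be spoken in the browser with the Web Speech API's speechSynthesis interface. The speakCue helper and the cue text are hypothetical, not taken from the eyeBuddy source.

    // Speak a navigation cue aloud using the Web Speech API.
    // speakCue is a hypothetical helper; any cue string can be passed in.
    function speakCue(text) {
      const utterance = new SpeechSynthesisUtterance(text);
      utterance.lang = 'en-US'; // language of the spoken cue
      utterance.rate = 1.0;     // normal speaking rate
      window.speechSynthesis.speak(utterance);
    }

    speakCue('Turn left in three steps.');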

How we built it

We built it using the Google Web Speech API, Bootstrap, Sketch, Xcode, and Sublime Text.

Challenges we ran into

  • The Xcode simulator wouldn't deploy
  • Using the Google Cloud Vision API
  • Using JavaScript
  • Using the Google Speech-to-Text and Text-to-Speech APIs (see the sketch after this list)
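
For context on the speech-to-text challenge above, this is a rough sketch of listening for one spoken command with the Web Speech API's SpeechRecognition interface (exposed as webkitSpeechRecognition in Chrome). How the transcript is consumed is an assumption for illustration, not eyeBuddy's actual code.

    // Listen for a single spoken command with the Web Speech API.
    const SpeechRecognition =
      window.SpeechRecognition || window.webkitSpeechRecognition;
    const recognition = new SpeechRecognition();
    recognition.lang = 'en-US';
    recognition.interimResults = false; // only report final transcripts

    recognition.onresult = (event) => {
      const command = event.results[0][0].transcript;
      console.log('Heard:', command); // e.g. route the command to navigation
    };
    recognition.onerror = (event) => {
      console.error('Recognition error:', event.error);
    };

    recognition.start(); // requires microphone permission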

Accomplishments that we're proud of

  • Building a web platform that automatically starts text-to-speech as soon as it opens (sketched below).
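
A minimal sketch of that behavior, assuming the welcome message shown here. Most browsers only allow audio after a user gesture, so the tap-anywhere startup described above doubles as that gesture.

    // Speak a welcome prompt once the page is ready.
    // Browsers generally require a user gesture before audio can play,
    // so the first tap anywhere on the page triggers the speech.
    window.addEventListener('load', () => {
      document.body.addEventListener('click', () => {
        const welcome = new SpeechSynthesisUtterance(
          'Welcome to eyeBuddy. Tap anywhere to begin navigation.'
        );
        window.speechSynthesis.speak(welcome);
      }, { once: true }); // speak the welcome only once
    });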

What we learned

  • Learned more about JavaScript
  • Learned more about Xcode
  • Learned more about the Google Maps API
  • Learned more about the Google Cloud Vision API
  • Learned more about the Google Web Speech API

Who can use the app?

While the app was created for people who are visually impaired, those who are socially anxious and uncomfortable asking for directions can benefit from it as well. Other applications targeted at visually impaired users don't offer indoor navigation and rely on volunteer help; eyeBuddy lets you travel on your own schedule and with independence.

Where do we go from here?

Our goal is to grow this app into something bigger that can be used across all CUNY campuses and, later, even in the New York public transit system.
