Inspiration

Since our teammate Quirin's neighborhood became part of an e-scooter provider's service area, his blind dad has had trouble navigating the area safely without tripping over improperly parked e-scooters. This gave us the idea of using data to warn visually impaired users when they come too close to an e-scooter or another potentially hazardous obstacle.

What it does

The application collects data about obstacles from multiple sources (the e-scooter provider's database, the municipal government and the users themselves) and compares it against the user's location in real time. Should the user come too close to a potentially hazardous object or situation, the application alerts them with visual, audio (text-to-speech) and physical (vibration pattern) cues.

How we built it

Ingolstadt, our lovely home city, currently has one active provider of e-scooters for rent. We use their service API to retrieve the locations of all parked e-scooters within city limits. These locations are stored in our database, alongside other obstacles and hazardous locations that have been submitted by either government employees or the users themselves.
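A minimal sketch of how such a merged obstacle record could look in Kotlin. The type, field names and the ObstacleSource enum are illustrative assumptions, not our exact schema:

```kotlin
import java.time.Instant

// Illustrative obstacle record as stored by the backend, merging all three
// data sources into one shape. Names are assumptions for this sketch.
enum class ObstacleSource { SCOOTER_PROVIDER, MUNICIPALITY, USER_REPORT }

data class Obstacle(
    val id: String,
    val latitude: Double,
    val longitude: Double,
    val source: ObstacleSource,
    val description: String,   // e.g. "parked e-scooter", "construction site"
    val reportedAt: Instant
)
```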

The app continuously determines the user's location and sends it to our backend to retrieve nearby entries. When the user enters a defined radius around a reported obstacle, a warning is triggered: the app displays a clear visual warning, reads a spoken warning via text-to-speech ("Caution: e-scooter ahead") and vibrates the phone to capture the user's attention even in loud environments.
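A minimal sketch of the proximity check behind that warning, reusing the Obstacle type from the sketch above. The haversine distance and the 10 m alert radius are illustrative assumptions rather than the exact values the backend uses:

```kotlin
import kotlin.math.*

// Assumed constants for this sketch: mean Earth radius and a flat 10 m radius.
const val EARTH_RADIUS_M = 6_371_000.0
const val ALERT_RADIUS_M = 10.0

/** Great-circle distance between two coordinates in metres (haversine formula). */
fun distanceMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))
}

/** Returns the obstacles close enough to the user to warrant a warning. */
fun obstaclesInRange(userLat: Double, userLon: Double, nearby: List<Obstacle>): List<Obstacle> =
    nearby.filter { distanceMeters(userLat, userLon, it.latitude, it.longitude) <= ALERT_RADIUS_M }
```

In practice the alert radius could vary with obstacle type and GPS accuracy, and the backend would pre-filter candidates with a bounding box or spatial index before running a per-obstacle check like this.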

Challenges we ran into

  • Developing an accessible app that can be operated by visually impaired people
  • Finding the right APIs to retrieve relevant data about the environment
  • Calculating the user's current position along with the direction they are heading (see the bearing sketch after this list)
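For the heading part of that challenge, the sketch below shows one common approach: deriving the initial bearing from two consecutive GPS fixes. Whether the app uses this or the phone's compass sensor is left open here; the function name is our own:

```kotlin
import kotlin.math.*

// Initial bearing from a previous position to the current one, in degrees
// clockwise from north. Standard great-circle bearing formula.
fun initialBearingDegrees(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val phi1 = Math.toRadians(lat1)
    val phi2 = Math.toRadians(lat2)
    val dLon = Math.toRadians(lon2 - lon1)
    val y = sin(dLon) * cos(phi2)
    val x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLon)
    return (Math.toDegrees(atan2(y, x)) + 360.0) % 360.0
}
```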

Accomplishments that we're proud of

  • Creating something that can help disadvantaged people live a safer life
  • Developing a fully working prototype in minimal time
  • A flexible architecture capable of collecting data from various sources
  • Having a great time socializing with new people online

What we learned

  • State-of-the-art mobile app development technology
  • Geographical calculations
  • Prioritizing tasks and time management

What's next for Second Eye

  • We want to develop a smart white cane: a blind person's cane equipped with Bluetooth and a compass. The goal is to supply information (such as navigational directions or hazardous locations) to the user by vibrating the cane in recognizable patterns.
  • Integration of real-time traffic light data provided by the City of Ingolstadt's open data initiative. The user should be warned about upcoming traffic lights and their current phase.
  • Integration of Bluetooth beacons to allow municipalities and business owners to physically mark hazardous locations. These would work even in buildings where there is no stable internet connection.
