We were inspired by a video of a legally blind woman using a washroom at Birmingham International Airport, who was delighted by a device installed near the door that provided an audio description of the room to help visually impaired visitors navigate. We wondered whether such technology could be made more commonplace to improve accessibility, reduce the risk of accidents, and increase the independence of our users. This is what the app aims to do.

What it does

navEYEgate helps those who are visually impaired in day-to-day life by providing audio descriptions of bathrooms: scanning a QR code on the back of the door plays a description of the room. Each description is recorded by a worker or volunteer, either during an in-person visit or from pictures of the bathroom, stored in a database, and retrieved via the QR code. The recording can be paused, and the QR code can easily be scanned again if a replay is needed.
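The scan-lookup-play flow above can be sketched in plain Java. This is an illustrative model, not the app's actual classes: the `Map` stands in for the real database of recorded descriptions, and the QR payloads and file names are made up.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of navEYEgate's core flow: a scanned QR payload keys
// into a store of audio descriptions, and rescanning the same code acts as
// a replay from the start.
public class DescriptionPlayer {
    // Stand-in for the real database of recorded audio descriptions.
    private final Map<String, String> audioByCode = new HashMap<>();
    private boolean paused;

    public void register(String qrCode, String audioFile) {
        audioByCode.put(qrCode, audioFile);
    }

    // Returns the audio file to play for a scanned code, or null if no
    // description has been recorded for that room yet.
    public String onScan(String qrCode) {
        String audio = audioByCode.get(qrCode);
        if (audio == null) {
            return null;
        }
        paused = false; // a rescan always restarts playback
        return audio;
    }

    public void pause() {
        paused = true;
    }

    public boolean isPaused() {
        return paused;
    }
}
```

In the app itself, the returned file would be handed to Android's Media Player; this sketch only captures the lookup and replay logic.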

How we built it

We developed the app in Android Studio using Java. It was built on our computers and tested on one of our Android phones.

Challenges we ran into

The team learnt how to develop a mobile app from scratch. After the initial learning curve, we faced a lot of challenges and bugs: the QR code scanner and the Media Player took some time to debug, but we eventually got them working. We were also helped by a fellow Wackathon participant who gave us a cable to connect our phones to our computers to run the app.

Accomplishments that we're proud of

None of the group had worked with Android Studio to develop a mobile application before. We are proud that we could all learn a new skill in such a short time and develop a working product.

What we learned

During our market research, we learnt about the often-overlooked struggles of visually impaired and blind people. We also learnt about the importance of making society more accessible for all and of accommodating everyone's needs, something that is becoming more feasible as cities get smarter and people are connected to the internet. Programmatically, we learnt how to link a QR library into an Android Studio project and use it to build an app, as well as how to debug on a connected phone.
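For reference, linking a QR library in Android Studio is typically done through a Gradle dependency. The writeup doesn't name the library we used, so the artifact and version below (the ZXing Android Embedded scanner, a common choice) are an assumption, not our exact setup:

```groovy
// app/build.gradle — hypothetical dependency block; the actual QR library
// and version used by navEYEgate may differ.
dependencies {
    implementation 'com.journeyapps:zxing-android-embedded:4.3.0'
}
```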

What's next for navEYEgate

The app currently works only on Android; developing an iOS-compatible version would expand its user base to iPhone users. The audio descriptions are currently all human-generated; in the future, we could harness the power of AI to generate descriptions of rooms from the pictures provided. There is a lot more that can be done with this app, and we are excited to think about the future possibilities.
