Our friend is legally blind, and her macular degeneration is steadily worsening her eyesight. At the beginning of college she struggled to navigate our large campus, and she still needs assistance from others in hectic public places. We wanted to create an application that would help her, and others with vision impairments, navigate unfamiliar environments with ease.
What it does
EyeSee is an Android application that describes its surrounding environment via audio to assist blind individuals. The application is controlled through voice commands or simple whole-page buttons to accommodate users who cannot find smaller details on a screen. The app lets the user request their current location, walking directions to any spoken address or business, the identity of an object in a picture, and a read-aloud of any text in the live video feed.
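A voice interface like the one described boils down to matching the transcribed phrase against the four features. Here is a minimal, hypothetical sketch of such a router; `CommandRouter`, `Action`, and the keyword phrases are illustrative names, not EyeSee's actual code.

```java
import java.util.Locale;

// Hypothetical sketch: map a transcribed voice command to one of the
// four EyeSee features (location, directions, object ID, text reading).
public class CommandRouter {
    public enum Action { LOCATION, DIRECTIONS, IDENTIFY_OBJECT, READ_TEXT, UNKNOWN }

    // Match the lowercased transcript against simple keyword phrases.
    public static Action route(String spoken) {
        String s = spoken.toLowerCase(Locale.ROOT);
        if (s.contains("where am i") || s.contains("location")) return Action.LOCATION;
        if (s.contains("take me to") || s.contains("directions")) return Action.DIRECTIONS;
        if (s.contains("what is this") || s.contains("identify")) return Action.IDENTIFY_OBJECT;
        if (s.contains("read")) return Action.READ_TEXT;
        return Action.UNKNOWN;
    }
}
```

In the real app the transcript would come from Android's speech recognizer and the chosen action would trigger the matching API call; keyword matching keeps the control flow simple at the cost of flexible phrasing.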
How we built it
We built the app in Android Studio, wiring together multiple Google APIs for speech, navigation, and image analysis.
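The write-up doesn't name the specific APIs; assuming Google's Cloud Vision REST API backs the object-identification feature, the request body for its `images:annotate` endpoint could be assembled like this (a sketch only; `base64Image` stands in for the captured photo, and sending the request plus an API key are omitted).

```java
// Hypothetical sketch: build the JSON body for a Cloud Vision
// images:annotate request. Field names ("requests", "image", "content",
// "features", "type", "maxResults") follow the public Vision REST API.
public class VisionRequest {
    public static String buildBody(String base64Image, String featureType) {
        return "{\"requests\":[{"
             + "\"image\":{\"content\":\"" + base64Image + "\"},"
             + "\"features\":[{\"type\":\"" + featureType + "\",\"maxResults\":5}]"
             + "}]}";
    }
}
```

The same body shape works for both object identification (`LABEL_DETECTION`) and reading text from frames (`TEXT_DETECTION`), which is one reason a single vision backend can serve multiple features.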
Challenges we ran into
Android Studio build problems and dependency errors.
Accomplishments that we're proud of
Our product is fully functional and easy for people with visual impairments to use.
What we learned
This was our first time really working with Android Studio, and we are now very familiar with it.
What's next for EyeSee
We would like to implement natural language processing in order to handle a larger variety of commands.