EyeSeeYou's vision right from the beginning was to build an intelligent assistant for the visually impaired that would improve their quality of life by enabling them to sense their surroundings via sound.

What it does

We have taken the limitations of a visually impaired person into consideration and made our UI as easy and friendly to use as possible. First, we recognize that a visually impaired user may not be able to locate and launch our app on a touchscreen. We have therefore automated this step by using a shake gesture as the trigger to launch the app. Once the app is open, four features/options are at the user's disposal, each activated by a simple action such as a short press or a long press.
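The shake trigger boils down to comparing the phone's acceleration magnitude against a threshold. A minimal sketch of that check, stripped of Android plumbing: the class name, method, and `SHAKE_THRESHOLD` value are illustrative assumptions, not the app's actual code; on Android this logic would sit inside a `SensorEventListener` fed by the accelerometer.

```java
// Minimal shake-detection sketch. Inputs are accelerometer samples (x, y, z) in m/s^2.
// SHAKE_THRESHOLD and all names here are illustrative, not the app's actual code.
public class ShakeDetector {
    private static final double SHAKE_THRESHOLD = 2.5; // g-force above which motion counts as a shake

    /** Returns true when the acceleration magnitude, expressed in g, exceeds the threshold. */
    public static boolean isShake(double x, double y, double z) {
        final double g = 9.81; // standard gravity, m/s^2
        double gX = x / g, gY = y / g, gZ = z / g;
        double gForce = Math.sqrt(gX * gX + gY * gY + gZ * gZ);
        return gForce > SHAKE_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(isShake(0.0, 9.81, 0.0));  // phone at rest: magnitude is ~1 g, prints false
        System.out.println(isShake(20.0, 25.0, 5.0)); // vigorous motion: ~3.3 g, prints true
    }
}
```

A phone lying still reads roughly 1 g (gravity alone), so any threshold comfortably above 1 avoids false launches while a deliberate shake easily crosses it.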

1. Know your surroundings: This enables users to be aware of their surroundings. When this feature is activated, the camera is launched asynchronously; all the user has to do is point and take a photograph. We use the Clarifai API to process the image and present the user with keywords describing the scene in front of him/her. This information is delivered via sound. The feature can also be used to read text from newspapers, billboards, signs, etc.; the recognized text is likewise read out to the user.
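The step from API response to spoken output can be sketched as follows. The JSON shape below mirrors Clarifai's concept list (objects with `"name"` fields); the regex extraction and the sentence wording are simplifications for illustration, since the app would use a proper JSON parser and hand the sentence to text-to-speech.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: turn a Clarifai-style JSON response into a sentence for text-to-speech.
// The regex-based extraction is an illustrative simplification, not production parsing.
public class SceneDescriber {
    /** Pulls every "name" value out of a Clarifai-style concepts array. */
    public static List<String> extractConcepts(String json) {
        List<String> tags = new ArrayList<>();
        Matcher m = Pattern.compile("\"name\"\\s*:\\s*\"([^\"]+)\"").matcher(json);
        while (m.find()) tags.add(m.group(1));
        return tags;
    }

    /** Builds the sentence that would be handed to the text-to-speech engine. */
    public static String describe(List<String> tags) {
        if (tags.isEmpty()) return "I could not recognize the scene.";
        return "I can see: " + String.join(", ", tags) + ".";
    }

    public static void main(String[] args) {
        String json = "{\"concepts\":[{\"name\":\"street\"},{\"name\":\"car\"},{\"name\":\"people\"}]}";
        System.out.println(describe(extractConcepts(json)));
        // prints: I can see: street, car, people.
    }
}
```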

2. Locate me: This feature makes the visually impaired person location-aware. On activating this option, the user is told his/her current location along with additional information such as nearby places. We have further extended this with an alert feature: if users feel that they are lost and need help, a call is placed and a distress message is sent to their emergency contact.
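A minimal sketch of composing the distress message, assuming latitude/longitude coordinates from the location API. The wording, names, and the Google Maps link format are illustrative assumptions; on Android the message would be sent via `SmsManager` and the call placed through an `Intent`.

```java
// Sketch: build the distress SMS body from the user's last known coordinates.
// Message wording and the maps URL are illustrative; sending happens via Android APIs.
public class DistressMessage {
    public static String compose(String userName, double lat, double lng) {
        return userName + " needs help. Last known location: "
                + "https://maps.google.com/?q=" + lat + "," + lng;
    }

    public static void main(String[] args) {
        // Hypothetical user and coordinates for illustration.
        System.out.println(compose("Alex", 37.4219, -122.084));
    }
}
```

Embedding a maps link rather than raw coordinates lets the emergency contact open the location in one tap.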

We have attached images depicting each of the aforementioned features.

How we built it

We built an Android app to translate our vision into reality, using the following APIs:

- Clarifai: image recognition
- Google Vision: text recognition
- Google Places: nearby places
- Google Location: current location
- Volley: REST API calls
- Text-to-speech conversion

Challenges we ran into

Training the Clarifai image-recognition model.

Accomplishments that we're proud of

We have managed to build a fully automated application: the visually impaired person requires no external assistance to operate our app. We have kept the UI and functionality simple, clean, and easy to use.

What we learned

None of us had worked with image processing or built an application along these lines before, so we learned a great deal.

What's next for EyeSeeYou: Smartphone Assistant for Visually Impaired

Integration of Firebase in order to send the user's location and other essential details to the emergency contact. Conversion of text to speech in multiple languages using the Translate API.
