We wanted to create something useful for people with visual disabilities (or anyone who is just very bad at taking pictures), so we came up with an app that helps blind people take pictures.

What it does

The app gives the user audio tips about the optimal angle, distance, and height for a good picture. It uses Nuance's natural language processing to understand the user's intentions and make navigation more seamless, and Google's text-to-speech and speech-to-text to handle the audio.
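As a rough illustration of how the spoken tips could be delivered, here is a minimal sketch using Android's built-in TextToSpeech engine (class and method names like AudioTipSpeaker and speakTip are hypothetical, not our actual code):

```java
import android.content.Context;
import android.speech.tts.TextToSpeech;
import java.util.Locale;

public class AudioTipSpeaker {
    private TextToSpeech tts;
    private boolean ready = false;

    public AudioTipSpeaker(Context context) {
        // Initialize the engine; tips are only spoken once initialization succeeds.
        tts = new TextToSpeech(context, status -> {
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US);
                ready = true;
            }
        });
    }

    /** Speak a short guidance tip, e.g. "Tilt the phone up slightly". */
    public void speakTip(String tip) {
        if (ready) {
            tts.speak(tip, TextToSpeech.QUEUE_FLUSH, null, "tip");
        }
    }
}
```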

How we built it

We built it with the standard Android SDK, using Java as the programming language.
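For context, one common way to capture a photo with the plain Android SDK is to hand off to the system camera via an intent; this is only a sketch of that pattern, not necessarily how our capture flow works:

```java
import android.content.Intent;
import android.provider.MediaStore;
import androidx.appcompat.app.AppCompatActivity;

public class CaptureActivity extends AppCompatActivity {
    private static final int REQUEST_IMAGE_CAPTURE = 1;

    // Launch the stock camera app; the captured image comes back in onActivityResult.
    private void dispatchTakePictureIntent() {
        Intent takePicture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (takePicture.resolveActivity(getPackageManager()) != null) {
            startActivityForResult(takePicture, REQUEST_IMAGE_CAPTURE);
        }
    }
}
```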

Challenges we ran into

We had a lot of trouble with the Microsoft Azure platform: setting it up, getting the pictures to store correctly in its Blob storage service, and then running sentiment analysis on them.
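Most of the Blob storage pain was in the upload path. A minimal sketch of uploading a picture with the classic Azure Storage SDK for Java is shown below; the connection string and container name are placeholders, not our actual configuration:

```java
import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;
import java.io.File;

public class BlobUploader {
    // Placeholder connection string; fill in your own storage account name and key.
    private static final String CONNECTION_STRING =
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>";

    /** Upload a captured image file to a "pictures" container, creating it if needed. */
    public static void uploadPicture(File imageFile) throws Exception {
        CloudStorageAccount account = CloudStorageAccount.parse(CONNECTION_STRING);
        CloudBlobClient client = account.createCloudBlobClient();
        CloudBlobContainer container = client.getContainerReference("pictures");
        container.createIfNotExists();

        CloudBlockBlob blob = container.getBlockBlobReference(imageFile.getName());
        blob.uploadFromFile(imageFile.getAbsolutePath());
    }
}
```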

Accomplishments that we're proud of

Being able to complete a project :)

What we learned

Working with Azure is hard

What's next for blindSpot

Adding functionality such as voice recording for images.
