Inspiration

It is always hard for visually impaired people to know what is around them, what the people nearby are doing, and what they are facing. We wanted to build something to help.

What it does

Our application has a single main screen. When the user taps it, the app takes a photo, generates a description of the scene, and reads it out loud. The user can then touch different parts of the screen, and the app describes the small region touched and reads aloud any text found there. Shaking the phone takes a new picture.
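
As a rough illustration of the shake-to-retake gesture, here is a minimal Kotlin sketch using Android's accelerometer; the threshold value and the takeNewPicture() callback are assumptions for illustration, not our exact implementation:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Listens for a shake gesture and triggers a new photo capture.
// SHAKE_THRESHOLD and takeNewPicture() are illustrative assumptions.
class ShakeDetector(context: Context, private val takeNewPicture: () -> Unit) :
    SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accelerometer: Sensor? =
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)

    fun start() {
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI)
    }

    fun stop() {
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent?) {
        val values = event?.values ?: return
        val (x, y, z) = Triple(values[0], values[1], values[2])
        // gForce is about 1.0 when the phone is at rest; a shake spikes well above that.
        val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
        if (gForce > SHAKE_THRESHOLD) {
            takeNewPicture()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}

    companion object {
        private const val SHAKE_THRESHOLD = 2.5f // assumed value, tuned by hand
    }
}
```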

How we built it

We built an Android app and used Microsoft Cognitive Services for image captioning and OCR. A short description is generated and the phone reads it aloud through text-to-speech (TTS).
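
A minimal sketch of the core pipeline, assuming the Computer Vision v1.0 "describe" endpoint, the westus region, and Android's built-in TextToSpeech; key handling and JSON parsing details here are simplified assumptions:

```kotlin
import android.speech.tts.TextToSpeech
import org.json.JSONObject
import java.net.HttpURLConnection
import java.net.URL

// Sends a captured JPEG to the Computer Vision "describe" endpoint and returns
// the best caption. Region, API version, and key handling are assumptions.
fun describeImage(jpegBytes: ByteArray, subscriptionKey: String): String {
    val url = URL("https://westus.api.cognitive.microsoft.com/vision/v1.0/describe")
    val conn = url.openConnection() as HttpURLConnection
    try {
        conn.requestMethod = "POST"
        conn.doOutput = true
        conn.setRequestProperty("Ocp-Apim-Subscription-Key", subscriptionKey)
        conn.setRequestProperty("Content-Type", "application/octet-stream")
        conn.outputStream.use { it.write(jpegBytes) }

        val body = conn.inputStream.bufferedReader().readText()
        // Response shape: { "description": { "captions": [ { "text": ..., "confidence": ... } ] } }
        val captions = JSONObject(body)
            .getJSONObject("description")
            .getJSONArray("captions")
        return if (captions.length() > 0) {
            captions.getJSONObject(0).getString("text")
        } else {
            "No description available"
        }
    } finally {
        conn.disconnect()
    }
}

// Reads the caption aloud with Android's built-in TTS engine.
// describeImage() must be called off the main thread (e.g. from a background executor).
fun speakCaption(tts: TextToSpeech, caption: String) {
    tts.speak(caption, TextToSpeech.QUEUE_FLUSH, null, "caption")
}
```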

Accomplishments that we're proud of

Our app can accurately describe the scene in a photo, producing captions such as "a group of people standing in a room" or "a laptop on a desk".

Built With

Android, Microsoft Cognitive Services (image captioning and OCR), Android text-to-speech
