Inspiration

We were inspired by David's father, who is blind. We built our app to help him, and other visually impaired people, understand what is going on around them.

What it does

Look Assistive Technology is an application that describes its surrounding environment through audio to assist visually impaired individuals. The app is controlled with voice commands or simple full-screen buttons. Users can rely on Look when crossing the road or searching for an object such as a bag or keys: we use computer vision to translate camera photos into speech, so the app can, for example, recognize a pedestrian crosswalk and announce it aloud.
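Under the hood, the loop is simple: capture a photo, ask a vision model to label it, and read the result aloud. Below is a minimal sketch of that loop in Python, assuming the google-cloud-vision and gTTS packages; the shipped app runs on Android with on-device text-to-speech, so the libraries and file names here are illustrative only.

```python
# Minimal photo-to-speech sketch (assumes google-cloud-vision and gTTS;
# the real app uses Android's camera and native TextToSpeech instead).
from google.cloud import vision
from gtts import gTTS


def describe_photo(path: str) -> str:
    """Send one camera frame to Cloud Vision and build a spoken sentence."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    labels = client.label_detection(image=image).label_annotations
    names = [label.description for label in labels[:3]]
    return ("I can see " + ", ".join(names)) if names else "I am not sure what this is"


def speak(text: str, out_path: str = "description.mp3") -> None:
    """Render the description to audio; gTTS stands in for on-device TTS."""
    gTTS(text).save(out_path)


# "camera_frame.jpg" is a hypothetical snapshot from the phone camera.
speak(describe_photo("camera_frame.jpg"))
```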

How we built it

We use Google's Cloud Vision API and OpenCV for computer vision, plus TensorFlow models for machine intelligence, running on AWS.
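For the crosswalk case specifically, a simple OpenCV heuristic gives the flavor of the approach: threshold for the bright painted stripes, then count wide, band-shaped contours. This sketch is our illustration of one plausible method, not necessarily the exact algorithm the app ships with; "road.jpg" is a hypothetical test image.

```python
# Hedged crosswalk-detection sketch using OpenCV (opencv-python).
import cv2


def looks_like_crosswalk(frame_bgr, min_stripes: int = 3) -> bool:
    """Heuristic: count wide, bright, roughly horizontal bands of paint."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # White paint is far brighter than asphalt, so a high threshold isolates it.
    _, mask = cv2.threshold(gray, 180, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    stripes = 0
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        # Zebra stripes are wide, short bands; ignore small specks.
        if w > 3 * h and w * h > 500:
            stripes += 1
    return stripes >= min_stripes


frame = cv2.imread("road.jpg")  # hypothetical camera frame
if frame is not None and looks_like_crosswalk(frame):
    print("Crosswalk ahead")
```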

Challenges we ran into

We ran into Android Studio build problems.

Accomplishments that we're proud of

We are proud of building a product with real potential to effect positive change and help the visually impaired.

What we learned

We learned several new technologies and techniques, including image recognition, machine learning, the Google Cloud Platform APIs, and Android app development.

What's next for Look

Next, we want the app to let users request their current location, get walking directions to any spoken address, and use facial recognition. Longer term, we see the same technology applying to self-driving cars and autonomous robots.

Built With

Android, Google Cloud Vision API, OpenCV, TensorFlow, AWS