Inspiration

As libraries and APIs become increasingly powerful, we wanted to put them to work helping people with visual impairments understand their surroundings.

What it does

By tapping anywhere on the screen, the user takes a picture, and the app dictates single words describing what is near them.
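The core tap-to-dictate flow could be sketched roughly as follows. This is a hypothetical illustration, not the project's actual code: it assumes the captured photo arrives as a `CGImage`, uses Apple's Vision framework for on-device classification, and speaks the top labels with `AVSpeechSynthesizer`. The class name `SceneNarrator` and the confidence threshold are our own placeholders.

```swift
import Vision
import AVFoundation

// Hypothetical sketch: classify a captured photo and dictate the top labels.
final class SceneNarrator {
    private let synthesizer = AVSpeechSynthesizer()

    func describe(_ image: CGImage) {
        // Run Vision's built-in image classifier on the captured frame.
        let request = VNClassifyImageRequest()
        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])

        // Keep only confident labels and take the top few.
        let labels = (request.results ?? [])
            .filter { $0.confidence > 0.5 }
            .prefix(3)
            .map { $0.identifier }

        // Dictate one word at a time so the user hears short, clear cues.
        for label in labels {
            synthesizer.speak(AVSpeechUtterance(string: label))
        }
    }
}
```

Dictating single words rather than full sentences keeps the audio feedback fast, which matters when the user is scanning their surroundings tap by tap.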

What we learned

We learned to design a product for a market that includes neither of us, which forced us to think from another person's perspective.

What's next for Project Window

We are looking forward to submitting the app to the Apple App Store!
