What it does

SiteSee helps people who are blind or visually impaired understand the world around them: with a single tap, it reads aloud labels describing the scene in front of them. At a hackathon, for instance, a tap might produce "There is crowd, indoor, and humans."

How we built it

We built SiteSee using IBM Bluemix's Visual Recognition API together with the text-to-speech and camera functionality of an Android device. After the user captures an image, Bluemix's Visual Recognition service returns a set of labels describing the image, which Android's text-to-speech engine then reads aloud.
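As a rough illustration of the last step of that pipeline, the sketch below joins a list of classifier labels into the sentence that gets spoken. The class and method names are our own for illustration; in the app, the labels come from the Visual Recognition API response.

```java
import java.util.Arrays;
import java.util.List;

public class LabelSpeaker {
    // Join classifier labels into one spoken sentence, e.g.
    // ["crowd", "indoor", "humans"] -> "There is crowd, indoor, and humans."
    public static String toUtterance(List<String> labels) {
        if (labels.isEmpty()) {
            return "No labels were recognized.";
        }
        StringBuilder sb = new StringBuilder("There is ");
        for (int i = 0; i < labels.size(); i++) {
            if (i > 0) {
                // Oxford-comma style: ", and " before the final label.
                sb.append(i == labels.size() - 1 ? ", and " : ", ");
            }
            sb.append(labels.get(i));
        }
        return sb.append(".").toString();
    }

    public static void main(String[] args) {
        System.out.println(toUtterance(Arrays.asList("crowd", "indoor", "humans")));
    }
}
```

In the app itself, the resulting string is handed to Android's TextToSpeech engine to be read aloud.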

Challenges we ran into

To make the app intuitive for visually impaired users, we worked directly with Android's camera API to simplify the picture-taking process, which was challenging given the complex behaviour of Android components. Another challenge was integrating the text-to-speech and external-storage functionality with the custom camera app we created.

Moreover, putting ourselves in the shoes of a visually impaired person made us more aware of the problems they face, and of how our app should be designed to be used effectively. For example, we made the entire initial screen tappable, so the user can touch anywhere to capture an image.
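The tap-anywhere idea can be sketched framework-free: every touch, at any coordinate, maps to the same capture action, so there is no small button a visually impaired user would need to locate. In the app this is an Android click listener on the root view; the names below are illustrative.

```java
public class TapAnywhereScreen {
    private int capturesTaken = 0; // stands in for triggering the camera

    // Any tap, regardless of where it lands on the screen, starts a capture.
    public void onTap(float x, float y) {
        capture();
    }

    private void capture() {
        capturesTaken++;
    }

    public int getCapturesTaken() {
        return capturesTaken;
    }

    public static void main(String[] args) {
        TapAnywhereScreen screen = new TapAnywhereScreen();
        screen.onTap(10f, 20f);   // corner of the screen
        screen.onTap(540f, 960f); // centre of the screen
        System.out.println(screen.getCapturesTaken()); // prints 2
    }
}
```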

Accomplishments that we're proud of

This is one of the first Android applications we have developed, and we are proud that we were able to overcome most of our challenges and contribute to social change.

What's next for SiteSee

We're looking to expand SiteSee into a suite of tools that help blind and visually impaired users navigate the world around them.
