Few products allow blind people to perceive the world around them. Their lives are ruled by the sounds they hear, while for many of us it is what we see. That's why we built SmartScope. The idea behind SmartScope is simple: a blind person (or anyone who wishes to use it) turns on the application. A camera takes a photo every 5 seconds or so and sends it to IBM Bluemix, where the Watson Visual Recognition service converts the image into parsable JSON. We then pass that JSON through another Watson service, Text to Speech, which reads the recognized content aloud. In this way, anyone who uses the application can perceive the world around them through simple auditory statements.
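The middle step of the pipeline, flattening the classification JSON into a sentence that Text to Speech can read aloud, can be sketched roughly as follows. The response shape mirrors Watson Visual Recognition's v3 `classify` output; the sample labels, the `describe` helper, and the score threshold are our own illustration, not the exact production code.

```python
# Illustrative response in the shape returned by Watson Visual
# Recognition v3 /classify (labels and scores are made up):
SAMPLE_RESPONSE = {
    "images": [{
        "classifiers": [{
            "classes": [
                {"class": "dog", "score": 0.94},
                {"class": "park bench", "score": 0.71},
            ]
        }]
    }]
}

def describe(response, min_score=0.5):
    """Flatten the classify JSON into one sentence for Text to Speech."""
    labels = [
        c["class"]
        for image in response.get("images", [])
        for classifier in image.get("classifiers", [])
        for c in classifier["classes"]
        if c["score"] >= min_score  # drop low-confidence guesses
    ]
    if not labels:
        return "Nothing recognizable in view."
    return "I can see " + ", ".join(labels) + "."

print(describe(SAMPLE_RESPONSE))  # I can see dog, park bench.
```

The resulting sentence is then posted to the Text to Speech service, and the returned audio is played back to the user before the next photo is captured.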
