Inspiration
After learning about IBM Watson's visual recognition feature, we thought it was a super cool technology to play around with. However, we had a lot of trouble thinking of useful ways to create value with it, mainly because anything Watson could see, our own eyes could see just as well. That led us to think about those who aren't as fortunate: the visually impaired. Using IBM Watson's visual recognition, people who have difficulty seeing or cannot see at all can independently identify their surroundings using a smartphone.
What it does
Watson Front of Me? is a simple mobile application, available on both iOS and Android, that takes a picture of whatever is currently in front of the user and describes aloud what it sees. Watson reports the objects and environment it has detected, along with how confident it is in each detection.
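For illustration, a classification response from Watson's Visual Recognition v3 endpoint is shaped roughly like the object below (field names follow IBM's public API docs; the specific classes and scores here are made up):

```javascript
// Rough shape of a Watson Visual Recognition v3 response (illustrative values only).
const exampleResponse = {
  images: [
    {
      image: 'photo.jpg',
      classifiers: [
        {
          classifier_id: 'default',
          name: 'default',
          classes: [
            { class: 'sidewalk', score: 0.87 }, // score = Watson's confidence, 0..1
            { class: 'street',   score: 0.64 },
            { class: 'tree',     score: 0.55 },
          ],
        },
      ],
    },
  ],
  images_processed: 1,
};
```

The app reads out each `class` together with its `score` so the user hears both what was detected and how sure Watson is.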
How we built it
We built the app in React Native, since Expo.io lets us target both iOS and Android from a single codebase. To connect to Watson, we used IBM's Watson Developer API, sending POST requests that upload images to Watson for analysis. Once the endpoint returns a response, we use a text-to-speech library to read the results aloud to the user.
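A minimal sketch of that capture → classify → speak loop is below. It is an assumption-laden illustration rather than our exact code: it uses the current `expo-image-picker` and `expo-speech` packages, Watson's older `api_key` query-string authentication for the v3 classify endpoint, and placeholder values for the URL and key.

```javascript
// Sketch of the app's main flow: take a photo, send it to Watson
// Visual Recognition, and speak the top classifications back to the user.
// WATSON_URL, API_KEY, and describeSurroundings are placeholders/assumptions.
import * as ImagePicker from 'expo-image-picker';
import * as Speech from 'expo-speech';

const WATSON_URL =
  'https://gateway.watsonplatform.net/visual-recognition/api/v3/classify';
const API_KEY = 'YOUR_API_KEY'; // placeholder; supply your own credentials

async function describeSurroundings() {
  // 1. Take a photo with the device camera.
  const photo = await ImagePicker.launchCameraAsync({ quality: 0.5 });
  if (photo.canceled) return; // older Expo SDKs used `cancelled` / `uri` instead

  // 2. Upload the image to Watson as multipart form data via a POST request.
  const form = new FormData();
  form.append('images_file', {
    uri: photo.assets[0].uri,
    name: 'photo.jpg',
    type: 'image/jpeg',
  });
  const res = await fetch(
    `${WATSON_URL}?api_key=${API_KEY}&version=2016-05-20`,
    { method: 'POST', body: form }
  );
  const json = await res.json();

  // 3. Read the detected classes and confidence scores aloud.
  const classes = json.images[0].classifiers[0].classes;
  const sentence = classes
    .map((c) => `${c.class}, ${Math.round(c.score * 100)} percent sure`)
    .join('. ');
  Speech.speak(`I see: ${sentence}`);
}
```

In the real app this function would be wired to the main on-screen button, so a single tap triggers the whole photograph-analyze-speak cycle.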
Challenges we ran into
Our entire team was new to the React framework, and we had a lot of trouble sending requests to the API. We also hit roadblocks with slow internet, since Expo.io relies heavily on the local network to compile and serve the app, which slowed us down considerably when testing our code and trying new features.
Accomplishments that we're proud of
We're really happy to have completed the main feature of our application: having Watson recognize what it sees and communicate it to the user at a basic level. Even though we had plenty more features planned, we got what we wanted as a proof of concept.
What we learned
We learned that we are extremely fortunate not to have to rely on others to guide us, and that for those who do have vision impairments, technology is well on its way to helping them become more independent and live more comfortably.
What's next for Watson Front of Me
We would definitely improve the user interface by making the main button larger, adding more vibration feedback from the phone, and possibly supporting voice commands. In addition, Watson should be able to give more specific, situational messages about what it sees, such as warning the user of dangers or giving more details about the surroundings.