Through working with people with autism, we have seen that they often struggle to understand the emotions other people convey in a conversation.

What it does

Using a Pebble watch, we detect conversations, convert them to text, and send them to IBM Watson, which interprets the content and gives us a range of emotions that can be inferred from the conversation.
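The last step of that flow, turning a set of tone scores into one inferred emotion, can be sketched like this (the tone names and flat score object are illustrative; Watson's real response is nested JSON):

```javascript
// Pick the dominant emotion from a map of tone scores, like the
// ones a tone-analysis service returns. Illustrative sketch only.
function dominantTone(scores) {
  let best = null;
  for (const tone of Object.keys(scores)) {
    if (best === null || scores[tone] > scores[best]) {
      best = tone;
    }
  }
  return best; // null when no tones were detected
}

// Example with made-up scores: anger dominates here.
dominantTone({ joy: 0.2, anger: 0.7, sadness: 0.4 }); // → "anger"
```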

How we built it

We used CloudPebble and JavaScript to record audio and convert it from speech to text. Then we created an API that uses IBM Watson to give us the tone of the recorded conversation. Finally, we converted that tone to a color that is displayed on the Pebble watch.
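The tone-to-color step can be sketched as a simple lookup table (the tone names and hex values here are illustrative assumptions, not our exact palette):

```javascript
// Map a detected tone to a display color for the watchface.
// Tone names and hex colors are illustrative assumptions.
var TONE_COLORS = {
  joy:     '#FFFF00', // yellow
  anger:   '#FF0000', // red
  sadness: '#0000FF', // blue
  fear:    '#551A8B'  // purple
};

function toneToColor(tone) {
  // Fall back to white when the tone is unrecognized.
  return TONE_COLORS[tone] || '#FFFFFF';
}

// On the phone side, PebbleKit JS would then push the color to the
// watch with something like (message key is a placeholder):
//   Pebble.sendAppMessage({ 'COLOR': toneToColor(tone) });
```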

Challenges we ran into

In the CloudPebble IDE we first used some open-source code in C, but we could not figure out how to make API calls from it, so we pivoted to JavaScript. The default JavaScript set-up was misconfigured, so we had to figure out everything from scratch. Another challenge we overcame was running a Python server from Node.

What we learned

Each of us learned new skills from one another, including audio file handling in JavaScript, running Python from Node, IBM Watson's API, and a lot more. We explored machine learning through IBM Watson's Natural Language analysis, Tone analysis, and Speech to Text services. This allowed us to empathize with our target demographic through hardware we were not familiar with before, i.e., the Pebble smartwatch. As a multi-stack, multi-language team, we all shared our experiences of how to best integrate the skills we knew into our project.

What's next for u mad bro?

We intend to further explore the social nuances that people with autism experience. We would like to create an app that works on other wearable devices. Additionally, we would like to take advantage of IBM Watson's capability to scan for certain keywords, so that we can alert a friend or family member if the user appears to be in danger.
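As a local stand-in for the keyword scanning we would eventually delegate to Watson, the alert check could start as simple transcript matching (the keyword list below is a made-up example):

```javascript
// Return true if the transcript contains any danger keyword,
// case-insensitively. A stand-in sketch for the keyword
// detection we would delegate to Watson.
function containsDangerKeyword(transcript, keywords) {
  var text = transcript.toLowerCase();
  return keywords.some(function (kw) {
    return text.indexOf(kw.toLowerCase()) !== -1;
  });
}

// Example with an illustrative keyword list:
containsDangerKeyword('please help me', ['help', 'emergency']); // → true
```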
