We were inspired by the Mirum and JWT challenge "Can a Computer Hear How You Feel?": seeing the emotion in IM and voice.

What it does

By analyzing IM and voice, EmoBot scores and visualizes the emotional tone of a conversation.

How we built it

With love and a dash of unicorn dust... jk.

We built the back-end in Python. After researching APIs that analyze text for sentiment, we chose the IBM Watson Tone Analyzer, which assigns an emotion (anger, joy, disgust, sadness, or fear) to input text and also reports whether the text is analytical, tentative, or confident. For each emotional state, it returns a score between -1 and 1 indicating magnitude. As a backup to verify the IBM results, we also used Google's Natural Language API to detect whether a phrase is positive or negative; Google likewise returns a score between -1 and 1, where a score above 0 indicates a positive emotion and a score below 0 indicates a negative one.

Our Python program reads input text from a file, runs it through both the IBM and Google APIs, and compares the two sets of results. If the APIs returned nothing, we assumed the phrase was emotionless and labeled it "neutral". The program outputs the emotion associated with the text along with its score.
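The scoring logic above can be sketched roughly as follows. This is an illustrative reconstruction, not EmoBot's actual code: the function name `score_phrase` and the dict/float inputs are assumptions standing in for the parsed IBM and Google API responses.

```python
def score_phrase(watson_tones, google_score):
    """Combine Watson tone output with Google sentiment (illustrative sketch).

    watson_tones: dict mapping emotion -> score, e.g. {"joy": 0.8},
                  empty if the Tone Analyzer returned nothing.
    google_score: float in [-1, 1] from sentiment analysis, or None
                  if Google returned nothing.
    """
    # Neither API returned anything -> treat the phrase as emotionless.
    if not watson_tones and google_score is None:
        return ("neutral", 0.0)
    if watson_tones:
        # Pick the strongest Watson emotion
        # (anger, joy, disgust, sadness, or fear).
        emotion, score = max(watson_tones.items(), key=lambda kv: kv[1])
        return (emotion, score)
    # Fall back on Google's polarity: above 0 is positive, below 0 negative.
    return ("positive" if google_score > 0 else "negative", google_score)
```

For example, `score_phrase({}, None)` yields `("neutral", 0.0)`, while `score_phrase({"joy": 0.8, "sadness": 0.1}, 0.6)` yields `("joy", 0.8)`.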

Challenges we ran into

One of the challenges we faced was integrating the APIs into our program. We were unfamiliar with these APIs, and getting them to work in Python took time and research; extracting the correct output emotions and feeding text into the APIs was also tricky. We also needed to create a server with Flask to connect the back-end to the front-end, which was challenging at first because the topic was new to us.
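A minimal Flask server of the kind described above might look like the sketch below. The `/analyze` route name and the JSON shape are assumptions for illustration; the analysis step is stubbed out so the sketch stands alone.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/analyze", methods=["POST"])
def analyze():
    # Read the phrase sent by the front-end as JSON.
    text = (request.get_json(silent=True) or {}).get("text", "")
    # In EmoBot, this is where the IBM/Google pipeline would run;
    # stubbed here with a fixed result so the sketch is self-contained.
    emotion, score = "neutral", 0.0
    return jsonify({"text": text, "emotion": emotion, "score": score})
```

The front-end can then POST a phrase to `/analyze` and render the returned emotion and score (launch locally with `flask --app <module> run`, where the module name depends on your file layout).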

Accomplishments that we're proud of

We made EmoBot work! And we all slept at some point during this weekend.

What we learned

Human emotions are complicated... very complicated.

What's next for EmoBot

In the big picture, EmoBot can be further developed and applied in fields like commercial advertising, IoT, and daily care.

More specifically, we are still working on improving the accuracy of the evaluation. There are some emotions we can already detect with confidence, but a few feelings remain harder to pin down. Solving issues like this will improve the overall user experience of EmoBot.
