Inspiration

We wanted to bring some hardware into the frame because of our EEE backgrounds, so we decided to light up LEDs in response to the output of the Emotion API. We were inspired by the video shown in the opening Microsoft talk, which demonstrated just how powerful the software is.

What it does

We use OpenCV to open the webcam and capture a frame. The frame is encoded as a JPEG and sent as a byte array in an octet-stream POST to the Microsoft Emotion API servers. The API returns a JSON response listing each detected face along with likelihoods for the eight emotions it can detect: anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. The result is then encoded and sent over serial to an Arduino, which lights the LED strip in the appropriate color and brightness for your mood.
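The decision step above (pick the strongest emotion, scale brightness by its likelihood) can be sketched as follows. The response shape (a list of faces, each with a `scores` object) follows the Emotion API, but the color palette and the three-byte serial payload are hypothetical choices for illustration:

```python
import json

# Hypothetical (R, G, B) color for each of the eight emotions; the actual
# mapping used on the Arduino side may differ.
EMOTION_COLORS = {
    "anger": (255, 0, 0),
    "contempt": (128, 0, 128),
    "disgust": (0, 128, 0),
    "fear": (255, 165, 0),
    "happiness": (255, 255, 0),
    "neutral": (255, 255, 255),
    "sadness": (0, 0, 255),
    "surprise": (0, 255, 255),
}

def dominant_emotion(response_text):
    """Pick the strongest emotion from the first detected face."""
    faces = json.loads(response_text)
    if not faces:
        return None, 0.0
    scores = faces[0]["scores"]
    emotion = max(scores, key=scores.get)
    return emotion, scores[emotion]

def serial_payload(emotion, likelihood):
    """Scale the color by the likelihood so brightness tracks confidence."""
    r, g, b = EMOTION_COLORS[emotion]
    return bytes(int(c * likelihood) for c in (r, g, b))

# Example response with one face (likelihoods sum to 1).
sample = json.dumps([{
    "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
    "scores": {"anger": 0.01, "contempt": 0.01, "disgust": 0.01,
               "fear": 0.01, "happiness": 0.9, "neutral": 0.03,
               "sadness": 0.02, "surprise": 0.01}}])
emotion, p = dominant_emotion(sample)
payload = serial_payload(emotion, p)  # with pyserial: ser.write(payload)
```

The three payload bytes can then be written straight to the Arduino's serial port, which reads them as the target color and brightness.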

How we built it

Challenges we ran into

Decoding the JSON responses, and encoding the images captured by OpenCV.
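The encoding challenge boils down to building the octet-stream request correctly. A minimal sketch, assuming a placeholder endpoint URL and subscription key (the real values come from the Emotion API documentation and your account):

```python
import urllib.request

# Hypothetical endpoint and key, for illustration only.
URL = "https://api.projectoxford.ai/emotion/v1.0/recognize"
KEY = "your-subscription-key"

def build_request(jpeg_bytes):
    """Wrap raw JPEG bytes in an octet-stream POST.

    With OpenCV, the bytes come from something like:
        ok, buf = cv2.imencode(".jpg", frame)
        jpeg_bytes = buf.tobytes()
    """
    return urllib.request.Request(
        URL,
        data=jpeg_bytes,
        headers={
            "Content-Type": "application/octet-stream",
            "Ocp-Apim-Subscription-Key": KEY,
        },
        method="POST",
    )

req = build_request(b"\xff\xd8\xff\xe0 fake jpeg bytes")
# urllib.request.urlopen(req) would then return the JSON described above.
```

The key detail is sending the encoded buffer itself as the request body rather than a form field, with the content type set to `application/octet-stream`.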

Accomplishments that we're proud of

It worked.

What we learned

What's next for Chameleon

A custom PCB to avoid the bulkiness of the Arduino, a display naming the detected emotion, and the ability to survey an image or video of multiple people and average their emotions. Currently we display the most dominant emotion, with brightness proportional to its strength; next, we could average the top few emotions and mix colors to show the range of emotions in a small crowd.
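The crowd-mixing idea could look something like the sketch below: blend each face's dominant-emotion color, weighted by how strongly that emotion scored. The palette and weighting scheme are assumptions, not part of the current build:

```python
def mixed_color(faces, emotion_colors):
    """Blend each face's dominant-emotion color, weighted by its likelihood."""
    if not faces:
        return (0, 0, 0)
    totals = [0.0, 0.0, 0.0]
    for face in faces:
        scores = face["scores"]
        emotion = max(scores, key=scores.get)
        weight = scores[emotion]  # brightness contribution of this face
        for i, channel in enumerate(emotion_colors[emotion]):
            totals[i] += channel * weight
    # Average over all faces so the result stays in the 0-255 range.
    return tuple(int(t / len(faces)) for t in totals)

# Illustrative two-emotion palette (same hypothetical mapping as above).
palette = {"happiness": (255, 255, 0), "sadness": (0, 0, 255)}
crowd = [{"scores": {"happiness": 1.0, "sadness": 0.0}},
         {"scores": {"happiness": 0.0, "sadness": 1.0}}]
blended = mixed_color(crowd, palette)
```

One confidently happy face and one confidently sad face would blend toward a neutral grey, which is exactly the "range of emotions" effect described above.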
