Life is complicated. So we thought, why not make it a bit simpler? Instead of having to second-guess how your day-to-day interactions went, we give you the easy way out by handing you an analysis of your emotional state throughout.

What it does

It analyzes video recordings and gives you a breakdown of your emotions. It presents the results graphically, allowing you to review your mental state throughout the recording.

How we built it

My team and I mainly used Python. We set up the server using Flask, and the front end has been implemented in HTML, CSS, and JavaScript.
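As a rough sketch of what that Flask server might look like, here is a minimal upload endpoint; the route name, response shape, and field names are our illustrative assumptions, not the project's actual code.

```python
# Minimal sketch of a Flask server like the one described above.
# The "/analyze" route and JSON response shape are assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/analyze", methods=["POST"])
def analyze():
    # In the real app this is where the uploaded recording would be
    # run through the emotion-analysis pipeline; here we just echo
    # the file name back to the client.
    video = request.files.get("video")
    if video is None:
        return jsonify({"error": "no video uploaded"}), 400
    return jsonify({"filename": video.filename, "status": "received"})

if __name__ == "__main__":
    app.run(debug=True)
```

The front end would POST the recorded video to this endpoint as multipart form data and render the returned analysis graphically.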

Challenges we ran into

A major issue we faced was getting blobs to persist in Chrome, which we needed in order to save the recorded video. Another big challenge was figuring out how to get the best possible results from Clarifai's API.
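For context, this is roughly how a single frame could be packaged for Clarifai's v2 predict endpoint; the model ID and API key below are placeholders, and the payload layout reflects Clarifai's public REST format rather than our exact code.

```python
# Hedged sketch: building (not sending) a Clarifai v2 predict request
# for one video frame. Model ID and key are placeholder values.
import base64
import json

CLARIFAI_URL = "https://api.clarifai.com/v2/models/{model_id}/outputs"

def build_predict_request(frame_bytes, model_id, api_key):
    """Return the URL, headers, and JSON body for a single-frame prediction."""
    payload = {
        "inputs": [
            # Clarifai's REST API accepts images as base64-encoded bytes.
            {"data": {"image": {"base64": base64.b64encode(frame_bytes).decode("ascii")}}}
        ]
    }
    headers = {
        "Authorization": f"Key {api_key}",  # per-app API key
        "Content-Type": "application/json",
    }
    return CLARIFAI_URL.format(model_id=model_id), headers, json.dumps(payload)
```

Getting good results mostly came down to which frames we sampled from the video and sent through requests like this one.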

Accomplishments that we're proud of

What we learned

We learned a lot about video encoding and codecs, storing objects in browser memory, facial recognition, cooler front-end development, and how to build a fully functional web app. We also gained a lot of insight into how Clarifai works. It is an amazing API that makes it easy to implement computer-vision models.

What's next for EmoMe

We hope to make EmoMe a fully functioning app that helps people gauge video interactions. We plan to add audio transcription so that we can link emotions to the words people spoke.
