What if you could see how everyone else felt?

e-mote connects to your webcam, uses facial recognition to read emotions from a person's face, and visualizes the resulting data. We were inspired by facial recognition technology and beautiful computer-generated data visualizations.

How we built it

We used HTML/CSS/JavaScript to build the website and communicate with the server, and Python with Flask to aggregate the data and send it back. We adapted existing facial recognition code to collect the data and modified an audio visualizer to receive it.
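The write-up doesn't include code, but the aggregation step might look something like the sketch below: the Flask server collects per-face emotion scores from the webcam page and averages them into one collective reading before sending it to the visualizer. The function name and field names are hypothetical, not taken from the project.

```python
from collections import defaultdict

def aggregate_emotions(readings):
    """Average per-face emotion scores into one collective reading.

    `readings` is a list of dicts like {"happy": 0.8, "sad": 0.1, ...},
    one per detected face (field names are illustrative only).
    """
    if not readings:
        return {}
    totals = defaultdict(float)
    for face in readings:
        for emotion, score in face.items():
            totals[emotion] += score
    # Divide each summed score by the number of faces seen
    return {emotion: total / len(readings) for emotion, total in totals.items()}
```

A Flask route would then return this dict as JSON for the front-end visualizer to consume.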

Challenges faced and lessons learned

Our team ran into several difficulties when making e-mote. Unfamiliarity with HTML, CSS, JavaScript, and Flask was the primary hurdle when setting up sockets and building the website. We also had trouble translating four emotions into a color system. Along the way we learned more about using GitHub and picked up coding vocabulary. Our efforts resulted in a website with a functioning visualizer and a page that uses the webcam to aggregate data.
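One way to translate four emotions into a color system is to assign each emotion an anchor color and blend them, weighted by the normalized emotion scores. This is only a sketch of the general idea; the specific emotions and colors below are assumptions, not the project's actual palette.

```python
# Hypothetical anchor colors for four tracked emotions (RGB)
ANCHORS = {
    "happy": (255, 215, 0),       # gold
    "sad": (70, 130, 180),        # steel blue
    "angry": (220, 20, 60),       # crimson
    "surprised": (186, 85, 211),  # orchid
}

def emotions_to_color(scores):
    """Blend anchor colors, weighted by each emotion's share of the total score."""
    total = sum(scores.get(emotion, 0.0) for emotion in ANCHORS)
    if total == 0:
        return (128, 128, 128)  # neutral grey when no emotion is detected
    color = [0.0, 0.0, 0.0]
    for emotion, (r, g, b) in ANCHORS.items():
        weight = scores.get(emotion, 0.0) / total
        color[0] += weight * r
        color[1] += weight * g
        color[2] += weight * b
    return tuple(round(channel) for channel in color)
```

Blending continuously avoids abrupt jumps in the visualizer when the dominant emotion changes.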

What's next for e-mote?

We would like to refine the color changes within our visualizer and add different channels so we can compare facial expressions across populations. We would also create other visualizations, letting a user toggle between different forms of data visualization, including one that isolates an individual from the collective visual.
