Inspiration
We wanted a better understanding of our emotional needs while streaming movies or songs on platforms such as Netflix and Spotify: a framework that makes suggestions based on the emotional data it gathers from our current mood and past experiences. Facial expressions are a far more reliable source of feedback than asking someone to manually enter a rating, which feels like a chore (whether it's asking questions in class or reviewing a movie). That's why, with our program, we can gather accurate responses in real time!
What it does
On a technical level, our product is a web application that detects a wide range of emotions from multiple people over a live webcam stream, using the machine learning model behind Google Cloud's Vision API. Based on the detected emotions, the web app recommends movies and songs closely matched to the shared interests and mood of the group of users.
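One way the group step could work (a minimal sketch in plain Python — the function name, emotion labels, and aggregation rule are our own illustrations, not the project's actual code) is to combine each viewer's per-emotion confidences and pick the emotion with the highest total:

```python
from collections import Counter

def group_mood(viewers):
    """Hypothetical helper: combine per-person emotion estimates into one
    group mood. Each viewer is a dict of emotion -> confidence in [0, 1];
    the emotion with the highest total confidence across viewers wins."""
    totals = Counter()
    for scores in viewers:
        for emotion, confidence in scores.items():
            totals[emotion] += confidence
    # most_common(1) returns [(emotion, total_score)]
    return totals.most_common(1)[0][0]

# Example: two mostly-joyful viewers outweigh one surprised viewer.
viewers = [
    {"joy": 0.9, "surprise": 0.1},
    {"joy": 0.7, "sorrow": 0.2},
    {"surprise": 0.8, "joy": 0.1},
]
# group_mood(viewers) -> "joy"
```

The recommendation step can then map the winning group mood to a curated list of movies or songs.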
How we built it
We extract facial landmarks and detect a variety of emotions using Google Cloud's Vision API. Then, based on recommendation algorithms we built from scratch in Python, we suggest closely related movies and songs that the users are likely to enjoy. With OpenCV and NumPy we capture and process real-world webcam data instantaneously, and thanks to the Vision API's machine learning model we can train on our datasets and detect a much wider and more accurate range of emotions over time.
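A minimal sketch of the detection step (assuming the `opencv-python` and `google-cloud-vision` packages; helper names are illustrative, not the project's actual code): Vision API face detection reports each emotion as a likelihood enum, from `VERY_UNLIKELY` to `VERY_LIKELY`, which can be converted into per-face numeric scores before any aggregation:

```python
# Vision API reports emotions as likelihood enums; map them to numeric scores.
LIKELIHOOD_SCORE = {
    "UNKNOWN": 0.0, "VERY_UNLIKELY": 0.0, "UNLIKELY": 0.25,
    "POSSIBLE": 0.5, "LIKELY": 0.75, "VERY_LIKELY": 1.0,
}

def face_scores(face):
    """Convert one face annotation's likelihoods into an emotion -> score dict.
    `face` is any object exposing joy_likelihood, sorrow_likelihood, etc.,
    each with a .name attribute, like the Vision API face annotations."""
    return {
        "joy": LIKELIHOOD_SCORE[face.joy_likelihood.name],
        "sorrow": LIKELIHOOD_SCORE[face.sorrow_likelihood.name],
        "anger": LIKELIHOOD_SCORE[face.anger_likelihood.name],
        "surprise": LIKELIHOOD_SCORE[face.surprise_likelihood.name],
    }

def detect_faces_from_webcam():
    """Grab one webcam frame and run Vision API face detection on it.
    Not executed here: it needs a camera and Google Cloud credentials."""
    import cv2                      # opencv-python
    from google.cloud import vision # google-cloud-vision
    ok, frame = cv2.VideoCapture(0).read()
    _, jpeg = cv2.imencode(".jpg", frame)
    client = vision.ImageAnnotatorClient()
    response = client.face_detection(image=vision.Image(content=jpeg.tobytes()))
    return [face_scores(f) for f in response.face_annotations]
```

Running this in a loop over webcam frames gives a live stream of per-face emotion scores to feed into the recommendation logic.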
