Sometimes it is hard for people to find music they like. We wanted to give users a way to discover new music in the most natural way possible.
What it does
It uses facial recognition to gauge how much the user likes the song that is currently playing. The emotion data accumulated over time can then be used to give the user more accurate recommendations.
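As a rough illustration of how per-frame emotion readings could be turned into a like/dislike signal for a song, here is a minimal Python sketch. The emotion labels mirror the ones Azure's emotion analysis returns; the scoring rule and thresholds are our own illustrative assumptions, not the actual implementation.

```python
# Hypothetical sketch: aggregate per-frame emotion scores into a single
# like/dislike signal for the song being played.
POSITIVE = {"happiness", "surprise"}
NEGATIVE = {"sadness", "anger", "disgust", "contempt"}

def song_preference(frames):
    """frames: list of {emotion_name: score} dicts, one per captured frame.
    Returns 'like', 'dislike', or 'neutral' for the whole song."""
    if not frames:
        return "neutral"
    score = 0.0
    for emotions in frames:
        score += sum(v for k, v in emotions.items() if k in POSITIVE)
        score -= sum(v for k, v in emotions.items() if k in NEGATIVE)
    avg = score / len(frames)
    # Thresholds are made up for illustration.
    if avg > 0.2:
        return "like"
    if avg < -0.2:
        return "dislike"
    return "neutral"
```

A signal like this could then be fed back into the recommendation step, e.g. by weighting the seed tracks sent to Spotify.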
How I built it
We use Azure's facial recognition service to perform emotion analysis and the Spotify API to generate playlists and play music. The app is built on a React frontend with a Flask (Python) backend.
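On the backend, one step is picking the dominant emotion out of the face-detection response. The sketch below assumes the `faceAttributes.emotion` shape used by the Azure Face API detect endpoint; the sample data is made up.

```python
# Hypothetical sketch: extract the strongest emotion from an Azure
# Face API detection response (a list of detected faces).
def dominant_emotion(face_api_response):
    """Returns (emotion_name, score) for the first detected face,
    or None if no face was found."""
    if not face_api_response:
        return None
    emotions = face_api_response[0]["faceAttributes"]["emotion"]
    name = max(emotions, key=emotions.get)
    return name, emotions[name]

# Made-up sample response for illustration.
sample = [{"faceAttributes": {"emotion": {
    "happiness": 0.82, "neutral": 0.15, "sadness": 0.03}}}]
```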
Challenges I ran into
We ran into a CORS (Cross-Origin Resource Sharing) issue with our Flask backend and had to seek help from professionals to resolve it.
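The issue arises because the React dev server and the Flask backend run on different origins, so the browser blocks the responses unless the backend sends CORS headers. A minimal sketch of that kind of fix in plain Flask is below; in practice the `flask-cors` extension does the same thing with one line (`CORS(app)`).

```python
# Minimal sketch of a manual CORS fix in Flask: attach the
# Access-Control-Allow-Origin header to every response so the
# frontend on a different origin may read it.
from flask import Flask, jsonify

app = Flask(__name__)

@app.after_request
def add_cors_headers(response):
    # "*" is acceptable for a hackathon demo; restrict this to the
    # actual frontend origin in anything production-facing.
    response.headers["Access-Control-Allow-Origin"] = "*"
    response.headers["Access-Control-Allow-Headers"] = "Content-Type"
    return response

@app.route("/ping")
def ping():
    return jsonify(ok=True)
```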
Accomplishments that I'm proud of
We are proud that we realised most of our ideas in the limited time and worked together successfully as a team.
What I learned
We learned the importance of collaboration, both between team members and between the frontend and the backend. We also learned how to use several new APIs and platforms.
What's next for EmotionMusic
OpenCV could be used to detect head motion. For example, by detecting the direction of head movement (a shake versus a nod), we could determine whether the user likes the song.
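A rough sketch of that idea: given the face-centre coordinates over a series of frames (which OpenCV face tracking would supply), classify the motion as a horizontal shake (dislike) or a vertical nod (like). The logic and thresholds below are illustrative assumptions only.

```python
# Hypothetical sketch: classify head motion from per-frame face-centre
# positions. Horizontal travel dominating suggests a shake (dislike);
# vertical travel dominating suggests a nod (like).
def classify_head_motion(centres, min_travel=30):
    """centres: list of (x, y) face-centre positions, one per frame.
    min_travel: minimum pixel travel before we count it as a gesture."""
    if len(centres) < 2:
        return "none"
    xs = [x for x, _ in centres]
    ys = [y for _, y in centres]
    x_range = max(xs) - min(xs)
    y_range = max(ys) - min(ys)
    if max(x_range, y_range) < min_travel:
        return "none"
    return "shake" if x_range > y_range else "nod"
```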