To give people with autism, bipolar disorder, and other mental health conditions an accessible tool for getting into a better mood, using music selected based on their facial expressions.

What it does

Detects the user's facial expression with Microsoft Azure, then runs a custom algorithm over Spotify music data to select songs that match the detected mood.
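The core idea can be sketched as mapping the emotion scores returned by the Azure Face analysis to target audio features (such as valence and energy) of the kind Spotify exposes. This is an illustrative sketch, not the team's actual algorithm; all field names and weights here are assumptions:

```typescript
// Hypothetical mapping from Azure-style emotion scores to
// Spotify-style audio-feature targets. valence ~ musical
// positivity, energy ~ intensity; both range from 0 to 1.
interface EmotionScores {
  happiness: number;
  sadness: number;
  anger: number;
  neutral: number;
}

interface AudioFeatureTarget {
  valence: number; // 0 (sad) .. 1 (happy)
  energy: number;  // 0 (calm) .. 1 (intense)
}

function moodToTarget(e: EmotionScores): AudioFeatureTarget {
  // Normalize in case the scores do not sum to exactly 1.
  const total = e.happiness + e.sadness + e.anger + e.neutral || 1;
  // Happy faces pull valence up; sad or angry faces pull it down.
  const valence =
    (e.happiness * 0.9 + e.neutral * 0.5 + e.anger * 0.3 + e.sadness * 0.2) / total;
  // Anger and happiness both suggest higher-energy music.
  const energy =
    (e.anger * 0.9 + e.happiness * 0.7 + e.neutral * 0.4 + e.sadness * 0.2) / total;
  return { valence, energy };
}
```

For example, a strongly happy face would yield a high-valence, moderately high-energy target, steering the selection toward upbeat tracks.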

How we built it

Using React.js, we implemented a website that captures the user's current facial expression through the laptop camera and analyzes it with the Microsoft Azure API. We then run this emotion data, along with Spotify music data, through a custom algorithm to determine which song best suits the user's mood.
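The final selection step described above can be sketched as a nearest-neighbor search: pick the track whose audio features lie closest to the mood-derived target. This is a simplified stand-in for the team's custom algorithm; the `Track` shape and the Euclidean-distance scoring are assumptions:

```typescript
// Hypothetical song-ranking step: choose the candidate track whose
// (valence, energy) pair has the smallest squared Euclidean distance
// to the mood-derived target.
interface Track {
  name: string;
  valence: number;
  energy: number;
}

function pickTrack(
  target: { valence: number; energy: number },
  tracks: Track[]
): Track {
  let best = tracks[0];
  let bestDist = Infinity;
  for (const t of tracks) {
    const d =
      (t.valence - target.valence) ** 2 + (t.energy - target.energy) ** 2;
    if (d < bestDist) {
      bestDist = d;
      best = t;
    }
  }
  return best;
}
```

A real implementation could weight more features (tempo, danceability) or keep a ranked playlist rather than a single track, but the distance-to-target idea stays the same.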

Challenges we ran into

Spotify's API was more restrictive than we expected, limiting how much we could control and modify the music programmatically.

Accomplishments that we're proud of

We built a working app that changes the music based on the user's facial expression.

What we learned

React, Redux, music-recommendation algorithm design, and working with visual AI APIs.

What's next for MoodSwing

To make MoodSwing a full-featured app that can be used daily for studying, working out, relaxing, or working.

Built With

React.js, Redux, Microsoft Azure, Spotify