What we did

We wanted to change the way people choose the music they listen to: drastically simplify the process while staying accurate. So we created PlayFace, a service that analyzes your emotions to choose appropriate music.

How we built it

We used different APIs for different goals. The Spotify API gives us access to music and music metadata, and the Microsoft Emotion API analyzes the emotions of a given person from a photo. We bundled all of this into a web application.
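The glue between the two APIs can be sketched as a small mapping step: the Emotion API returns per-emotion confidence scores, and the dominant emotion picks a Spotify seed genre. The emotion labels follow the Emotion API's response format; the genre table and function names are illustrative assumptions, not PlayFace's actual logic.

```python
# Hypothetical sketch: turn Emotion API scores into a Spotify seed genre.
# The genre mapping below is an assumption for illustration only.
EMOTION_TO_GENRE = {
    "happiness": "dance",
    "sadness": "acoustic",
    "anger": "metal",
    "neutral": "ambient",
}

def pick_genre(scores):
    """Return a seed genre for the strongest detected emotion."""
    dominant = max(scores, key=scores.get)  # emotion with the highest score
    return EMOTION_TO_GENRE.get(dominant, "pop")  # fall back to a default

print(pick_genre({"happiness": 0.9, "sadness": 0.05, "anger": 0.01, "neutral": 0.04}))
# → dance
```

The chosen genre would then feed Spotify's search or recommendations endpoint to fetch actual tracks.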

Challenges we ran into

The authentication process for the Spotify API took us a long time. So did the fire alarm.
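The part of Spotify's authentication that tripped us up is mostly header construction: the token endpoint expects the client ID and secret Base64-encoded in a Basic authorization header. A minimal sketch of that step, assuming the client-credentials flow (the credentials here are placeholders):

```python
import base64

def basic_auth_header(client_id, client_secret):
    """Build the Authorization header Spotify's token endpoint expects
    for the client-credentials flow (placeholder credentials)."""
    raw = f"{client_id}:{client_secret}".encode()
    token = base64.b64encode(raw).decode()
    return {"Authorization": f"Basic {token}"}

# The header would accompany a POST to https://accounts.spotify.com/api/token
header = basic_auth_header("my_client_id", "my_client_secret")
print(header["Authorization"].startswith("Basic "))
# → True
```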

Accomplishments we're proud of

What we learned

We learned how to use the Spotify API, and we got plenty of practice in all-night debugging.

What's next?

There are two things we look forward to. The first is growing PlayFace from a web application into a platform-independent service. We have also thought a lot about going deeper into image analysis and extracting more features from a picture than just facial expressions.
