We wanted a fun way to get relevant Spotify playlists based on how you are feeling at the moment.

What it does

You take a selfie and send it to us as a text message. Our program analyzes the image using the Google Cloud Vision API and detects the emotions present on the face(s). Based on the detected emotion, we send a playlist search query to Spotify using Spotipy (a lightweight Python wrapper for the Spotify API). We randomly select a playlist from the ones returned by the query and text the user a link to that playlist.
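The query-and-pick step can be sketched as follows; the emotion-to-query mapping shown here is hypothetical, since the write-up does not list the exact queries we used:

```python
import random

# Hypothetical mapping from a detected emotion to a Spotify search query;
# the actual queries used by the project are not specified in this write-up.
EMOTION_QUERIES = {
    "joy": "happy hits",
    "sorrow": "sad songs",
    "anger": "angry workout",
    "surprise": "unexpected gems",
}

def playlist_query(emotion):
    """Translate a detected emotion into a playlist search query."""
    return EMOTION_QUERIES[emotion]

def pick_playlist(playlists):
    """Randomly select one playlist from the search results."""
    return random.choice(playlists)
```

In the real app, `playlist_query` feeds Spotipy's playlist search and `pick_playlist` chooses from its results.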

How we built it

We set up a Flask server to run the program. Next, we added SMS integration using Twilio's API so we could send and receive text messages to and from the user. We then experimented with Google's Cloud Vision API and implemented emotion detection for face images. To use the emotion values returned by the API, we had to properly parse the response object: Cloud Vision assigns each emotion (joy, sorrow, anger, and surprise) a likelihood score on a scale of 1 (least probable) to 5 (most probable). We select the emotion with the highest score; if two emotions tie, we pick one at random. After isolating the dominant emotion, we mapped it to a specific search query and sent GET requests to Spotify using their API. Finally, we used the Twilio API again to text the user a link to the playlist our program chose.
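The emotion-selection rule described above (highest likelihood wins, ties broken at random) might look like this minimal sketch, operating on already-parsed likelihood scores:

```python
import random

def dominant_emotion(likelihoods):
    """Pick the emotion with the highest likelihood score (1-5).

    If several emotions tie for the top score, choose one of them
    uniformly at random, as described above.
    """
    top = max(likelihoods.values())
    tied = [emotion for emotion, score in likelihoods.items() if score == top]
    return random.choice(tied)
```

The selected emotion is then looked up in the emotion-to-query mapping before the Spotify search.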

Challenges we ran into

The biggest challenge (by far) was deploying to App Engine so that the web app could be used without running the code on a local computer. We had a difficult time getting App Engine to save images and reuse them for mood analysis, so we circumvented the problem by sending Google Cloud the raw image data instead. Due to our unfamiliarity with the Spotify API, we also faced a significant learning curve in accessing information from Spotify.

Accomplishments that we're proud of

It works! It is deployed on App Engine, meaning everything runs without any dependence on a local device.

What we learned

We learned how to deploy an app on Google App Engine. We also found out that Google's Cloud Vision client automatically encodes any input image to base64.
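Sending raw bytes rather than a saved file maps naturally onto Cloud Vision's REST interface, where the image content travels as a base64 string inside the JSON body. A minimal standard-library sketch of building that body (the actual HTTP call to the API endpoint is omitted):

```python
import base64
import json

def vision_request_body(image_bytes):
    """Build the JSON body for a Cloud Vision REST face-detection call
    from raw image bytes, so nothing has to be written to disk."""
    payload = {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "FACE_DETECTION"}],
        }]
    }
    return json.dumps(payload)
```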

What's next for Moody Mixes

Using Auto ML tool of Google Cloud Vision AI, we will in the future generate more relevant playlists utilizing more advanced image-based emotion analysis, by training and customizing the Google Cloud Vision AI for our needs. Additionally, we intend to speed up the process from a half-minute to a matter of a few seconds to get results.

Built With

flask, google-app-engine, google-cloud-vision, python, spotipy, twilio
