Inspiration

Music is an enormous part of our lives. It helps us relax, it helps us sleep, it gets us excited. But there's so much music out there these days that it can be hard to decide what kind of song to listen to today. That's one of the problems we felt we could solve, which led to the development of Mood Music.

What it does

Mood Music is an easily navigable web interface in which the user logs into their Spotify account. First, the user enables permissions on their Fitbit so live data can be passed to the web app. On clicking one of two buttons, the website calls the Fitbit API to retrieve the user's heart rate over the last ten minutes and plots their heart rate for the past 24 hours. Using the most recent heart rate data, an appropriate song is selected from the user's saved songs and suggested for the user to listen to.
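
A minimal sketch of the kind of Fitbit intraday heart-rate call this flow relies on. The token variable, the 1-minute detail level, and the ten-sample averaging window are illustrative choices, not necessarily what the final app uses:

```javascript
// Sketch: fetch today's intraday heart rate from the Fitbit Web API and
// average the most recent samples. FITBIT_ACCESS_TOKEN is assumed to be an
// OAuth 2.0 token obtained during the permission step described above.
const FITBIT_ACCESS_TOKEN = process.env.FITBIT_ACCESS_TOKEN;

async function recentHeartRate() {
  const url =
    'https://api.fitbit.com/1/user/-/activities/heart/date/today/1d/1min.json';
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${FITBIT_ACCESS_TOKEN}` },
  });
  const data = await res.json();
  // Intraday samples look like { time: "12:34:00", value: 72 }.
  const samples = data['activities-heart-intraday'].dataset;
  const lastTen = samples.slice(-10); // roughly the last ten minutes at 1-minute granularity
  return lastTen.reduce((sum, s) => sum + s.value, 0) / lastTen.length;
}
```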

How we built it

First, we split our group: two of us tackled the Fitbit API, and two tackled the Spotify API. Our goal with the Fitbit API was to get access to intraday heart rates, and our goal with the Spotify API was to enable personal Spotify user functionality and obtain the beats per minute of around fifty of the user's songs. We first had to obtain the IDs of the user's songs, and then get information about those song IDs through another GET request, as sketched below.
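
A sketch of that two-step Spotify flow, assuming the standard saved-tracks and audio-features endpoints; the token variable is a placeholder and error handling is omitted:

```javascript
// Sketch: pull the IDs of the user's saved tracks, then look up the tempo
// (BPM) for those IDs. SPOTIFY_ACCESS_TOKEN is assumed to be a
// user-authorized OAuth token.
const SPOTIFY_ACCESS_TOKEN = process.env.SPOTIFY_ACCESS_TOKEN;
const headers = { Authorization: `Bearer ${SPOTIFY_ACCESS_TOKEN}` };

async function savedTrackTempos() {
  // Step 1: fetch up to 50 of the user's saved tracks.
  const saved = await fetch('https://api.spotify.com/v1/me/tracks?limit=50', {
    headers,
  }).then((r) => r.json());
  const tracks = saved.items.map((item) => item.track);

  // Step 2: fetch audio features (including tempo) for those track IDs.
  const ids = tracks.map((t) => t.id).join(',');
  const features = await fetch(
    `https://api.spotify.com/v1/audio-features?ids=${ids}`,
    { headers }
  ).then((r) => r.json());

  // Pair each track with its tempo in beats per minute.
  return tracks.map((t, i) => ({
    name: t.name,
    id: t.id,
    bpm: features.audio_features[i].tempo,
  }));
}
```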

Next, we had to put the two together and compare the heart rate from the Fitbit to the array of beats per minute from the Spotify API, picking a song that likely matched what the user wanted to listen to. While two people worked on combining these, the other two worked on the web interface through which the user would interact, built with Bootstrap, CSS, and HTML5. Finally, the front end and back end were combined to make our application.
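
An illustrative version of the matching step, combining the two sketches above; the real comparison may be more involved, but this shows the nearest-BPM idea:

```javascript
// Pick the saved track whose tempo is closest to the user's recent average
// heart rate. Assumes a non-empty list from savedTrackTempos().
function pickSong(tracksWithBpm, heartRate) {
  return tracksWithBpm.reduce((best, track) =>
    Math.abs(track.bpm - heartRate) < Math.abs(best.bpm - heartRate)
      ? track
      : best
  );
}

// Example usage:
// const suggestion = pickSong(await savedTrackTempos(), await recentHeartRate());
```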

Challenges we ran into

We had some difficulty getting authorization to work for both Spotify and Fitbit, such as obtaining the access tokens needed to reach user data. The next hurdle was getting the Spotify song IDs for a user, since the endpoint we were using was giving us trouble. One problem that was especially hard to overcome was repeating GET requests in a loop: due to JavaScript's asynchronous nature, the requests were processed later while the surrounding function continued to run. With assistance, we resolved this using Promises, as sketched below. The remaining challenges involved integrating the back end with the front end, and the Spotify and Fitbit functions with each other, but ultimately we were able to complete them all.
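
A sketch of the fix for the looped-request problem, assuming hypothetical placeholder URLs: instead of firing requests inside a plain loop and reading results before they arrive, collect the promises and wait for all of them.

```javascript
// Collect one promise per request, then wait for every request to resolve
// before using the results.
async function fetchAll(urls, headers) {
  const requests = urls.map((url) =>
    fetch(url, { headers }).then((res) => res.json())
  );
  // Execution pauses here until all requests have completed, so the results
  // can safely be used afterwards.
  return Promise.all(requests);
}
```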

Accomplishments that we're proud of

We are proud that we were able to resolve these challenges and learn so much by solving them ourselves. Had we simply given up and immediately asked for help, we would have a weaker understanding of how JavaScript works.

What we learned

JavaScript, Node.js, Promises, asynchronous behavior in JavaScript, REST APIs (Spotify, Fitbit), Bootstrap, HTML, CSS, how to not sleep, etc.

What's next for Mood Music

Next, we plan to create a more complex model for predicting songs that suit the user, incorporating genre, artist, and lyric data. Additionally, we plan to let users enter their mood manually if they don't have a Fitbit or another device that reports their pulse, so we can still make mood-based music recommendations for them. This project has a lot of potential, and we plan to continue with it in the future.
