Inspiration

One of our team members was inspired by their mentee's project and wanted to help build an MVP for them. The idea is to determine the emotion a user feels while listening to one song, then use that emotion to suggest a new set of songs for them.

What it does

Our Minimum Viable Product allows the user to (1) log in to Spotify, (2) search for a song, (3) get audio features for a track, and (4) get recommendations based on those features.
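As a rough sketch of steps (2) through (4), assuming an OAuth access token already obtained in step (1), here is what the underlying Spotify Web API calls look like. This is shown in Python for brevity rather than our React Native code, and the query and target parameters are just examples:

```python
import requests

BASE = "https://api.spotify.com/v1"

def spotify_get(path, token, params=None):
    """Authenticated GET against the Spotify Web API."""
    resp = requests.get(f"{BASE}/{path}", params=params,
                        headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json()

def recommend_from_search(query, token):
    # (2) Search for a track by name and take the top result.
    track = spotify_get("search", token,
                        {"q": query, "type": "track", "limit": 1})["tracks"]["items"][0]

    # (3) Fetch audio features (valence, energy, tempo, ...) for that track.
    features = spotify_get(f"audio-features/{track['id']}", token)

    # (4) Request recommendations seeded by the track and its features.
    recs = spotify_get("recommendations", token, {
        "seed_tracks": track["id"],
        "target_valence": features["valence"],
        "target_energy": features["energy"],
        "limit": 10,
    })
    return [t["name"] for t in recs["tracks"]]
```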

We also have a machine learning model that determines the user's emotion while they listen to a song. Using wearable tech, we capture the user's EEG during playback. From there, we applied a Fourier transform, feature extraction (mean, standard deviation, power), and a k-Nearest Neighbors (KNN) classifier for signal processing, analysis, and classification.
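A minimal sketch of that pipeline is below. The window size, feature set, and emotion labels here are placeholders for illustration, not our trained model or real EEG data:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def extract_features(eeg_window):
    """Turn one window of raw EEG samples into a small feature vector."""
    spectrum = np.abs(np.fft.rfft(eeg_window))        # Fourier transform -> magnitude spectrum
    power = (spectrum ** 2).sum() / len(eeg_window)   # total spectral power
    return np.array([eeg_window.mean(), eeg_window.std(), power])

# Toy stand-in for labelled EEG windows; the real model trains on recorded data.
rng = np.random.default_rng(0)
windows = rng.normal(size=(40, 256))              # 40 windows of 256 samples each
labels = rng.choice(["calm", "energetic"], 40)    # hypothetical emotion labels

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(np.array([extract_features(w) for w in windows]), labels)

# Classify the listener's emotion for a new EEG window.
print(knn.predict([extract_features(rng.normal(size=256))]))
```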

How we built it

We used React Native to build the mobile application, with the UI Kitten component library for the frontend. We used the Spotify Web API to implement the app's features, and built the machine learning model in Python.

Challenges we ran into

No one on the team knew React Native before the hackathon, and no mobile-development mentors were available to help us, so we had to work through that steep learning curve on our own.

Accomplishments that we're proud of

What we learned

React Native!

What's next for Athena: Empathetic Playlists

We will connect the machine learning model to the mobile application and complete the user flow beyond the MVP. We will also refine the app's interface to improve the user experience and aesthetics.
