The three of us use Spotify in our daily lives: working out, studying, living. When we came to HackCMU, we wanted to do something amazing and extraordinary that would change the way we use Spotify. We had always wanted to build an app, so we decided to build an easy-to-use one that plays music through the robust Spotify API.
What it does
Synesthesia is a perceptual condition in which stimulation of one sense involuntarily triggers another. Our app, Synesthesia, takes the physical touches and finger pressure on your screen and translates them into a music playlist. It analyzes the frequency, intensity, and movement of your touches, and uses that to find a matching Spotify song.
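Concretely, a rough Java sketch of that mapping might look like this. The class names and the 40-200 BPM clamp are our illustrative assumptions, not a spec; on Android the pressure values would come from MotionEvent.getPressure():

```java
// Hypothetical sketch of the touch-to-song mapping.
public class TouchAnalyzer {
    /** One touch event: timestamp in ms and pressure in [0, 1]. */
    public record TouchSample(long timeMs, double pressure) {}

    /** Targets we can feed into Spotify's tunable recommendation parameters. */
    public record SongTargets(double targetTempo, double targetEnergy) {}

    public static SongTargets analyze(java.util.List<TouchSample> samples) {
        if (samples.size() < 2) throw new IllegalArgumentException("need at least two touches");
        // Frequency: taps per minute over the observed window doubles as a tempo target.
        double spanMin = (samples.get(samples.size() - 1).timeMs() - samples.get(0).timeMs()) / 60_000.0;
        double tempo = (samples.size() - 1) / spanMin;
        // Intensity: mean finger pressure maps onto Spotify's 0..1 "energy" scale.
        double energy = samples.stream().mapToDouble(TouchSample::pressure).average().orElse(0.0);
        return new SongTargets(Math.min(200.0, Math.max(40.0, tempo)),
                               Math.min(1.0, Math.max(0.0, energy)));
    }
}
```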
How we built it
Synesthesia is built with Android Studio and the Spotify API. The robustness of the Spotify API made it straightforward to combine the two into a working app: with access to Spotify's entire catalog, we can analyze and play songs while users interact with the app.
Challenges we ran into
All of our challenges can be aptly described as "things not going according to plan".
We came into this project planning to build the app on Node.js and React Native. The first couple of hours of the hackathon were spent just trying to get the platform to launch. After a mentor made no progress in half an hour, we realized we had to change gears or fail, and switched to Android Studio.
Going into this project, we knew of a couple of pre-existing APIs that Spotify provides specifically for apps. We figured that by learning those tools we could build our idea. But the only thing that API provided was the ability to play music given an ID; we had to figure out how to connect to the actual Spotify Web API in real time from our Android Studio app.
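In practice, the real-time lookup boils down to building a query against the Web API's recommendations endpoint. The parameter values below are illustrative; a real call must also run off the main thread and attach an OAuth access token in an Authorization header:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Sketch of the kind of request we build against the Spotify Web API.
// A real fetch would add:
//   conn.setRequestProperty("Authorization", "Bearer " + accessToken);
public class SpotifyQuery {
    public static String recommendationsUrl(double targetTempo, double targetEnergy, String seedGenre) {
        String query = "seed_genres=" + URLEncoder.encode(seedGenre, StandardCharsets.UTF_8)
                + "&target_tempo=" + targetTempo
                + "&target_energy=" + targetEnergy
                + "&limit=1";
        return "https://api.spotify.com/v1/recommendations?" + query;
    }
}
```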
Accomplishments that we're proud of
This event was a huge personal accomplishment for all of us. To be honest, we made a lot of simple mistakes and were clueless at the start, but having learned from all of those mistakes is the accomplishment. Our team now understands everything we've built and has a solid grasp of Android Studio and of working with APIs. If we had looked at this 24 hours ago, we would have been lost, and that feeling of satisfaction is the greatest accomplishment.
What we learned
We've all become more familiar with Android Studio, GitHub, and working with APIs. We can now collaborate in real time on an expansive project and reproduce our results.
What's next for Synesthesia
We want to incorporate more factors into the analysis of physical touches. We plan to track the motion of paths traced on the touchscreen, and to use Fourier transforms for a more accurate calculation of the BPM of the user's taps. We also plan to apply machine learning to identify common melodies and patterns in taps and relate them to specific genres: distinguishing, say, an electro-dubstep melody from classical music.
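As a first cut, the Fourier-based BPM idea could look like the sketch below: build an impulse train from tap timestamps, then scan candidate tempos with a single-bin DFT (Goertzel-style) and keep the tempo whose bin has the largest magnitude. The 50 Hz sampling rate and the 40-200 BPM search range are assumptions we would still tune:

```java
// Sketch: estimate tap tempo via a discrete Fourier transform over an
// impulse train. Parameters here are our assumptions, not settled design.
public class TapTempo {
    /** Estimate BPM from tap timestamps (milliseconds, ascending order). */
    public static double estimateBpm(long[] tapTimesMs) {
        final double sampleRateHz = 50.0;
        long t0 = tapTimesMs[0];
        int n = (int) (((tapTimesMs[tapTimesMs.length - 1] - t0) / 1000.0 + 1.0) * sampleRateHz);
        double[] signal = new double[n];
        for (long t : tapTimesMs) {
            int idx = (int) (((t - t0) / 1000.0) * sampleRateHz);
            if (idx < n) signal[idx] = 1.0;    // unit impulse at each tap
        }
        double bestBpm = 0.0, bestMag = -1.0;
        for (double bpm = 40.0; bpm <= 200.0; bpm += 1.0) {
            double freq = bpm / 60.0;          // beats per minute -> Hz
            double re = 0.0, im = 0.0;
            for (int i = 0; i < n; i++) {      // DFT evaluated at this one frequency
                double angle = 2.0 * Math.PI * freq * i / sampleRateHz;
                re += signal[i] * Math.cos(angle);
                im -= signal[i] * Math.sin(angle);
            }
            double mag = Math.sqrt(re * re + im * im);
            if (mag > bestMag) { bestMag = mag; bestBpm = bpm; }
        }
        return bestBpm;
    }
}
```

Scanning one bin at a time is O(n) per candidate tempo, which is plenty fast for a few seconds of taps; a full FFT would only matter at much larger scales.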
We also plan to let users connect their own Spotify accounts and authorize access inside the app, and to improve the app's visual design to give it a more melodic appearance.