We all wanted to find a project that combines our passions for music and technology. On the tech side, we'd all been itching to get involved in an augmented reality project. We thought the perfect project to combine AR and music would be to create a music visualizer that enhances the music-listening experience for the user while still allowing them to interact with the environment around them. Shellhacks's diversity theme inspired us even more to pursue this idea, as we could see music visualization as a way to bring the awesome effects of music to the hearing-impaired. We're hoping that companies in the music-tech industry will continue innovating not only in the quality, but also the accessibility, of their products.
What it does
Our app is a proof of concept for a plugin for Spotify's app: we envision users launching our visualizer from within Spotify for any song of their choosing. Since we obviously couldn't build directly into Spotify's app, we work around this with a set of pre-loaded songs in our own app. Users can select and place different shapes that transform with the beats of the song, creating a customizable experience that lets them express their creativity. We also use Spotify's Web API to collect per-song data like "danceability" and "energy," which we could use in the future to drive things like the intensity and rate of change of lights in the surrounding AR environment.
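As a rough sketch of the idea, here's how a response from Spotify's audio-features endpoint could be parsed in C# and mapped to a visual parameter. The field names match Spotify's documented audio-features object, but the sample values, the `GetFeature` helper, and the pulse-rate mapping are all illustrative, not our production code:

```csharp
using System;
using System.Text.Json;

static class AudioFeatures
{
    // Pull one numeric audio feature out of a Spotify
    // /v1/audio-features/{id} JSON response body.
    public static double GetFeature(string json, string name)
    {
        using JsonDocument doc = JsonDocument.Parse(json);
        return doc.RootElement.GetProperty(name).GetDouble();
    }

    static void Main()
    {
        // Illustrative response body: real field names, made-up values.
        string sample = "{\"danceability\": 0.735, \"energy\": 0.578, \"tempo\": 98.002}";

        double energy = AudioFeatures.GetFeature(sample, "energy");

        // Hypothetical mapping: faster light pulses for higher-energy tracks.
        double pulsesPerSecond = 1.0 + 4.0 * energy;
        Console.WriteLine($"energy={energy}, pulsesPerSecond={pulsesPerSecond}");
    }
}
```

In the real app, the JSON string would come from an authenticated HTTP request to the Web API rather than a hardcoded sample.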
How we built it
We used the Unity engine along with Google's ARCore SDK for the augmented reality portion of the project. To gather the MP3 files and the per-song data, we queried the Spotify Web API from C#. With everything integrated in Unity, we built the project as an Android app so it could run on a phone.
Challenges we ran into
Our biggest challenges involved integrating the various components of the project in Unity. Originally, we used Python to interact with the Spotify Web API and analyze the response data, but we later realized this would complicate the project since everything needed to compile into a single build for an Android phone. We ended up rewriting this portion of the app in C# so that it would interact more smoothly with Unity, which was a challenge since most of us had no prior C# experience and C# didn't have as many relevant libraries available as Python did.
Accomplishments that we're proud of
We created multiple types of equalizers that move in sync with the beat of a song. This involved taking the audio data from a song's MP3 file, sampling it, and applying the Fast Fourier Transform (FFT) to extract its frequency content and present dynamic visualizations of the audio in augmented reality.
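The core idea can be sketched in plain C#. This is a minimal, self-contained radix-2 Cooley-Tukey FFT that finds the dominant frequency bin in a window of samples, the kind of value an equalizer bar's height could be driven by. It's an illustration of the math only; inside Unity, one would more likely let the engine do the FFT (e.g. via `AudioSource.GetSpectrumData`), and the `SpectrumDemo` and `PeakBin` names here are hypothetical:

```csharp
using System;
using System.Numerics;

static class SpectrumDemo
{
    // In-place radix-2 Cooley-Tukey FFT (length must be a power of two).
    public static void Fft(Complex[] a)
    {
        int n = a.Length;
        // Bit-reversal permutation.
        for (int i = 1, j = 0; i < n; i++)
        {
            int bit = n >> 1;
            for (; (j & bit) != 0; bit >>= 1) j ^= bit;
            j ^= bit;
            if (i < j) (a[i], a[j]) = (a[j], a[i]);
        }
        // Butterfly passes, doubling the transform length each time.
        for (int len = 2; len <= n; len <<= 1)
        {
            double ang = -2 * Math.PI / len;
            Complex wLen = new Complex(Math.Cos(ang), Math.Sin(ang));
            for (int i = 0; i < n; i += len)
            {
                Complex w = Complex.One;
                for (int k = 0; k < len / 2; k++)
                {
                    Complex u = a[i + k];
                    Complex v = a[i + k + len / 2] * w;
                    a[i + k] = u + v;
                    a[i + k + len / 2] = u - v;
                    w *= wLen;
                }
            }
        }
    }

    // Index of the strongest frequency bin in the first half of the spectrum.
    public static int PeakBin(double[] samples)
    {
        var a = new Complex[samples.Length];
        for (int i = 0; i < samples.Length; i++) a[i] = samples[i];
        Fft(a);
        int peak = 0;
        for (int k = 1; k < samples.Length / 2; k++)
            if (a[k].Magnitude > a[peak].Magnitude) peak = k;
        return peak;
    }

    static void Main()
    {
        // A pure tone at bin 8 of a 64-sample window: the peak magnitude
        // lands in bin 8, which a visualizer could map to a bar's height.
        int n = 64;
        var samples = new double[n];
        for (int i = 0; i < n; i++)
            samples[i] = Math.Sin(2 * Math.PI * 8 * i / n);

        Console.WriteLine($"peak bin = {SpectrumDemo.PeakBin(samples)}");  // peak bin = 8
    }
}
```

A real visualizer would run this per audio frame across many bins and smooth the magnitudes over time so the shapes pulse rather than flicker.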
What we learned
We learned a ton about C# and Unity, as well as how to interact with and use data from popular APIs. We never would have guessed that Spotify's API exposes per-song metrics like "danceability" and "energy" for every track in its database, so we learned to keep an eye out for unique data like that when exploring an API.
What's next for Spotifeye
Hopefully integration with the Spotify app! We imagine a "Start AR Experience" option in a song's menu that opens the camera and lets users place objects to enhance their music-listening experience. See some of the images below for what this might look like with the way the Spotify app is currently laid out.