We're Table #1.

Inspiration

Our inspiration came from all of those times we were with our friends, wanted to listen to music, but couldn't agree on whose music to play. We wanted to create an app that bridges the gap between two listeners' favorite artists and finds a compromise everyone can enjoy. This is especially helpful when two friends have very different music tastes, something that happened to us quite often.

What it does

The application helps two users find common ground between their music tastes. First, each user inputs an artist they like to listen to. The app then displays a table for each user containing artists related to the one they entered. The goal is to find a common artist by clicking through related artists until there's a match, at which point we display the matched artist on screen so both users can see someone they're likely to enjoy.
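
Under the hood, each table is populated from Spotify's related-artists data. Here is a minimal sketch of how that lookup might be wired up, assuming an OAuth access token has already been obtained (the function and type names are illustrative, not our exact code):

```typescript
// Minimal sketch: fetch the artists Spotify considers related to a given artist.
// Assumes a valid OAuth access token; names here are illustrative.
interface Artist {
  id: string;
  name: string;
}

async function getRelatedArtists(artistId: string, accessToken: string): Promise<Artist[]> {
  const res = await fetch(
    `https://api.spotify.com/v1/artists/${artistId}/related-artists`,
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  if (!res.ok) {
    throw new Error(`Spotify API error: ${res.status}`);
  }
  const data = await res.json();
  // The endpoint returns { artists: [...] }; keep just the id and name for the table.
  return data.artists.map((a: { id: string; name: string }) => ({ id: a.id, name: a.name }));
}
```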

How we built it

Our first stage, design, was very thorough. By the time we started writing code, we had a clear vision of the final product, and we set up a GitHub repo right away. Matt and Max started building the home and results pages, while Oran designed the page visuals in Affinity Photo Editor. After a few hours of coding, frustration, relief, and break statements, we finished the pages and started working on the algorithm. It took us a while to find an algorithm that would do what we wanted and still be computationally feasible, but we eventually settled on one that takes advantage of human intelligence rather than artificial intelligence. Oran worked mostly on that, while Matt and Max updated and modified the pages to work with the algorithm.
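
At its core, the human-driven matching logic boils down to a set-intersection check: each user's clicks grow a set of explored artists, and a match is any artist that shows up in both sets. A rough sketch (the state shape and names are illustrative, not our exact code):

```typescript
// Illustrative sketch of the match check: each user expands a growing set of
// artists by clicking; a match is any artist that appears in both sets.
type ArtistId = string;

interface SessionState {
  exploredByUserA: Set<ArtistId>;
  exploredByUserB: Set<ArtistId>;
}

// Called whenever a user clicks a related artist in their table.
// Returns the matching artist id if the other user has already reached it.
function registerClick(
  state: SessionState,
  user: "A" | "B",
  artistId: ArtistId
): ArtistId | null {
  const mine = user === "A" ? state.exploredByUserA : state.exploredByUserB;
  const theirs = user === "A" ? state.exploredByUserB : state.exploredByUserA;
  mine.add(artistId);
  return theirs.has(artistId) ? artistId : null;
}
```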

Challenges we ran into

Using CSS is always a challenge, and this project was no different. Our biggest difficulty with it was positioning the background images correctly: it took a lot of work to get them to line up with the other elements on the page without creating awkward whitespace. Our biggest challenge overall, though, was discovering halfway through the project that the only algorithm we could build on Spotify's API would take roughly 20^n operations: the related-artists endpoint returns up to 20 artists per artist, so an automated search n hops deep multiplies the work by about 20 at every level. That is hopelessly inefficient and would take an enormous amount of time to run, especially if the two artists were far apart in genre. Our solution was to pivot from the original idea: instead of having an algorithm automate the process of finding a path between two artists, we have the users perform the search themselves. This trades away automation, but we found that humans can do this particular search far faster than our algorithm could. (Also, we thought it was kind of fun.)
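
For context, the automated approach we abandoned would have looked something like the sketch below (illustrative only, reusing the getRelatedArtists sketch from above): a breadth-first search outward from one artist, where every hop can multiply the frontier by up to 20.

```typescript
// Illustrative only: the automated search we abandoned. Each artist expands to
// up to 20 related artists, so in the worst case the work grows like 20^n.
async function findPathNaively(
  startId: string,
  targetId: string,
  accessToken: string,
  maxHops: number
): Promise<boolean> {
  let frontier = new Set<string>([startId]);
  const seen = new Set<string>(frontier);
  for (let hop = 0; hop < maxHops; hop++) {
    const next = new Set<string>();
    for (const id of frontier) {
      // One API call per artist in the frontier -- this is where the blowup lives.
      const related = await getRelatedArtists(id, accessToken);
      for (const artist of related) {
        if (artist.id === targetId) return true;
        if (!seen.has(artist.id)) {
          seen.add(artist.id);
          next.add(artist.id);
        }
      }
    }
    frontier = next;
  }
  return false;
}
```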

Accomplishments that we're proud of

We're proud of the look and aesthetic of the GUI, the proper display of elements using CSS, and our ability to deal with the challenges above efficiently.

What we learned

We learned a lot about managing a GitHub repo, and about building a web application driven entirely by user input rather than a database.

What's next for Music Blender

Mobile support, bug fixes, more automation, and genre information on artists (along with color coding).
