Inspiration

Oftentimes when people listen to music, they try to picture the meanings of the words and the rhythms in their heads. Geko tries to capture that energy.


What it does

Geko allows users to search for any song and automatically generates a custom art piece that matches the vibe of the song based on its lyrics.

How we built it

We used Next.js to build a web application that allows users to search for songs via the Spotify API. Once a song is selected, we retrieve the lyrics from the Genius API and extract the chorus. This information is used as the input to our model.
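For reference, the search-and-lyrics flow looks roughly like the sketch below. It's a simplified Python illustration (the app itself is a Next.js frontend), assuming you already hold Spotify and Genius access tokens; note that the Genius API returns song metadata and a lyrics page URL rather than the raw lyric text, and the chorus heuristic shown is a stand-in rather than our exact logic.

```python
# Simplified sketch of the song lookup + chorus extraction flow.
# Assumes valid Spotify and Genius API tokens; helper names are illustrative only.
from collections import Counter

import requests


def search_spotify_track(query: str, spotify_token: str) -> dict:
    """Return the top Spotify search result for a song query."""
    resp = requests.get(
        "https://api.spotify.com/v1/search",
        headers={"Authorization": f"Bearer {spotify_token}"},
        params={"q": query, "type": "track", "limit": 1},
    )
    resp.raise_for_status()
    return resp.json()["tracks"]["items"][0]


def genius_lyrics_url(title: str, artist: str, genius_token: str) -> str:
    """Look the song up on Genius and return its lyrics page URL.

    The Genius API itself doesn't return lyric text, so the page at this
    URL is scraped separately to get the lyrics.
    """
    resp = requests.get(
        "https://api.genius.com/search",
        headers={"Authorization": f"Bearer {genius_token}"},
        params={"q": f"{title} {artist}"},
    )
    resp.raise_for_status()
    return resp.json()["response"]["hits"][0]["result"]["url"]


def extract_chorus(lyrics: str) -> str:
    """Crude chorus heuristic: keep the lines that repeat most often."""
    lines = [line.strip() for line in lyrics.splitlines() if line.strip()]
    counts = Counter(lines)
    repeated = [line for line in lines if counts[line] > 1]
    # Deduplicate while preserving order; fall back to the full lyrics.
    return "\n".join(dict.fromkeys(repeated)) or lyrics
```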

When the lyrics are fed in, the VQGAN+CLIP model connects the text to image features and generates a series of images. Geko then returns the image that most accurately captures the vibe of the song to the end user.
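As a rough illustration of how CLIP ties lyric text to images, the sketch below scores a handful of candidate frames from the generation loop against the chorus and keeps the best match. The chorus string and file names are placeholders; this shows the CLIP text-image scoring idea, not our exact VQGAN+CLIP loop.

```python
# Illustrative only: rank candidate frames by CLIP similarity to the chorus text.
# Assumes PyTorch, Pillow, and OpenAI's `clip` package; file names are placeholders.
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

chorus = "placeholder chorus text from the selected song"
candidates = ["frame_01.png", "frame_02.png", "frame_03.png"]

text = clip.tokenize([chorus]).to(device)
images = torch.stack([preprocess(Image.open(path)) for path in candidates]).to(device)

with torch.no_grad():
    text_features = model.encode_text(text)
    image_features = model.encode_image(images)

# Cosine similarity between the chorus embedding and each frame embedding.
text_features = text_features / text_features.norm(dim=-1, keepdim=True)
image_features = image_features / image_features.norm(dim=-1, keepdim=True)
scores = (image_features @ text_features.T).squeeze(1)

print("Best-matching frame:", candidates[scores.argmax().item()])
```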

Challenges we ran into

The model we used took a long time to produce an image because it runs three neural networks at the same time, which made generating images on the fly infeasible.

One challenge we ran into was finding a model that could output the kinds of images we were looking for. We also struggled to decide which part of each song to use to generate the images, since some songs have more descriptive words at the beginning while others have them near the end.

Accomplishments that we're proud of

We're very proud that we were able to get the model running at all.

What we learned

We learned a lot about teamwork, using the Spotify API, and AI/neural networks.

What's next for Geko

We think Geko could become a more social platform in the future, allowing users to share their creations and connect their existing Spotify accounts. Other features we plan to add include matching images to songs.

Built With

Next.js, Spotify API, Genius API, VQGAN+CLIP
