Inspiration

We had the initial idea to explore creating music with machine learning, and once we named our team 'burger' we thought it would be fun to build a burger-based UI.

What it does

It uses a state-of-the-art, modern, burger-based interface where users create a seed for generating a drum beat with Magenta's DrumsRNN. While that drum beat plays on loop, ImprovRNN creates a MIDI melody on each iteration to accompany it. Users can also tweak various features, such as swapping the drum samples and the synth used for the MIDI melody.
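
A rough sketch of that generation flow with Magenta.js, assuming the hosted @magenta/music MusicRNN checkpoints; the seed patterns, step counts, and chord progression below are illustrative rather than our exact code:

```javascript
import * as mm from '@magenta/music';

// Pretrained MusicRNN checkpoints hosted by the Magenta team.
const drumsRNN = new mm.MusicRNN(
  'https://storage.googleapis.com/magentadata/js/checkpoints/music_rnn/drum_kit_rnn');
const improvRNN = new mm.MusicRNN(
  'https://storage.googleapis.com/magentadata/js/checkpoints/music_rnn/chord_pitches_improv');
await Promise.all([drumsRNN.initialize(), improvRNN.initialize()]);

// Drum seed assembled from the user's burger choices (illustrative pattern:
// kick on step 0, snare on step 4; pitches follow General MIDI percussion).
const drumSeed = {
  quantizationInfo: { stepsPerQuarter: 4 },
  totalQuantizedSteps: 8,
  notes: [
    { pitch: 36, quantizedStartStep: 0, quantizedEndStep: 1, isDrum: true },
    { pitch: 38, quantizedStartStep: 4, quantizedEndStep: 5, isDrum: true },
  ],
};

// A short melodic seed for ImprovRNN (a single middle C).
const melodySeed = {
  quantizationInfo: { stepsPerQuarter: 4 },
  totalQuantizedSteps: 4,
  notes: [{ pitch: 60, quantizedStartStep: 0, quantizedEndStep: 2 }],
};

// DrumsRNN extends the seed into a full beat; each time the loop repeats,
// ImprovRNN improvises a fresh melody over a chord progression.
const drumBeat = await drumsRNN.continueSequence(drumSeed, 32, 1.0);
const melody = await improvRNN.continueSequence(melodySeed, 32, 1.0, ['C', 'F', 'G', 'C']);
```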

How we built it

We set up an Apache web server on GCP and used Magenta.js so we could keep the entire project on the front end. Magenta powers the machine learning that generates the notes, while Tone.js makes them audible through a MIDI synthesizer and lets us map different sound sets to the notes.
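
A rough sketch of the playback side, assuming a Tone.js PolySynth for the melody and Tone.js Players for the drum samples (the sample paths, the `#play` button id, and the `drumBeat`/`melody` sequences from the sketch above are assumptions):

```javascript
import * as Tone from 'tone';

// Melody plays through a swappable synth; drum hits trigger sample players
// keyed by their MIDI drum pitch (paths are placeholders).
const synth = new Tone.PolySynth(Tone.Synth).toDestination();
const drums = new Tone.Players({
  36: 'samples/kick.wav',
  38: 'samples/snare.wav',
}).toDestination();

// Schedule a quantized NoteSequence on the Transport so the drum beat and
// melody stay in sync (stepsPerQuarter = 4, i.e. 16th-note steps).
function schedule(sequence, isDrumPart) {
  const secondsPerStep = Tone.Time('16n').toSeconds();
  sequence.notes.forEach((note) => {
    const startTime = note.quantizedStartStep * secondsPerStep;
    Tone.Transport.schedule((time) => {
      if (isDrumPart) {
        drums.player(String(note.pitch)).start(time);
      } else {
        synth.triggerAttackRelease(Tone.Frequency(note.pitch, 'midi'), '8n', time);
      }
    }, startTime);
  });
}

// Audio must be started by a user gesture, e.g. the play button.
document.querySelector('#play').addEventListener('click', async () => {
  await Tone.start();
  schedule(drumBeat, true);   // sequences from the generation step above
  schedule(melody, false);
  Tone.Transport.start();
});
```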

Challenges we ran into

When used in excess, Tone.js effects can distort the audio and create artifacts. To mitigate this, we increased the audio buffer, which adds a minor delay when hitting play and stop, and we disabled the effects by default so they don't affect the performance of the beat burger. Domain.com also gave us a hard time routing our domain to our public IP, but it eventually resolved itself.
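
Roughly what we mean by increasing the buffer, in Tone.js terms (the specific lookAhead value and the effect toggle are illustrative):

```javascript
import * as Tone from 'tone';

// A larger lookAhead gives Tone's scheduler more headroom, so a heavy
// effects chain is less likely to glitch; the trade-off is a short delay
// between pressing play/stop and hearing the change.
Tone.context.lookAhead = 0.5; // seconds (Tone's default is 0.1)

// Effects stay out of the signal chain until the user turns them on.
const synth = new Tone.PolySynth(Tone.Synth).toDestination();
const reverb = new Tone.Reverb({ decay: 2 });

function setEffectsEnabled(enabled) {
  synth.disconnect();
  if (enabled) {
    synth.chain(reverb, Tone.Destination); // route through the effect
  } else {
    synth.toDestination();                 // dry signal only
  }
}
```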

Accomplishments that we're proud of

Creating a purely burger-themed interface that uses machine learning to produce unique, pleasant-sounding music.

What we learned

We learned about the power of git, working with audio in JavaScript, and using machine learning models effectively.

What's next for Burger Beats

Future optimizations to allow for stable effects, as well as additional instruments (bass, vocals, guitar).
