Inspiration

Remixes get 20 billion streams every month on SoundCloud, with no monetization: nobody gets paid, there is no good way to track samples, and licensing is complicated.

At the same time, state-of-the-art machine learning makes amazing new remix techniques possible.

BlockTune

BlockTune queries 7digital for pre-cleared major-label music. Users remix that music in the browser, in Ableton Live, or in any supported remix tool. BlockTune tracks every sample with 100% accuracy. Remixes are published to Spotify or to our streaming player. Streams are monetized with in-browser cryptomining, and the revenue is paid out to the major-label artists and the remixers. No lengthy licensing negotiations. Just the music.
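To make the payout idea concrete, here is a purely illustrative sketch of splitting one stream's mining revenue between the remixer and the rights holders of the tracked samples. The split rule, the rates, and the field names are hypothetical placeholders, not part of BlockTune's actual implementation.

```python
# Hypothetical payout sketch: remixer gets a fixed share, the remainder is
# divided equally across the rights holders of the samples used in the remix.
from collections import defaultdict

def split_stream_revenue(revenue, samples, remixer_share=0.5):
    """Split one stream's revenue between the remixer and sample rights holders."""
    payouts = defaultdict(float)
    payouts["remixer"] += revenue * remixer_share
    rights_pool = revenue * (1 - remixer_share)
    for sample in samples:
        payouts[sample["rights_holder"]] += rights_pool / len(samples)
    return dict(payouts)

# Example: one stream worth $0.004 of mining revenue, remix uses two samples.
print(split_stream_revenue(
    0.004,
    [{"rights_holder": "label_a"}, {"rights_holder": "label_b"}],
))
```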

Label Content

We set up a 7digital account to query their catalogue for the pre-cleared Capitol Records songs. We download the music and artwork from 7digital, report streams back, and load the songs for remixing.
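A minimal sketch of the catalogue query is below. The endpoint and parameter names follow the public 7digital REST API as we understand it (api.7digital.com/1.2); the consumer key and search term are placeholders, so treat the details as assumptions rather than our exact integration.

```python
# Sketch: search the 7digital catalogue and list matching tracks.
import requests
import xml.etree.ElementTree as ET

API_KEY = "YOUR_7DIGITAL_CONSUMER_KEY"  # placeholder consumer key

def search_tracks(query, country="US", page_size=10):
    """Search the 7digital catalogue and return (artist, title, track_id) tuples."""
    resp = requests.get(
        "https://api.7digital.com/1.2/track/search",
        params={
            "oauth_consumer_key": API_KEY,
            "country": country,
            "q": query,
            "pageSize": page_size,
        },
        timeout=10,
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.content)  # 7digital responds with XML by default
    results = []
    for track in root.iter("track"):
        results.append((track.findtext("artist/name"), track.findtext("title"), track.get("id")))
    return results

if __name__ == "__main__":
    for artist, title, track_id in search_tracks("Smells Like Teen Spirit"):
        print(f"{artist} - {title} ({track_id})")
```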

Machine Learning Modules

We used Google Colab to run Linux virtual machines with NVIDIA GPUs, and used two deep learning libraries, TensorFlow and PyTorch, to generate "style transfers" in two different ways.
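Before training, a quick sanity check like the one below confirms the Colab GPU is visible to both frameworks (assuming a current TensorFlow 2.x and PyTorch install).

```python
# Confirm the NVIDIA GPU is visible to both deep learning libraries.
import tensorflow as tf
import torch

print("TensorFlow GPUs:", tf.config.list_physical_devices("GPU"))
print("PyTorch CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("PyTorch device:", torch.cuda.get_device_name(0))
```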

1) A WaveNet (Sander Dieleman et al.) trained on a female speaker was used to decode a male singer (Kurt Cobain). The result sounded like a female voice singing "Smells Like Teen Spirit".

2) A Gram-matrix style/content optimizer (Gatys et al.) was applied to audio spectrograms (Ulyanov & Lebedev) to generate an example of one singer (Kurt Cobain) singing another singer's song (John Lennon). A user interface was created for this; a minimal sketch of the optimization loop follows.
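The sketch below follows the Ulyanov & Lebedev recipe as we understand it: treat frequency bins as channels, extract features with a single random (untrained) 1-D convolution over time, and optimize the generated log-spectrogram against a content loss and a Gram-matrix style loss (Gatys et al.), then invert with Griffin-Lim. File names, filter counts, weights, and step counts are placeholder choices.

```python
import librosa
import numpy as np
import torch
import torch.nn.functional as F

N_FFT = 2048

def log_spectrogram(path, sr=22050):
    """Load audio and return its log-magnitude STFT, shape (freq_bins, frames)."""
    y, _ = librosa.load(path, sr=sr)
    return np.log1p(np.abs(librosa.stft(y, n_fft=N_FFT)))

def gram(features):
    """Channel-by-channel correlation matrix of features shaped (1, channels, time)."""
    _, c, t = features.shape
    f = features.view(c, t)
    return f @ f.t() / (c * t)

content = torch.tensor(log_spectrogram("content_vocal.wav")).unsqueeze(0)
style = torch.tensor(log_spectrogram("style_vocal.wav")).unsqueeze(0)

# Random, untrained 1-D convolution over time; frequency bins are the input channels.
torch.manual_seed(0)
conv = torch.nn.Conv1d(content.shape[1], 512, kernel_size=11, padding=5)
for p in conv.parameters():
    p.requires_grad_(False)

def features(x):
    return F.relu(conv(x))

content_feat = features(content).detach()
style_gram = gram(features(style)).detach()

# Optimize the generated spectrogram, starting from the content spectrogram.
generated = content.clone().requires_grad_(True)
optimizer = torch.optim.Adam([generated], lr=0.02)

for step in range(300):
    optimizer.zero_grad()
    feat = features(generated)
    loss = F.mse_loss(feat, content_feat) + 100.0 * F.mse_loss(gram(feat), style_gram)
    loss.backward()
    optimizer.step()

# Invert the optimized log-spectrogram back to audio with Griffin-Lim.
S = np.maximum(np.expm1(generated.detach().squeeze(0).numpy()), 0.0)
audio = librosa.griffinlim(S, n_fft=N_FFT)
```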

Ableton Live

We started developing a program that analyzes an Ableton Live session and tracks and verifies the content used in it.
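A minimal sketch of the starting point is below: Ableton .als projects are gzip-compressed XML, so the session can be decompressed and scanned for sample file references. The exact element layout varies between Live versions, so the SampleRef/FileRef/Name/Path element names here are assumptions, not a guaranteed schema.

```python
# Sketch: list the sample files referenced by an Ableton Live .als project.
import gzip
import xml.etree.ElementTree as ET

def list_samples(als_path):
    """Return sample names/paths referenced in an Ableton .als project file."""
    with gzip.open(als_path, "rb") as f:
        root = ET.fromstring(f.read())

    samples = set()
    for ref in root.iter():
        if ref.tag not in ("SampleRef", "FileRef"):
            continue
        for child in ref.iter():
            if child.tag in ("Name", "Path") and child.get("Value"):
                samples.add(child.get("Value"))
    return sorted(samples)

if __name__ == "__main__":
    for sample in list_samples("MyRemix Project/MyRemix.als"):
        print(sample)
```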

APIs

  • 7digital

Built With

  • 7digital
  • Google Colab
  • TensorFlow
  • PyTorch
  • Ableton Live
