Inspiration
The barriers to music creation have never been lower. With modern machine learning models, even a novice can create music.
One inspiration was Black MIDI style remixes. Black MIDI celebrates music made on an emulated instrument that would be unachievable without a computer.
We speculate that it will not be possible to fingerprint machine learning models using present-day waveform analysis. Given that limitation, we believe that building an open system for publishing this type of "new instrument" can future-proof the industry against unlicensed usage.
What it does
FidgetMachine uses pre-trained neural network models to capture an artist's unique style of play. Eventually, the idea is to add a Python service that captures performances and input as training data.
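Magenta's pre-trained models consume performances as NoteSequence objects. A minimal sketch of how captured note events might be shaped into that form (the field names follow magenta.js's NoteSequence; the helper function itself is ours, not part of any library):

```javascript
// Sketch: convert captured (pitch, startTime, endTime, velocity) events into
// the NoteSequence shape that Magenta's pre-trained models consume.
// Field names follow magenta.js's NoteSequence; the helper itself is ours.

function toNoteSequence(events, qpm = 120) {
  return {
    notes: events.map(e => ({
      pitch: e.pitch,           // MIDI note number, 0-127
      startTime: e.startTime,   // seconds
      endTime: e.endTime,
      velocity: e.velocity
    })),
    totalTime: Math.max(0, ...events.map(e => e.endTime)),
    tempos: [{ time: 0, qpm }]
  };
}

// Example: two captured notes from a short performance.
const seq = toNoteSequence([
  { pitch: 60, startTime: 0.0, endTime: 0.5, velocity: 80 },
  { pitch: 64, startTime: 0.5, endTime: 1.0, velocity: 90 }
]);
```

A sequence like this could then be handed to a pre-trained model (e.g. a MusicRNN checkpoint) for continuation in the artist's style.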
We think these types of tools will eventually be thought of as a new kind of instrument.
By publishing an index of these models to a public blockchain, we can create an open network for discovery and collaboration beyond the confines of today's walled gardens (like Spotify, Facebook, etc.).
How we built it
We played with a range of MIDI software to take stock of what is currently possible. Then we created a static web app capable of interfacing with MIDI devices. The machine learning library we used is Google Magenta.
Sticking with the idea that a target user has little musical experience and equipment, we chose to make a tool that can run in any modern web browser.
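In the browser, the Web MIDI API (`navigator.requestMIDIAccess()`) delivers raw status/data bytes from a connected controller; decoding them is plain JavaScript. A sketch of the kind of decoder involved (the helper names are ours):

```javascript
// Sketch: decode the raw 3-byte messages the Web MIDI API delivers.
// In the browser, navigator.requestMIDIAccess() exposes inputs whose
// onmidimessage handlers receive these bytes; the decoder is plain JS.

const NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B'];

function noteName(pitch) {
  // MIDI note 60 is middle C, conventionally written C4.
  return NOTE_NAMES[pitch % 12] + (Math.floor(pitch / 12) - 1);
}

function decodeMidi([status, data1, data2]) {
  const command = status & 0xf0;
  const channel = status & 0x0f;
  if (command === 0x90 && data2 > 0) {
    return { type: 'noteOn', channel, note: noteName(data1), velocity: data2 };
  }
  // A note-on with velocity 0 is conventionally treated as a note-off.
  if (command === 0x80 || (command === 0x90 && data2 === 0)) {
    return { type: 'noteOff', channel, note: noteName(data1) };
  }
  return { type: 'other', channel };
}

// Example: note-on for middle C at velocity 100 on channel 1.
decodeMidi([0x90, 60, 100]); // { type: 'noteOn', channel: 0, note: 'C4', velocity: 100 }
```

Because the decoding is pure JavaScript, the same logic runs unchanged in any modern browser that supports Web MIDI.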
We then extended the application to support searching for and publishing models on the Factom blockchain.
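A Factom entry is essentially a list of external IDs plus a content payload. A sketch of the kind of model-index record such an entry might carry (the field names and chain label here are illustrative assumptions, not a published spec):

```javascript
// Sketch: a model-index record published as a Factom entry.
// A Factom entry is a list of external IDs plus a content payload;
// the specific field names and chain label are illustrative, not a spec.

function buildModelEntry(model) {
  const content = JSON.stringify({
    name: model.name,
    artist: model.artist,
    checkpointUrl: model.checkpointUrl, // where the trained model is hosted
    license: model.license
  });
  return {
    extIds: ['fidget-machine', 'model-index', model.artist], // used for lookup
    content
  };
}

const entry = buildModelEntry({
  name: 'melody-rnn-demo',
  artist: 'alice',
  checkpointUrl: 'https://example.com/checkpoints/melody_rnn',
  license: 'CC-BY-4.0'
});
```

Searching then amounts to scanning a chain's entries and filtering on the external IDs.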
The hardware we used during testing was an Akai Professional MPK Mini MKII, a 25-key portable USB MIDI keyboard.
Challenges we ran into
Initially we were interested in defining new derivative types of music sub-licenses, but we ended up sticking to what could be made under an open Creative Commons license.
One initial ambition was to use the provided music libraries to train new artist models, but Google Magenta only supports training via its Python library, which cannot run in the browser.
Another item we had to drop from our plan was performance recording. It does appear to be possible, but we ran out of time.
Accomplishments that we're proud of
Our app is fun and interesting to use despite being very simple. We feel this application of blockchain technology is novel and forward-looking.
We were able to build a proof of concept integrating decentralized identities, which allows users to authenticate while remaining pseudonymous.
What we learned
Google Magenta is awesome.
MIDI in the browser is not only possible but powerful.
Nobody working in music at the hackathon wants to use blockchain to manage non-free music licenses. The Music Modernization Act requires a central database, so blockchain need not be involved in commercial music.
Using the Factom blockchain without owning cryptocurrency is possible via the APIs provided by the distributed group of companies that support and maintain the Factom protocol. This lets an end user consume blockchain data as easily as any other API.
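Concretely, reading chain data this way comes down to POSTing a JSON-RPC request to a hosted factomd endpoint. A sketch, assuming a community-hosted node URL and the factomd v2 `chain-head` method:

```javascript
// Sketch: reading chain data through a hosted factomd JSON-RPC endpoint,
// so the user never needs to run a node or hold tokens.
// The endpoint URL below is an assumption; 'chain-head' is a factomd v2
// API method that returns the latest entry block of a chain.

function chainHeadRequest(chainId) {
  return {
    jsonrpc: '2.0',
    id: 1,
    method: 'chain-head',
    params: { chainid: chainId }
  };
}

const body = chainHeadRequest('aaaa000000000000000000000000000000000000000000000000000000000000');

// In the app this body would be POSTed to a hosted node, e.g.:
// fetch('https://api.factomd.net/v2', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(body)
// }).then(r => r.json());
```

Writing entries works the same way, with the hosted service covering the Entry Credit cost on the user's behalf.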
It's possible to use a MIDI controller with small, cheap 'stick' PC devices like the Asus Chromebit.
What's next for FidgetMachine
We're going to go home and play with our midi devices more :)
It would be interesting to reach out to existing communities that use music beyond the original creative process (like https://www.fretsonfire.org/, https://freesound.org/, https://archive.org/ ) to foster adoption of this style of publishing-to-blockchain.
Built With
- blockchain
- factom
- javascript
- magenta
- nextjs
- now
- react
- webmidi

