Why on earth did you make this?
Good question. My core inspiration came from my time at Algoraves, where performers make algorithmically themed music and visuals with tools like Tidal, FoxDot and Hydra, usually live-coded during the event. Giving creative power to the spectators is a little strange, as they in turn end up spectating their own performance.
What does it do?
The application is fronted by a little web server on Google App Engine, which calls a serverless function that puts note information on a Pub/Sub queue. That queue feeds a hosted music sequencer, which sends MIDI signals to a Korg Volca Keys, a three-voice analogue polyphonic synthesizer.
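As a rough sketch of the hand-off between the serverless function and the queue, the note information might be serialised like this before being published. The field names (`pitch`, `velocity`, `channel`) and the topic name are my assumptions, not the project's actual schema:

```javascript
// Build the Pub/Sub payload for a single note. Pub/Sub message data
// must be a Buffer of bytes, so we JSON-encode and wrap it.
function buildNoteMessage(pitch, velocity, channel) {
  const payload = { pitch, velocity, channel, at: Date.now() };
  return Buffer.from(JSON.stringify(payload));
}

// With the official client, publishing would look roughly like:
//   const { PubSub } = require('@google-cloud/pubsub');
//   await new PubSub().topic('notes')
//     .publishMessage({ data: buildNoteMessage(60, 100, 0) });
```

On the sequencer side, the subscriber just decodes the same JSON back out of the message buffer.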
Challenges I ran into
The biggest challenge was getting a Node app to send MIDI to the synth. MIDI is an ancient technology, and as much as I would have liked to use OSC, I would still have had to send note data to the synth, which only accepts MIDI. Ensuring I was sending note-offs to the synth, so notes didn't hang forever, was another challenge. The rest was the usual cloud-wrangling.
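For anyone facing the same note-off problem: a MIDI note-on is a three-byte message starting with status `0x90` (plus the channel in the low nibble), and the matching note-off starts with `0x80`. A minimal sketch, with the byte-building kept separate so it's easy to check (the `midi` npm package usage in the comment is one way to do it, not necessarily what this project uses):

```javascript
// Channel voice messages: 0x90 = note-on, 0x80 = note-off;
// the low nibble of the status byte selects the MIDI channel (0-15).
function noteOn(channel, note, velocity) {
  return [0x90 | (channel & 0x0f), note & 0x7f, velocity & 0x7f];
}
function noteOff(channel, note) {
  return [0x80 | (channel & 0x0f), note & 0x7f, 0];
}

// With the `midi` npm package, sending looks roughly like:
//   const midi = require('midi');
//   const out = new midi.Output();
//   out.openPort(0);                      // port number is setup-specific
//   out.sendMessage(noteOn(0, 60, 100));  // middle C down
//   setTimeout(() => out.sendMessage(noteOff(0, 60)), 500); // always release!
```

Every note-on needs a matching note-off, or the synth holds the voice open until it runs out of polyphony.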
What I learned
I learned, first-hand, why MIDI can be an arse. If I'd had more time I would have liked to send control signals to drive some of the synth's other parameters. Because I was focused on getting a working application, I didn't get a chance to configure the domain or implement any kind of voting system. I was also going to ingest this data and feed a little NN, but given that users enter single notes rather than sequences, it probably wouldn't be too interesting.
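The control signals mentioned above would be MIDI control-change messages, which use status byte `0xB0` and follow the same three-byte shape as note messages. A hedged sketch; which controller number maps to which knob is defined in the Volca Keys MIDI implementation chart, and the numbers below are purely illustrative:

```javascript
// 0xB0 = control change status; low nibble selects the MIDI channel.
// controller and value are both 7-bit (0-127).
function controlChange(channel, controller, value) {
  return [0xB0 | (channel & 0x0f), controller & 0x7f, value & 0x7f];
}

// e.g. out.sendMessage(controlChange(0, 44, 96));
// (controller 44 is illustrative -- check the synth's MIDI chart)
```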
What's next for Negative Fifths
Probably standardise it for other devices that take MIDI in?