Inspiration
This project is inspired by Alvin Lucier's experiments with music composition through EEG.
What it does
I take alpha-wave measurements from the Muse API on two separate Muse headbands and send that data over UDP to Max. In Max, each data stream is reduced to that participant's alpha band and scaled from a relative value to an integer range the synthesizer can use. Each value is then sent to its own Arduino, where the digital output is passed through a basic RC low-pass filter to smooth it into an analog control voltage. That voltage is routed into a Moog Werkstatt's voltage-controlled oscillator and voltage-controlled filter, which change dynamically with the meditating participants' level of focus.
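The scaling and filtering steps above can be sketched in code. This is a minimal Python sketch, not the actual Max patch or Arduino firmware; the input range of 0–1 for relative alpha power, the 8-bit output, and the function names are my assumptions for illustration:

```python
import math

def alpha_to_cv_int(relative_alpha, lo=0.0, hi=1.0, bits=8):
    """Map a relative alpha-band value to an integer suitable for a
    PWM/CV output.

    Assumes the relative band power arrives in [lo, hi] (an assumption;
    check the Muse API's actual range); clamps out-of-range readings,
    then scales to 0..(2**bits - 1).
    """
    clamped = max(lo, min(hi, relative_alpha))
    span = (2 ** bits) - 1
    return round((clamped - lo) / (hi - lo) * span)

def rc_cutoff_hz(r_ohms, c_farads):
    """Cutoff frequency of a simple RC low-pass: f_c = 1 / (2*pi*R*C).

    The cutoff should sit well below the Arduino's PWM frequency
    (about 490 Hz on most pins by default) so the digital pulses
    average out to a steady control voltage.
    """
    return 1.0 / (2 * math.pi * r_ohms * c_farads)
```

For example, a 4.7 kΩ resistor with a 10 µF capacitor (hypothetical values, not necessarily the ones on my board) gives a cutoff around 3.4 Hz, slow enough to smooth ~490 Hz PWM into a usable control voltage while still tracking changes in focus.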
Challenges I ran into
I seem to have fried an Arduino, which no longer registers on my laptop's serial ports. I spent a lot of time trying to revive it before simply getting new Arduinos.
What's next for MindSynth (Analog)
I would like to loop multiple recordings of minds from the event in order to create a sound portrait of the minds of the hackathon's participants.