Inspiration
1) Got to play with expensive hardware
2) Machine learning is cool
3) Processing is also cool
What it does
When the headband wearer blinks, the visualization turns red/yellow/orange. When the user is sitting still and not blinking, the visualization is pink/purple/blue.
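The color mapping described above can be sketched as a small function. This is a hypothetical illustration, not the actual Processing sketch: the palette values and the `pick_color` helper are assumptions.

```python
import random

# Hypothetical palettes matching the write-up: warm colors on a blink,
# cool colors when the wearer is still and not blinking.
WARM = [(255, 0, 0), (255, 255, 0), (255, 165, 0)]    # red / yellow / orange
COOL = [(255, 105, 180), (128, 0, 128), (0, 0, 255)]  # pink / purple / blue

def pick_color(blink_detected: bool):
    # Select a random color from the warm palette on a blink, cool otherwise
    palette = WARM if blink_detected else COOL
    return random.choice(palette)
```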
How I built it
The Muse headband has Bluetooth support, so I pulled data from it using Muse's muse-io and muse-player command-line tools. I streamed the incoming data to a Processing sketch that parses it and wraps it in an OSC message. Using Wekinator, a machine learning tool, the OSC messages are mapped to corresponding outputs (classifiers); the more training samples we add, the more accurate the visualizations become.
Challenges I ran into
While expensive, the sensors aren't actually THAT accurate. I had to take about 20,000 samples before the classification was even remotely accurate. Taking thousands of samples from multiple people helped with this issue: the program was able to continuously learn what blinking and non-blinking brain waves looked like.
What I learned
Machine learning is an extremely useful tool for artistic applications as well as technical ones.
Built With
- muse
- processing
- wekinator