I saw a NerdKits tutorial on YouTube for a robotic xylophone and was really inspired by the mechanics. Then I thought it would be really cool to integrate brain-signal input to alter what is played on the xylophone.

How it works

I am currently building a frame to hold everything in place. Essentially, the xylophone moves along a linear rail driven by a stepper motor with a belt-and-pulley system. The number of steps is programmed so the rail stops at the right position for each note to be played. Once the rail stops, one of the two servo motors swings 20-30 degrees and strikes the note.
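The step-count idea above can be sketched as a small helper that converts a target bar into a signed stepper move. The bar spacing and steps-per-millimeter values here are placeholders I made up for illustration, not measurements from the actual build:

```python
# Sketch of the note-positioning logic for the belt-and-pulley rail.
STEPS_PER_MM = 10      # hypothetical stepper resolution after the pulley ratio
BAR_SPACING_MM = 25.0  # hypothetical center-to-center distance between bars

def steps_to_note(current_note: int, target_note: int) -> int:
    """Signed number of steps to move the xylophone from one bar to another."""
    return round((target_note - current_note) * BAR_SPACING_MM * STEPS_PER_MM)

# Moving from bar 0 to bar 4, then back again:
print(steps_to_note(0, 4))  # 1000
print(steps_to_note(4, 0))  # -1000
```

Keeping the move signed makes it easy to track the current position and step in either direction along the rail.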

The brain signals are output to a serial monitor once per second, giving three values: signal strength, meditation, and attention.
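A minimal sketch of reading one of those serial lines, assuming a simple comma-separated format like `signal,meditation,attention`; the real headset's serial protocol may differ:

```python
# Parse one once-per-second reading from the serial monitor.
# Assumed format: "signal,meditation,attention" (comma-separated integers).
def parse_reading(line: str) -> dict:
    signal, meditation, attention = (int(v) for v in line.strip().split(","))
    return {"signal": signal, "meditation": meditation, "attention": attention}

print(parse_reading("78,60,35"))
# {'signal': 78, 'meditation': 60, 'attention': 35}
```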

These data points could be thresholded after filtering out bad readings (i.e., no signal strength): if the meditation value outweighs the attention value, the BPM (beats per minute) of the programmed song decreases, and vice versa.
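The tempo rule above might look like this in code. The "no signal" sentinel, the BPM step size, and the clamping range are all assumptions for the sketch, not tuned constants from the project:

```python
# Sketch of the meditation-vs-attention tempo rule.
MIN_BPM, MAX_BPM = 40, 200
BPM_STEP = 5  # hypothetical amount to nudge the tempo per reading

def update_bpm(bpm: int, signal: int, meditation: int, attention: int) -> int:
    if signal == 0:               # assumed "no signal" value; discard the reading
        return bpm
    if meditation > attention:    # calmer reading -> slow the song down
        bpm -= BPM_STEP
    elif attention > meditation:  # more focused reading -> speed it up
        bpm += BPM_STEP
    return max(MIN_BPM, min(MAX_BPM, bpm))

print(update_bpm(120, 80, 70, 40))  # 115
print(update_bpm(120, 0, 70, 40))   # 120 (bad reading ignored)
```

Clamping the result keeps a long run of one-sided readings from pushing the song to an unplayable tempo.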

Challenges I ran into

Piecing all the mechanical parts together.

Accomplishments that I'm proud of

What I learned

What's next for Robotic xylophone and EEG

Built With
