We wanted to experiment with the Muse Headband and see whether we could use its outputs to play a game.
What it does
Play a game of Pong using only head gestures, which are recorded by the Muse Headband.
How we built it
Extracting output from Muse: To record real-time data from the Muse, we redirected the headband's screen dump to the stdin of a Python script, which kept only the accelerometer readings. Each snapshot of this data was then compared with the previous snapshot to detect the direction of movement (left, right, or none).
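The snapshot comparison could be sketched roughly like this (a minimal illustration under our own assumptions; the threshold value and the choice of the x-axis for left/right tilt are hypothetical, not part of the Muse output format):

```python
# Hypothetical sketch: classify head movement by comparing the x-axis
# value of two consecutive accelerometer snapshots.
THRESHOLD = 0.05  # assumed minimum change in x-acceleration to count as a gesture

def detect_direction(prev_x, curr_x, threshold=THRESHOLD):
    """Return 'left', 'right', or 'none' from two accelerometer snapshots."""
    delta = curr_x - prev_x
    if delta > threshold:
        return "right"
    if delta < -threshold:
        return "left"
    return "none"
```

A small dead zone like this helps ignore sensor jitter, which we found the headband produced constantly.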
Connecting Muse outputs as a game controller: We mapped the detected movement direction to the game's paddle controls.
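The mapping itself can be illustrated with a small sketch (names, screen dimensions, and speed are our own placeholder values, not the project's actual constants):

```python
# Hypothetical mapping from a detected head movement to paddle motion,
# with the paddle clamped to the screen bounds.
SCREEN_WIDTH = 640
PADDLE_WIDTH = 80
PADDLE_SPEED = 8  # pixels moved per detected gesture

def move_paddle(x, direction):
    """Return the new paddle x-position for a detected direction."""
    if direction == "left":
        x -= PADDLE_SPEED
    elif direction == "right":
        x += PADDLE_SPEED
    return max(0, min(SCREEN_WIDTH - PADDLE_WIDTH, x))
```

In the pygame version, a function like this would run once per frame, replacing the keyboard input a Pong game would normally read.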
Challenges we ran into
Some of the Muse headband's outputs were not recorded successfully or were very sensitive to noise, so we were limited in which inputs we could use to control our game. It also took us some time to figure out which outputs to use and what the data signified. At first we built the game in Unity, but we could not establish a two-way TCP connection between our Unity game and the Python script that processed the data. We then tried Muse's iOS plugin, but both the simulator and the app failed to pair with the headband. So we made the final switch to implement our game with pygame at 1 a.m.
Accomplishments that we're proud of
After trying several ways to connect the Muse output to our game's controls, we made our final switch of game implementation at 1 a.m. and still managed to finish the project. Also, none of us had ever used the Muse headband before, so setting it up was a challenge in itself.
What we learned
How to parse the raw data from the Muse headband into a quantifiable value, and how to make a game using pygame.
What's next for MusePong
Our original idea was to use the EEG outputs (brainwave activity) and the TensorFlow machine learning API to train the game controller to recognize when we think "left" or "right" in order to control the game. We would like to implement this next.