Why the Muse?

We saw the Muse among the available hardware and suspected that its advertised application - meditation - couldn't be the only thing it could do. When we started looking at the raw data coming off it, we realized we could use it to identify when a person had moved their eyes! One of the many possible applications we saw was rudimentary eye-tracking, which could let a quadriplegic or otherwise motor-impaired individual control a computer.

Where did we go from there?

We decided to build a basic game controlled by eye movement, both as a proof of concept and as a possible hands-free entertainment option. The Muse SDK doesn't include anything that tracks eye movement (it mostly provides access to the headset's data), so we knew we would have to build the eye-tracking code ourselves from the raw Muse data. We settled on a simple game that generates a maze and accepts just four inputs - left, right, up, and down - to move a ball through the maze one square at a time.
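
Eye movements show up in raw EEG as large, slow deflections on the frontal channels - the same artifact that EEG work usually filters out. As a rough illustration of the kind of detector this enables (not our exact code; the channel roles, polarity convention, and threshold are assumptions that per-user calibration would supply):

```python
def classify_eye_movement(af7_window, af8_window, baseline, threshold):
    """Classify one window of raw frontal-channel samples.

    af7_window / af8_window: raw samples from the left / right frontal
    electrodes. Returns 'left', 'right', 'up', 'down', or None.
    """
    # Signed peak deflection from the resting baseline on each side.
    d_left = max(af7_window, key=lambda v: abs(v - baseline)) - baseline
    d_right = max(af8_window, key=lambda v: abs(v - baseline)) - baseline

    if abs(d_left) < threshold and abs(d_right) < threshold:
        return None  # nothing strong enough to count as a movement

    # Assumption: horizontal saccades deflect the two sides with opposite
    # polarity (the eye acts as a dipole), while vertical movements
    # deflect both sides the same way.
    if d_left * d_right < 0:
        return "left" if d_left > 0 else "right"
    return "up" if d_left > 0 else "down"
```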

How did we build it?

We used Python and the Muse SDK to gather data and interpret signals - either the four directional inputs listed above, or a blink (which we used to close our testing window). The maze is generated with Wilson's algorithm and rendered with PyGame.
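
Wilson's algorithm builds a maze as a uniform spanning tree of the grid: seed the maze with one cell, then repeatedly random-walk from an unvisited cell until the walk hits the maze, erase any loops, and carve the surviving path. A compact sketch of the algorithm (our data structures and rendering differ):

```python
import random

def wilson_maze(width, height):
    """Return a dict mapping each cell (x, y) to the set of cells it
    connects to (i.e. cells with no wall between them)."""
    cells = [(x, y) for y in range(height) for x in range(width)]
    connections = {c: set() for c in cells}
    in_maze = {random.choice(cells)}  # seed the tree with one cell

    def neighbours(cell):
        x, y = cell
        return [(nx, ny)
                for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1))
                if 0 <= nx < width and 0 <= ny < height]

    for start in cells:
        if start in in_maze:
            continue
        # Random walk until we reach the maze. Recording only the *last*
        # exit taken from each cell erases loops automatically.
        step, cell = {}, start
        while cell not in in_maze:
            step[cell] = random.choice(neighbours(cell))
            cell = step[cell]
        # Retrace the loop-erased path and carve it into the maze.
        cell = start
        while cell not in in_maze:
            nxt = step[cell]
            connections[cell].add(nxt)
            connections[nxt].add(cell)
            in_maze.add(cell)
            cell = nxt
    return connections
```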

Biggest Challenges

The Muse is, in many ways, a finicky piece of hardware. The nature of the data we were reading meant that every sensor on the Muse had to make consistent contact. Furthermore, the headset had to be calibrated for each individual - and sometimes we found different values each time the same person put it on. Most of our testing was done by one person, whose most reliable calibration values we found early on. After creating a module that gave us basic inputs from the Muse and integrating it with the maze, we built a calibration module that let a second member of our team put on the headset, calibrate, and solve a maze in under five minutes.
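
In essence, calibration means recording how strongly each movement registers for the wearer and deriving detection thresholds from that. A hypothetical sketch of such a pass, where read_window and the 0.6 scaling stand in for the real capture code and tuning:

```python
def calibrate(read_window, directions=("left", "right", "up", "down"),
              repetitions=3):
    """Prompt the wearer through each movement and return per-direction
    thresholds. `read_window` returns one window of raw samples."""
    thresholds = {}
    for direction in directions:
        peaks = []
        for _ in range(repetitions):
            input(f"Look {direction}, then press Enter...")
            window = read_window()
            baseline = sum(window) / len(window)
            peaks.append(max(abs(v - baseline) for v in window))
        # Set the threshold below the weakest recorded peak, so every
        # movement the wearer actually made would have registered.
        thresholds[direction] = 0.6 * min(peaks)
    return thresholds
```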

What do we think about what we've made?

It worked! It worked! It worked! We like it. Our calibration worked, command input was fairly reliable, and we even had a game to play with what we developed. In the image gallery, you can see the stages of solving a maze, as well as a window that shows each control as it is input (the green square moves around the box). We're excited, and proud of it.

What we learned

Working with hardware is hard. Very hard. Every piece of code has to be bigger and more unwieldy to account for possible input errors, which can otherwise throw off control of our maze - or of any other application we might attach to the headset.
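
For example, a single noisy window shouldn't be allowed to move the ball. One common guard (the repetition count here is a made-up tuning value) is to require several consecutive windows to agree before a command fires:

```python
class Debouncer:
    """Emit a command only after it is seen on `required` windows in a row."""

    def __init__(self, required=3):
        self.required = required
        self.last = None
        self.count = 0

    def feed(self, classification):
        # Any disagreement (or an empty window) restarts the count.
        if classification is None or classification != self.last:
            self.last = classification
            self.count = 1
            return None
        self.count += 1
        # Fire exactly once per sustained movement.
        return classification if self.count == self.required else None
```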

What's next?

Overall, to continue this, we would love to improve the reliability of calibration and control. We would also like to detect absolute gaze position, which should be possible on a large enough surface; that would be a wonderful addition to our module. Finally, we would like to integrate it with other games, or with more useful programs: a speech synthesizer, other entertainment options, finer computer control.

Built With

Python, PyGame, Muse SDK