Inspiration

We had a feeling that you could measure hemispheric activity with a simple consumer headset, and we wanted to see what we could do with it. Our inspiration came from the surprisingly large number of people who said it couldn't be done.

What it does

It measures electroencephalogram (EEG) activity from four sensors on the forehead and splits the signal into left- and right-hemisphere data, which our algorithm converts into a left-right visualization in real time. Users can then use biofeedback to learn to control their brain activity, all while having no idea how they are doing it. It is an easy-to-use, nearly hands-free challenge that lends a purpose to all that staring at your phone you already do.
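The left-right split could be sketched roughly like this. This is a minimal illustration in plain Java, not the project's actual code: the channel names, the use of mean squared amplitude as "activity", and the normalized asymmetry formula are all our assumptions.

```java
// Hypothetical sketch of turning four channels into one left-right value.
// Channel grouping and the asymmetry formula are illustrative assumptions.
public class HemisphereBalance {
    // Mean power of one channel window: average of squared samples.
    static double bandPower(double[] samples) {
        double sum = 0.0;
        for (double s : samples) sum += s * s;
        return sum / samples.length;
    }

    // Asymmetry index in [-1, 1]: negative means left-dominant,
    // positive means right-dominant. A value like this could drive
    // the real-time left-right visualization.
    static double asymmetry(double[] leftOuter, double[] leftInner,
                            double[] rightInner, double[] rightOuter) {
        double left = bandPower(leftOuter) + bandPower(leftInner);
        double right = bandPower(rightInner) + bandPower(rightOuter);
        return (right - left) / (right + left);
    }

    public static void main(String[] args) {
        double[] quiet = {0.1, -0.1, 0.1, -0.1};
        double[] active = {0.4, -0.4, 0.4, -0.4};
        // Stronger activity on the right-side pair pushes the index positive.
        System.out.println(asymmetry(quiet, quiet, active, active));
    }
}
```

In practice the samples would come from the headset in short windows, with the index smoothed over time before being drawn on screen.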

How we built it

We used the Muse SDK for Android to create a platform for interfacing with the Muse headset through a lightweight app.

Challenges we ran into

Wrangling the software into talking to the headset.

Accomplishments that we're proud of

We implemented a visual biofeedback system for your brain, and then we turned it into a game!

What we learned

One of us basically learned Android overnight.

What's next for Look Ma No Hands

Our marketing team insists we can't tell anyone. But it's retro.
