Inspiration

The neurological explorations of Oliver Sacks, intersecting with the rapid advances in computer graphics.

What it does

Provides an abstract mapping of the body's electrical impulses into a VR environment.

How I built it

OpenBCI --> Processing --> Python --> Firebase --> Three.js --> Google Cardboard (+ Caffeine)
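The Python-to-Firebase hop in that chain can be sketched with Firebase's REST API, which accepts a PUT of JSON at any `<database-url>/<path>.json`. Everything here is illustrative: the URL, the `/eeg/latest` path, and the payload field names are assumptions, not the project's actual schema.

```python
import json
import time
import urllib.request

# Placeholder database URL -- a real Firebase project URL would go here.
FIREBASE_URL = "https://example-project.firebaseio.com"

def to_payload(channels, timestamp=None):
    """Pack one OpenBCI sample (a list of per-channel voltages) into the
    JSON document a browser-side Three.js client could poll or subscribe to."""
    return {
        "t": timestamp if timestamp is not None else time.time(),
        "channels": list(channels),
    }

def push_sample(channels):
    """PUT the latest sample to Firebase over its REST API (network call)."""
    body = json.dumps(to_payload(channels)).encode("utf-8")
    req = urllib.request.Request(
        FIREBASE_URL + "/eeg/latest.json",
        data=body,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

# Build (but don't send) a payload for a fake four-channel sample.
payload = to_payload([12.5, -3.1, 0.0, 7.8], timestamp=42.0)
print(json.dumps(payload))
```

Overwriting a single "latest" key keeps the browser's read cheap at the cost of dropping samples; a production version would batch or stream instead.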

Challenges I ran into

The crotchety interactions of Processing and Python, the massive amount of data generated, and the spring physics involved in mapping the signals to a 3-dimensional space.
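The spring-physics mapping can be sketched as a damped spring per axis: each incoming signal value sets a target position, and the spring pulls the rendered point toward it so the visualization moves smoothly rather than jittering. The constants (stiffness, damping, timestep) below are illustrative, not the project's actual values.

```python
def spring_step(pos, vel, target, k=40.0, c=8.0, dt=1 / 60):
    """One semi-implicit Euler step of a damped spring along each axis.

    pos, vel, target are 3-element lists; k is stiffness, c is damping,
    dt is the frame time. Returns the updated (pos, vel).
    """
    new_pos, new_vel = [], []
    for p, v, t in zip(pos, vel, target):
        a = -k * (p - t) - c * v   # spring pull plus velocity damping
        v = v + a * dt             # update velocity first (semi-implicit)
        p = p + v * dt             # then position, for better stability
        new_vel.append(v)
        new_pos.append(p)
    return new_pos, new_vel

# Drive a point from the origin toward a target derived from a fake signal.
pos, vel = [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]
target = [1.0, 0.5, -0.25]
for _ in range(600):  # ~10 seconds at 60 fps
    pos, vel = spring_step(pos, vel, target)
print(pos)
```

With these constants the spring is underdamped, so the point overshoots slightly and settles, which reads as organic motion in VR rather than a hard snap.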

Accomplishments that I'm proud of

Translating a fundamental electrical language of the body into an abstract space in virtual reality.

What I learned

Patience, perseverance, and a lot of different coding conventions.

What's next for Eye Sing The Body Electric

Future work could involve lower-latency measurement, more responsive visualization, and a cleaner framework for transmitting data from the OpenBCI board to the browser.

Built With

OpenBCI, Processing, Python, Firebase, Three.js, Google Cardboard