Inspiration

It's difficult to understand just how clouded our minds might be. Our sense of what a healthy mind feels like can fade as we grow accustomed to mental clutter and stress. One goal of this project is to show people what their mental state actually looks like, giving them a perspective free from personal bias. With an objective measure of our mental state, we can see just how much room there is to improve. Our other goal is to actually improve our users' mental health by leading them through a guided meditation with unparalleled feedback in virtual reality.

What it does

The OpenBCI EEG brain sensor worn by the user transmits data in real time. This data continuously updates the VR experience presented to the user through an HTC Vive. When a user's brain is highly active and stressed, their world is cluttered with pulsating geometric shapes. As they calm down, the shapes shrink into dots and cease to pulsate. The user is guided through a meditation by text that arises in the VR world, and the guidance also updates in real time based on the EEG signals, giving the user personalized advice as their meditation progresses.

How we built it

The system consists of three parts.

There's the hardware/firmware layer, which consists of the OpenBCI Cyton EEG headset and board. The headset measures voltage differences between points on the scalp and transmits them to a computer over Bluetooth for analysis. It took a fair amount of trial and error (rewiring, flashing different versions of the firmware, etc.), but ultimately we got the headset to stream reasonable data.
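
We won't reproduce our acquisition code here, but a minimal sketch of reading raw samples from a Cyton with the open-source BrainFlow library (one common way to talk to the board; the serial port is a placeholder for wherever your dongle shows up) looks roughly like this:

```python
import time

from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"  # placeholder; e.g. "COM3" on Windows

board = BoardShim(BoardIds.CYTON_BOARD, params)
board.prepare_session()
board.start_stream()
time.sleep(2)  # let a couple of seconds of samples accumulate

data = board.get_board_data()  # numpy array: rows = channels, cols = samples
eeg_rows = BoardShim.get_eeg_channels(BoardIds.CYTON_BOARD)
eeg = data[eeg_rows, :]        # the Cyton's 8 EEG channels, in microvolts

board.stop_stream()
board.release_session()
```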

The second layer is data processing. In Python, we wrote signal-processing code to transform the voltage data into intensities in each of the prominent neural frequency bands: alpha, beta, gamma, and theta. Each band is loosely associated with a different state of mind; for instance, the intensity of the alpha band rises when you're calm and relaxed.
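
The exact band edges and windowing are judgment calls, but the core of the transformation is a power spectral density per channel, integrated over each band. A condensed sketch with NumPy/SciPy (the constants are illustrative, not our exact values):

```python
import numpy as np
from scipy.signal import welch

FS = 250  # the Cyton samples at 250 Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_intensities(eeg, fs=FS):
    """eeg: (channels x samples) window of voltages -> mean intensity per band."""
    # Welch's method gives a smoothed power spectral density for each channel
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)
    intensities = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        # integrate the PSD over the band, then average across channels
        intensities[name] = np.trapz(psd[..., mask], freqs[mask], axis=-1).mean()
    return intensities
```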

A simple Flask server makes this data available to our webpage, which uses it to modulate the parameters of a VR scene built in A-Frame. Putting it all together, you get a visualization that evolves with the state of your mind.
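
The server side is only a few lines. A sketch of the idea (the endpoint name and JSON shape here are illustrative, not our exact API):

```python
from flask import Flask, jsonify

app = Flask(__name__)

# overwritten continuously by the signal-processing loop
latest = {"theta": 0.0, "alpha": 0.0, "beta": 0.0, "gamma": 0.0}

@app.route("/bands")
def bands():
    # the A-Frame page polls this and maps the values onto
    # shape size, pulse rate, and the guidance text
    return jsonify(latest)

if __name__ == "__main__":
    app.run(port=5000)
```

The page can then poll this endpoint every few hundred milliseconds and write the values into its A-Frame entities' attributes.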

Challenges we ran into

Our team consists of what were originally two separate teams who did not know each other at all. Seth, Brad, Stefan, and Ethan (undergraduates) came into the hackathon interested in exploring EEG and relating it to VR. Dmitrijs and Sabina (biomedical engineering PhDs) came into the hackathon with an interest in EEG as well. Bonding over that common interest, we decided to merge our groups. The process of team-forming and developing a common vision ultimately worked out very well, but it definitely added complexity at the beginning of the project that we had to work through.

The OpenBCI headset that we used did not come ready to use out of the box. We had to spend a ton of time wiring, programming, and testing it (repeat, repeat, repeat) until we could finally rely on it for live streaming. The OpenBCI software we found online was also riddled with issues that slowed us down as we got the EEG working on different computers, which was necessary because the first computer we installed the software on was not powerful enough to run our VR program.

Accomplishments that we're proud of

Processing raw EEG voltage data into frequency-band intensities.

Designing a virtual reality environment in A-Frame that is dynamic and changes in real-time.

Linking our EEG signals with our VR environment.

Creating an effective guided meditation with live-updating instructions.

What we learned

Flask, A-Frame, OpenBCI

What's next for MindSpaces

We want to improve the detail of our VR environment, home in on more accurate EEG data, and create different modes (different kinds of meditations with different environments).

Built With

Flask, A-Frame, OpenBCI, Python