Disorientation in space is a real problem that causes 'space sickness'. Symptoms can last for weeks and include nausea, headaches, and fatigue. This hampers an individual's productivity and can strain resources while on the International Space Station (ISS).

What it does

This mobile app uses LiDAR to detect wall boundaries inside the ISS, onto which a grid and virtual objects can be overlaid on the device's screen. The app is used in conjunction with a portable EEG device that reads brainwaves and provides biofeedback to help the user adjust their orientation while in space.
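The core feedback loop can be sketched as: given the user's orientation relative to the overlaid grid and an EEG-derived stress score, decide when to show a reorientation cue. This is a minimal illustrative sketch, not the actual app code; the `Orientation` type, the `reorientation_cue` function, and the thresholds are all hypothetical names chosen for the example.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Orientation:
    pitch: float  # degrees away from the grid's "floor" reference
    roll: float


def reorientation_cue(orientation: Orientation, stress_score: float,
                      angle_threshold: float = 30.0,
                      stress_threshold: float = 0.7) -> Optional[str]:
    """Return an AR overlay cue when the user is both misaligned
    with the reference grid and showing elevated stress on the EEG."""
    misaligned = (abs(orientation.pitch) > angle_threshold
                  or abs(orientation.roll) > angle_threshold)
    if misaligned and stress_score > stress_threshold:
        return "align_with_grid"  # e.g. highlight the grid's "down" direction
    return None  # no cue needed
```

In the real app the orientation would come from the device's IMU against the LiDAR-detected walls, and the stress score from the EEG biofeedback pipeline.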

How I built it

This app is meant to be built with AR- and EEG-compatible devices. For the purposes of our demo, we used the Interaxon Muse headset.

Challenges I ran into

Our primary challenges were developing tests for such an application and managing our time to effectively connect all of the devices into one app. For testing on Earth, we concluded that extreme environments such as the deep sea and high altitudes would work best.

Accomplishments that I'm proud of

We learned about EEG and possible uses for biofeedback, and researched localization and orientation in microgravity environments. We were able to take real-time EEG signals to show the effects of Earth's gravity on the brain, and demonstrate how these could be used as a baseline for comparison in space.
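One way to compare live signals against an Earth-recorded baseline is to compute band power (e.g. the 8–12 Hz alpha band, commonly tracked in Muse-based biofeedback) over a window of samples and take the ratio to the baseline. This is a hedged sketch using a naive DFT rather than any real device SDK; the function names, the 256 Hz sampling rate, and the band edges are assumptions for illustration.

```python
import math


def band_power(samples, fs, lo=8.0, hi=12.0):
    """Naive DFT power of one EEG channel window in the [lo, hi] Hz band."""
    n = len(samples)
    total = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            total += (re * re + im * im) / n
    return total


def relative_to_baseline(window, baseline_power, fs=256):
    """Ratio of current alpha power to the Earth-recorded baseline."""
    return band_power(window, fs) / baseline_power
```

A ratio drifting away from 1.0 in orbit would flag a change from the Earth baseline worth feeding back to the user; a production version would use an FFT library and artifact rejection rather than this direct DFT.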

What I learned

Applications like this one are relevant in other industries as well, such as medicine and mining.

What's next for Space EEG

Build a fully working prototype, and consider an extension that takes additional motor-signal input (EMG).
