We wanted to create something with real-world use that provided a social good. Looking at the hardware list, we saw the Muse headband and were inspired to use real-time brain-wave and stress-level data to alter a user's sense experience and mood, helping manage stress- and anxiety-related mental health problems.
What it does
Destress Cyborg uses the Muse headband to read the user's brain activity. If the user shows high levels of unproductive brain activity, Destress Cyborg blocks the things stressing them out from their view.
How we built it
We built Destress Cyborg by training our computer vision model on Google Cloud Compute Engine to more accurately detect objects like food and people. We then used brain-wave data to determine whether the user was stressed, and our program blurred those objects from the user's view accordingly. We deployed the project on a DragonBoard 410c so that it can run portably.
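The core loop described above can be sketched roughly as follows. This is a minimal illustration, not our actual pipeline: the stress threshold, box format, and the mean-fill "blur" (a crude stand-in for a proper Gaussian blur) are all assumptions made for the example, and the detection boxes would really come from the vision model.

```python
import numpy as np

STRESS_THRESHOLD = 0.7  # hypothetical normalized stress score in [0, 1]

def censor_stressors(frame, boxes, stress_score, threshold=STRESS_THRESHOLD):
    """If the stress score exceeds the threshold, obscure each detected
    region (x, y, w, h); otherwise return the frame untouched."""
    if stress_score < threshold:
        return frame
    out = frame.copy()
    for (x, y, w, h) in boxes:
        roi = out[y:y + h, x:x + w].astype(float)
        # Crude "blur": flatten the region to its per-channel mean color.
        out[y:y + h, x:x + w] = roi.mean(axis=(0, 1)).astype(frame.dtype)
    return out

# Synthetic demo: a noisy frame with one "detected" stressor box
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (240, 320, 3), dtype=np.uint8)
boxes = [(50, 50, 100, 80)]
calm = censor_stressors(frame, boxes, stress_score=0.3)      # unchanged
stressed = censor_stressors(frame, boxes, stress_score=0.9)  # box obscured
```

On the real device this runs per camera frame, with the stress score recomputed from the latest window of Muse data.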
Challenges we ran into
The biggest challenge we ran into was that Muse took down their SDK, so we were unable to integrate real-time data into our project; instead, we had to record brain-wave data and video separately and integrate the two afterward. It was also hard to determine whether the user was stressed or just concentrating. For example, beta waves indicate fast mental activity and complex thought processes, but they are also present during times of stress. It's not impossible to tease brain-wave data apart and figure out whether a user is thinking productively or destructively; it's just that we are not neuroscientists and don't have much practice doing so.
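The beta-wave ambiguity above is exactly why a single band-power number is a weak stress signal. A bare sketch of the kind of feature we were working from is below; the 256 Hz sampling rate and the 13-30 Hz beta band are standard assumptions, and as noted, a high score can mean stress or concentration.

```python
import numpy as np

FS = 256  # assumed Muse EEG sampling rate, Hz

def relative_beta_power(eeg, fs=FS, band=(13.0, 30.0)):
    """Fraction of total spectral power in the beta band (13-30 Hz).
    A crude proxy only: elevated beta accompanies both stress and
    focused thought, which is the ambiguity we ran into."""
    spectrum = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    total = spectrum[freqs > 0].sum()
    beta = spectrum[(freqs >= band[0]) & (freqs <= band[1])].sum()
    return beta / total if total > 0 else 0.0

# Demo: a pure 20 Hz tone sits inside the beta band, a 5 Hz tone does not
t = np.arange(FS * 2) / FS
beta_score = relative_beta_power(np.sin(2 * np.pi * 20 * t))   # near 1.0
theta_score = relative_beta_power(np.sin(2 * np.pi * 5 * t))   # near 0.0
```

Disentangling "productive" from "destructive" beta would take more features (other bands, asymmetry, context) than this single ratio.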
Accomplishments that we're proud of
We are proud of the progress we made and of our attempt to build something so cool and Black Mirror-ish. We also think it's pretty cool that we were trying to create something to improve users' mental health and wellness, something that could be a significant improvement to people's lives!
What we learned
We learned that neuroscience is surprisingly hard to understand, that computer vision is pretty heckin' cool, and that you should always (ALWAYS) appreciate a good SDK.
What's next for Destress Cyborg
We think it might be worthwhile to think smaller in the future, and by that we mean turning our focus from the real world to the virtual one. Once we figure out how to stream real-time data, we have ideas for a program that tracks a user's stress as they work or browse social media and then alters their desktop environment to manage that stress level. To do that, we would have to work out webpage tracking, user-data tracking, eye tracking, and accurately reading brain-activity data. From there, we could branch out into AR kits for real-world, real-time stress censoring!