We wanted to build a system that detects emotions and counteracts what we perceive as "negative emotions" with music.
What it does
The Muse Headband SDK gives us EEG and accelerometer data, which we compare against self-tested baseline values to determine the user's emotion. The YouTube SDK is then used to create a custom view that loads songs based on keywords related to that emotion. The user can also share their emotions using the share button.
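A minimal sketch of the pipeline described above: compare current EEG band values against per-user baselines recorded during self-testing, map the result to an emotion, and turn that emotion into a search keyword for the YouTube view. The band names, thresholds, and keyword table here are illustrative assumptions, not the Muse SDK's or our app's actual API.

```python
# Hypothetical sketch of the emotion -> music-keyword pipeline.
# BASELINES would come from self-testing sessions with a calm user;
# the values and the margin below are made-up for illustration.

BASELINES = {"alpha": 0.45, "beta": 0.30}

EMOTION_KEYWORDS = {
    "stressed": "relaxing music",
    "sad": "upbeat music",
    "calm": "ambient music",
}

def classify(alpha, beta, baselines=BASELINES, margin=0.10):
    """Very rough rule-based classifier over relative band power."""
    if beta > baselines["beta"] + margin:
        return "stressed"   # elevated beta as a crude proxy for tension
    if alpha < baselines["alpha"] - margin:
        return "sad"        # suppressed alpha as a crude proxy for low mood
    return "calm"

def search_query(alpha, beta):
    """Keyword we would feed to the YouTube search/player view."""
    return EMOTION_KEYWORDS[classify(alpha, beta)]

print(search_query(0.44, 0.55))  # -> "relaxing music"
```

In the real app the classification would run continuously as new samples arrive, but the baseline-comparison idea is the same.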
How we built it
We built it by integrating the Muse Headband SDK and the YouTube SDK. A really great example app bundled with the Muse SDK helped us learn how to use its libraries to get the data we needed.
Challenges we ran into
Reading data from the Muse Headband and analyzing it to derive values that actually help us gauge emotions.
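Part of the analysis challenge is that raw EEG readings are noisy, so a reading has to be smoothed before it can be meaningfully compared against a baseline. One simple approach is a short moving average; the window size here is an assumption for illustration, not a value from the Muse SDK.

```python
from collections import deque

# Hypothetical smoothing step: average the last few raw readings so a
# single noisy sample doesn't flip the detected emotion.

class MovingAverage:
    def __init__(self, window=4):
        self.samples = deque(maxlen=window)  # drops oldest automatically

    def add(self, value):
        """Record a new raw reading and return the smoothed value."""
        self.samples.append(value)
        return sum(self.samples) / len(self.samples)

smoother = MovingAverage(window=4)
for raw in [0.2, 0.8, 0.5, 0.5]:
    smoothed = smoother.add(raw)
print(round(smoothed, 2))  # -> 0.5
```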
Accomplishments that we're proud of
Learning how to obtain sample data by testing human subjects. Integrating the YouTube SDK and playing videos based on the user's emotion.
What we learned
We learned a lot about how EEG data is used by researchers.
What's next for Brainalyze
Expanding the app to include a service that analyzes brain waves while the user sleeps, and another version of the application that watches for variations in the EEG waves to detect whether the user is having a panic attack or something similar, so the data can be sent to their therapist and loved ones. Doing that would require a lot of testing on our part to collect enough data to compare a potential user's readings against. Ultimately, the biggest thing Brainalyze could do is monitor the brain and its health completely and efficiently.