Emotions that we keep bottled up and don’t express – whether by choice or because we find it difficult – can threaten our emotional health down the road.
Some of us simply aren’t good at expressing our emotions, or have special needs that make it especially difficult, so others struggle to relate to us (autism). Some of us are far from home and friends for long periods, so we want someone to check in on how we’re feeling (astronauts). And all of us go through rough patches at some point in our lives, so it would be nice to have a friend who checks in on how we feel (psychiatric patients).
All of these factors can eventually lead to a violent outburst or breakdown, and can make us sick. We are convinced that emotional health is just as important as physical health – and in some ways even more so. The true remedy is a network of people – friends and family who check in on how we’re feeling – but that is a privilege not all of us have, and certainly not all the time. What if technology could be the friend that understands how we feel?
What it does
(Physiological mechanism) Whenever we think or feel, our neurons produce tiny electrical signals, and we can detect these signals simply by placing electrodes on the scalp. This is called electroencephalography, or EEG, and it paints a picture of what is going on inside the brain. Because it is such a powerful tool, it is a gold standard in neuroscience research, from sleep studies to the diagnosis of neurological diseases and comatose states.
The device we chose is the Muse Headband, a portable 7-lead EEG monitor that slips on comfortably, fits most heads, and can be worn for long periods. By processing the EEG stream that the Muse generates, we can understand – or at least broadly categorize – what you’re feeling.
The EEG measures voltage fluctuations, which vary in amplitude and frequency. We look specifically at the gamma band, a frequency range associated with emotional evaluation.
How we built it
(Data Transfer) The real-time data from the Muse Headband is streamed to a laptop over Bluetooth. We used Node.js to “chunk” the stream into 30-second segments.
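The chunking step can be sketched roughly as below (shown in Python for illustration; our actual implementation was in Node.js, and the 256 Hz sample rate is an assumption based on the Muse’s documented EEG rate):

```python
# Illustrative sketch of the 30-second chunking step.
# ASSUMPTION: the Muse streams EEG samples at 256 Hz.
SAMPLE_RATE_HZ = 256
CHUNK_SECONDS = 30
CHUNK_SIZE = SAMPLE_RATE_HZ * CHUNK_SECONDS  # samples per 30 s chunk

def chunk_stream(samples, chunk_size=CHUNK_SIZE):
    """Group a list of EEG samples into fixed-size segments,
    dropping any trailing partial segment."""
    return [samples[i:i + chunk_size]
            for i in range(0, len(samples) - chunk_size + 1, chunk_size)]
```

Each 30-second chunk then moves on to the signal-processing stage as one unit.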
(Signal Processing) We retrieve the EEG data from the Muse band and extract the raw microvoltage measurements. The Muse SDK applies band filtering to the dataset to remove extreme values and noise. We then run a Fast Fourier Transform (FFT), which gives us the amplitude of the signal at each frequency. Finally, we average the power spectral density (PSD) over the gamma-band frequency range, yielding a single gamma PSD value per chunk.
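The FFT-and-average step looks roughly like this (an illustrative Python sketch using a plain DFT; we actually used MATLAB toolkits, and the 30–44 Hz gamma range here is an assumed definition, since exact band boundaries vary by study):

```python
import cmath

def gamma_band_power(samples, fs=256.0, band=(30.0, 44.0)):
    """Estimate the average power spectral density in the gamma band.
    Uses a plain DFT as a stand-in for the FFT step; `band` is an
    ASSUMED gamma range in Hz (definitions vary in the literature)."""
    n = len(samples)
    powers = []
    for k in range(n // 2 + 1):
        freq = k * fs / n  # frequency of DFT bin k
        if band[0] <= freq <= band[1]:
            coeff = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                        for i, x in enumerate(samples))
            powers.append(abs(coeff) ** 2 / (fs * n))  # periodogram PSD
    return sum(powers) / len(powers) if powers else 0.0
```

A real pipeline would use a library FFT (O(n log n)) rather than this O(n²) DFT, but the gamma-band averaging logic is the same.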
(Baseline Establishment) Peer-reviewed studies have found that high gamma PSD is associated with generally unpleasant emotions (low valence), and low PSD with generally positive emotions (high valence). Because PSD values vary slightly from person to person, we establish a baseline for each subject. The emotions studied (happiness, anger, confusion, disgust) were elicited by showing video clips. Future PSD values are then categorized into positive and negative emotions by statistical comparison against that baseline.
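One way to express that baseline comparison is the sketch below (illustrative Python; the one-standard-deviation threshold `k` is our own example choice, not a value taken from the literature):

```python
def fit_baseline(psd_values):
    """Compute per-subject baseline statistics from calibration chunks."""
    mean = sum(psd_values) / len(psd_values)
    var = sum((p - mean) ** 2 for p in psd_values) / len(psd_values)
    return mean, var ** 0.5

def classify(psd, mean, std, k=1.0):
    """Label a new gamma PSD value relative to the subject's baseline.
    Higher-than-baseline gamma PSD maps to negative (low-valence) emotion,
    lower to positive. The threshold width k is an illustrative choice."""
    if psd > mean + k * std:
        return "negative"
    if psd < mean - k * std:
        return "positive"
    return "neutral"
```

In practice the baseline would be fit on the calibration chunks recorded while the subject watches the emotion-eliciting video clips.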
(Server-client Model) After the signals are processed, the data is transferred to an AWS EC2 server. The server communicates with the client – in this scenario, the mobile app. We created the Emicus.tech domain through our sponsor, .tech.
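The message the client exchanges with the server might look like the sketch below (illustrative Python; the field names are hypothetical, not the actual Emicus schema):

```python
import json
import time

def make_emotion_payload(user_id, label, gamma_psd):
    """Build a JSON message for the EC2 server.
    NOTE: field names here are illustrative assumptions,
    not the actual Emicus wire format."""
    return json.dumps({
        "user": user_id,
        "emotion": label,       # e.g. "positive" / "negative" / "neutral"
        "gamma_psd": gamma_psd,
        "timestamp": int(time.time()),
    })
```

The mobile app would then poll or subscribe to the server to display the latest classified emotion.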
Challenges we ran into
One of the challenges we ran into was the signal-processing aspect. The Muse Headband actually streams many more variables, including accelerometer data, so isolating exactly what we needed was an initial challenge. Another major challenge was interpreting the EEG data; fortunately, we found papers and MATLAB toolkits that let us perform complex calculations such as the FFT and PSD estimation with relative ease.
We were fortunate enough to be allowed to use the Muse Headband and Amazon Echo from MLH.
Accomplishments that we're proud of
What we learned
We learned that physiological phenomena are extremely complex, yet there are incredible tools out there – hardware, functions, and toolboxes – that let us peek into the inner workings of human physiology.
What's next for EMICUS
In the immediate term (days), we want to make Emicus much more accurate. We found a few research labs that publicly share EEG data categorized by emotion; using those datasets with machine learning could drastically improve our accuracy and sensitivity. However, gaining access required returning a signed confidentiality form, which was not possible within the weekend’s limited timeframe. And as the headband becomes more widespread, we will gain access to more EEG data generated specifically by the Muse.
In the short term, we could extend our mobile app to a second stakeholder. For autistic children or psychiatric patients, for instance, it would be important to also develop a monitoring application on the guardian or caregiver side, delivered over a secure channel.
We could also integrate this technology into a smart-home environment: based on the user’s mood, the physical environment could adapt spontaneously (lighting, music, etc.).
In the long term, we envision that EEG sensors will improve significantly, allowing for a much lower-profile device that is even more user-friendly, with higher sensitivity and accuracy.