My brainwaves. Yup, it's pretty cool, we put them through a neural network too.
Basically, we took the signals from a real neural network (a brain), stuck them in an artificial neural network (AI), and did some wicked stuff with it.
According to the US Census Bureau, an estimated 129 million kids across the world may have ADHD, a condition that makes focusing incredibly hard. We wanted to build a tool to help with this problem, and saw commercial headsets such as those from Muse as the perfect opportunity to develop a hack around. Using brain-computer interfaces and machine learning for signal processing, we built an end-to-end web application that classifies whether you are focused or unfocused in real time, alongside analytics that help you monitor your productivity across numerous webpages using a Chrome extension.
What it does
We built a web application on a Python (Flask) backend with a Firebase database to detect, at any given moment, whether the user is focused. The application takes electroencephalogram (EEG) data from the Muse Direct app and feeds it to a pre-trained neural network in real time, which predicts whether the user is focused or unfocused. From this we compute a score out of 100 representing the user's focus level at any given moment. We also track focus scores across popular websites, such as Google or Facebook, using a Google Chrome extension that monitors when the user changes tabs or windows.
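To give a feel for the per-site tracking, here is a minimal sketch of how the backend might aggregate focus scores by website. The class name, method names, and data layout are hypothetical, not our actual code; it just shows the running-average idea behind the analytics.

```python
# Hypothetical sketch: the Chrome extension reports the active hostname,
# and the backend attaches the latest focus score (0-100) to it,
# keeping a running average per site.
from collections import defaultdict


class SiteFocusTracker:
    """Keeps a running average focus score (0-100) per website."""

    def __init__(self):
        self._totals = defaultdict(float)  # hostname -> sum of scores
        self._counts = defaultdict(int)    # hostname -> number of samples

    def record(self, hostname: str, focus_score: float) -> None:
        """Store one focus-score sample for the currently active site."""
        self._totals[hostname] += focus_score
        self._counts[hostname] += 1

    def average(self, hostname: str) -> float:
        """Average focus score for a site (0.0 if never visited)."""
        if self._counts[hostname] == 0:
            return 0.0
        return self._totals[hostname] / self._counts[hostname]


tracker = SiteFocusTracker()
tracker.record("facebook.com", 40.0)
tracker.record("facebook.com", 60.0)
tracker.record("google.com", 90.0)
print(tracker.average("facebook.com"))  # 50.0
```

In our app the real aggregation lives alongside Firebase, but the averaging logic is this simple at heart.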
How I built it
We capture EEG data with a Muse headset and the Muse Direct app, send it to the application and Firebase database, then feed it through a pre-trained fully-connected neural network. The network achieves a test loss of only 0.7% on a balanced dataset of nearly 9,200 shuffled EEG samples we collected of focused and unfocused behaviour.
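For intuition, here is an illustrative sketch of the kind of fully-connected classifier described above. The layer sizes and weights below are made up (the real model was trained on our dataset); the sketch only shows the shape of the forward pass from an EEG feature vector to a focus probability.

```python
# Illustrative only: random weights stand in for the trained model.
# A small fully-connected net maps a feature vector to a score in (0, 1).
import numpy as np

rng = np.random.default_rng(0)


def relu(x):
    return np.maximum(0.0, x)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


# Hypothetical layer sizes: 16 input features -> 64 hidden -> 1 output.
W1, b1 = rng.standard_normal((16, 64)) * 0.1, np.zeros(64)
W2, b2 = rng.standard_normal((64, 1)) * 0.1, np.zeros(1)


def predict_focus(features: np.ndarray) -> float:
    """Forward pass: returns a probability in (0, 1), where 1 = focused."""
    h = relu(features @ W1 + b1)
    return float(sigmoid(h @ W2 + b2)[0])


p = predict_focus(rng.standard_normal(16))
```

The sigmoid on the output is what gives the 0-to-1 score we later scale into the focus score.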
We take the output of our neural network (a score between 0 and 1, where 1 = focused and 0 = unfocused) and multiply it by 100 to derive our 'focus score', the main feature of the app.
Using Flask, WebSocket, and Firebase, we built a web application (with a JS frontend) that takes in raw EEG data, preprocesses it into structured, unlabelled data, and outputs real-time prediction scores (focused/unfocused) for the user's EEG brainwaves at any given time. All the user has to do is connect the Muse over Bluetooth to the app, and they're ready to go!
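The preprocessing step above can be sketched roughly as follows. The window size here is an illustrative assumption, not the value we actually used, and the z-score normalization stands in for our real preprocessing, which is a common choice for EEG pipelines.

```python
# Sketch of turning a raw EEG stream into fixed-length model inputs.
# Muse exposes 4 EEG electrodes (TP9, AF7, AF8, TP10); the window
# length of 64 samples is hypothetical.
import numpy as np

N_CHANNELS = 4
WINDOW = 64


def windows(raw: np.ndarray):
    """Split a (n_samples, N_CHANNELS) stream into normalized windows."""
    n_full = raw.shape[0] // WINDOW
    for i in range(n_full):
        w = raw[i * WINDOW:(i + 1) * WINDOW]
        # Per-window z-score normalization per channel.
        yield (w - w.mean(axis=0)) / (w.std(axis=0) + 1e-8)


stream = np.random.default_rng(1).standard_normal((200, N_CHANNELS))
chunks = list(windows(stream))
print(len(chunks), chunks[0].shape)  # 3 (64, 4)
```

Each normalized window would then be flattened or featurized before going into the network.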
Challenges I ran into
Figuring out how to deal with the lack of support for developing BCI applications with Muse on Mac: we ended up creating a new approach that used the iOS Muse Direct app as a middleman for collecting the EEG data, with Socket.io streaming it to our computer.
Dealing with issues surrounding Firebase: having to learn Firebase in a day was definitely overwhelming, but with the help of the Firebase sponsor booth we were able to pull it off!
Accomplishments that I'm proud of
What I learned
We learned a lot about deploying machine learning models in web applications, using Flask as a Python backend, and how to use Firebase. We also learned a ton about building a frontend with JS, as it was our first time using it!
What's next for Focus-Net: Using EEG Data and ML to Stay on Task
We want to focus on expanding the analytics page to include detailed focus scores for popular webpages such as YouTube, Facebook, and Twitter. We also want to track focus averages day by day so our users can measure improvement over time! In addition, we would like to continue researching and implementing more efficient networks for the signal processing and focus classification problem.