It’s no secret that the modern education system is deeply flawed. More often than not, we assess students solely on their ability to memorize and regurgitate information they will never use again. If we want the next generation to solve the problems facing our world, students need to learn how to think: they need an education that promotes interactivity, creativity, and individual strengths rather than the ability to fit into a mold. For that to happen, educators need to know how each student works best and what keeps their mind active, in a way that lets both student and teacher change course quickly if necessary. So I began to think: what if an educator could tap directly into a student’s brain, see what keeps them engaged, and continually adjust their teaching style so that the student’s mind stays active and thinking?
What it does
When paired with a Muse headband, the app I built gives an educator a live visualization of how effective their teaching is and whether it’s likely to stick with a student. The app shows the teacher a student’s alpha waves, which are linked to focus and increase in amplitude when a student is passive or relaxed. If the student becomes too disengaged from the material being taught, the teacher receives an SMS notification as well as a pop-up banner indicating that their teaching is ineffective and they need to adjust course.
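The alert trigger can be sketched as a simple threshold check on the alpha readings. Everything below is my own illustration, not the app’s exact code: the function names are hypothetical, and the rolling-window debounce (requiring several consecutive high readings so one noisy spike doesn’t fire an SMS) is an assumption about how such a check would sensibly work.

```python
from collections import deque

def make_disengagement_checker(threshold, window=5):
    """Return a callable that flags disengagement only when the alpha
    amplitude has stayed above `threshold` for `window` consecutive
    readings. (Hypothetical sketch, not the app's exact logic.)"""
    recent = deque(maxlen=window)

    def check(alpha_amplitude):
        recent.append(alpha_amplitude)
        # Trip only once the window is full and every reading is high.
        return len(recent) == window and all(a > threshold for a in recent)

    return check

# With window=3, one high reading alone does not trip the alert;
# three in a row does, and a single low reading resets the streak.
check = make_disengagement_checker(threshold=1.0, window=3)
results = [check(a) for a in (1.2, 1.3, 1.4, 0.5)]
```

In the real app the `True` branch is where the Twilio SMS call and the pop-up banner would be triggered.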
How I built it
The Muse headband works by establishing a Bluetooth connection with a client, then passing it EEG data buffers after measuring the user’s brain activity with several electrodes placed on the head. Usually this is done through the official Muse meditation app, but in this case a background thread streams data from the Muse while the main thread of the program parses the streamed data into a NumPy array. Then, using the frequency bands of the signals we want to pick up (alpha, beta, delta, and theta waves each tend to occupy their own frequency range), the amplitude of the user’s alpha waves is extracted. That data is passed to a Python Flask app on Heroku by including it in the URL of an HTTP request, after which the Python script pushes the current value to a Firebase database. The same app has a route that takes the user to a web page which, again via HTTP requests to the app, constantly pulls the EEG data from Firebase and displays it in a Chart.js graph. Should that data exceed a certain threshold, the app calls a Twilio REST API function which creates and sends an SMS message to the teacher’s phone number to notify them of their student’s disengagement.
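The band-extraction step can be sketched with NumPy alone. The function name, the 8–12 Hz alpha range, and the use of a plain FFT over one buffer are my assumptions; a real pipeline would typically window and average several buffers rather than transform one raw chunk.

```python
import numpy as np

ALPHA_BAND = (8.0, 12.0)  # Hz; the conventional alpha range

def alpha_amplitude(samples, fs):
    """Estimate the alpha-band amplitude of a 1-D EEG buffer.

    samples: 1-D array of EEG samples from one channel
    fs: sampling rate in Hz (the Muse streams EEG at 256 Hz)
    Hypothetical sketch, not the app's exact signal chain.
    """
    samples = np.asarray(samples, dtype=float)
    samples = samples - samples.mean()            # remove DC offset
    spectrum = np.abs(np.fft.rfft(samples))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    in_band = (freqs >= ALPHA_BAND[0]) & (freqs <= ALPHA_BAND[1])
    return spectrum[in_band].mean()

# A pure 10 Hz sine (inside the alpha band) should score far higher
# than a 40 Hz sine (outside it) on a one-second buffer.
fs = 256
t = np.arange(fs) / fs
in_band_amp = alpha_amplitude(np.sin(2 * np.pi * 10 * t), fs)
out_band_amp = alpha_amplitude(np.sin(2 * np.pi * 40 * t), fs)
```

The returned value is the kind of single number that can then be stuffed into the URL of the HTTP request to the Flask app.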
Challenges I ran into
When I first approached this project, I wanted to do what I had done before and use my modified Star Wars Force Trainer 2 to read my alpha wave amplitude. However, once I began, it suddenly stopped transmitting data and wouldn’t even turn on the indicator light to show the EEG electrodes were working. So, in order to get the project done, I opted to borrow a Muse headband from a friend instead. Since this was the first time I had ever used a Muse headband, I thought getting brain data would be as simple as using the official API; as it turns out, Muse discontinued support for their developer API a while ago and no longer provides developers a direct way of getting data off their headsets.
Accomplishments that I'm proud of
To get around the aforementioned problem, I tried dozens of different libraries and methods: PyMuse for Python, streaming from Mind Monitor on my phone over OSC, even installing a Windows VM on my laptop so I could use BlueMuse. Eventually, after spending my first 24 hours trying to figure out how to make use of the headset, I borrowed a Linux laptop and was able to run muselsl without a Bluetooth dongle and use its CLI to stream Muse data, which I then worked into a Python script. From there, I was able to build an entire web app around it, so that the stream can be viewed from anywhere in the world just as effectively as if I were monitoring it on my laptop in front of me.
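The two-thread structure this ends up feeding into (one thread streaming, the main thread consuming) can be sketched with the standard library. A simulated source stands in for the real muselsl/LSL stream here, since the actual stream needs the headset; every name below is my own.

```python
import queue
import threading

def stream_worker(out_q, n_buffers=10):
    """Stands in for the muselsl streaming thread: in the real app this
    loop would pull EEG buffers off the headset's LSL stream; here it
    just emits fake 12-sample buffers. (Simulated source.)"""
    for i in range(n_buffers):
        out_q.put([float(i)] * 12)   # one fake EEG buffer
    out_q.put(None)                  # sentinel: stream finished

buffers = queue.Queue()
worker = threading.Thread(target=stream_worker, args=(buffers,), daemon=True)
worker.start()

# Main thread: consume and parse buffers as they arrive.
collected = []
while True:
    buf = buffers.get()
    if buf is None:
        break
    collected.append(buf)
worker.join()
```

In the real script, the consumer side is where each buffer gets parsed into a NumPy array and the alpha amplitude is computed.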
What's next for EduVH
My big next step is moving the Muse streaming element to the web app side of the project. Right now, for the setup to work, the user needs to run the Python scripts locally, which limits the potential user base: not everyone knows how to install muselsl or has a compatible OS. In the future, I’d like to integrate muselsl into my Heroku app so that anyone, regardless of technical knowledge, could simply pair their Muse with their laptop and have the web app do the rest.