Inspiration
As we entered Hard Hack, we started wandering around the different workshops that were present. While talking with them, we noticed that one workshop in particular had EEG and VR headsets, and since both of us are quite interested in this topic, we decided to build our solution for the competition around the headsets. After a few minutes of discussion, we came up with the idea of an app that maps the brain waves of different actions to particular tasks, alongside other features such as different sorts of therapy and control over the physical characteristics of the room the patient is staying in.
What it does
The app EasyCare maps a patient's brain-wave patterns to actions, so they can request tasks, such as calling for help or playing music, hands-free.
How I built it
We started the process by playing around with the headset and testing which physical actions trigger a noticeable brain wave. After some trial and error, we found that the best features to focus on are the eyes and the jaw (the eye because it acts as an electric dipole, and the jaw because clenching produces an easily distinguishable EEG artifact). In this way, we created models of three activities: jaw clenching, repeated blinking, and keeping the eyes completely closed for a duration of time. Through these three actions, we set up an interface that eventually chooses the task that needs to take place for the patient. It could be asking for help, it could be asking for their favorite music playlist (music therapy), and so on. The steps to accomplish this were using the EEG headset (the DSI-7), extracting the brain waves through QStates (a program also designed by Wearable Sensing), and processing them to drive a menu-based application that helps differently abled people, and anyone else, access a screen hands-free.
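The event-to-menu logic driven by these three actions can be sketched roughly as follows. This is an illustrative assumption, not our exact code: the event labels and menu items are made up for the example, and QStates' real classifier output looks different.

```python
class EEGMenu:
    """Minimal hands-free menu driven by classified EEG events.

    The event labels ("blink", "eyes_closed", "jaw_clench") stand in for
    whatever the upstream classifier actually emits; the menu items mirror
    the tasks described above.
    """

    def __init__(self):
        self.items = ["call_for_help", "play_music_playlist", "adjust_room"]
        self.index = 0        # currently highlighted menu item
        self.selected = None  # confirmed action, once chosen

    def handle(self, event: str) -> None:
        if event == "blink":            # repeated blinking scrolls the menu
            self.index = (self.index + 1) % len(self.items)
        elif event == "eyes_closed":    # eyes closed for a while confirms
            self.selected = self.items[self.index]
        elif event == "jaw_clench":     # jaw clench is an immediate help alert
            self.selected = "call_for_help"


menu = EEGMenu()
menu.handle("blink")        # move to the second item
menu.handle("eyes_closed")  # confirm it
print(menu.selected)        # play_music_playlist
```

In the real app, the `handle` calls would be fed by the stream of classifications coming out of QStates rather than hard-coded strings.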
Challenges I ran into
The main challenge we had to work through was actually making use of the EEG waves we were receiving. Due to a lack of time and equipment, it was hard to differentiate between brain waves corresponding to the different states or actions we performed. Organizing the brain waves in a somewhat sensible way was the biggest challenge we faced throughout the whole competition. There was also a minor yet impactful problem involving the Mac port and the serial connection to the Arduino. We attempted to fix this problem for a while, but it seemed unsolvable (possibly a hardware issue). This problem affected the additional features we were planning to add to the app (reading the temperature of the room, a timer for feeding the patient, entertainment for the patient, and so on). The EEG streaming software and QStates also took their sweet time to work properly: there were version-incompatibility errors combined with erratic, non-reproducible behavior from the software.
Accomplishments that I'm proud of
The main thing we are proud of is actually working with the headset and managing to create something that involves machine learning as well. For us, it was a fantastic opportunity to implement some of our ideas and see which errors we could fix and which things we need to work on more.
What I learned
We learned lots of cool facts about EEG waves, such as the different types of waves (alpha, beta, gamma, ...) and their meanings. I personally also learned more about Python and its capabilities for processing signals and waves in general.
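As a small illustration of the kind of signal processing involved, here is a minimal NumPy sketch (not our actual pipeline; QStates handled the heavy lifting for us) that estimates a signal's average power in a frequency band, the quantity underlying the alpha/beta/gamma distinction:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average power of a 1-D signal within [low, high) Hz, via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs < high)
    return power[mask].mean()

fs = 256                          # Hz, a common EEG sampling rate
t = np.arange(fs) / fs            # one second of samples
sig = np.sin(2 * np.pi * 10 * t)  # a pure 10 Hz tone sits in the alpha band

alpha = band_power(sig, fs, 8, 13)   # alpha band: 8-13 Hz
beta = band_power(sig, fs, 13, 30)   # beta band: 13-30 Hz
# For this synthetic signal, nearly all the power lands in the alpha band.
```

Real EEG is far noisier than this synthetic tone, which is part of why distinguishing states reliably was so hard in practice.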
What's next for EasyCare
There are a lot of other things that could be added to EasyCare that we were unfortunately unable to implement due to the lack of time. However, our initial goal for this app was to create a next-level app that enables all patients, regardless of the illness they are experiencing, to communicate more easily with nurses or whoever is taking care of them. The next step would be organizing the display of the app and categorizing different information, such as the patient's physical condition, preferences, and hobbies. As for what is next with our EEG work, we would have to collect more data in order to train our ML models (through QStates) more efficiently to recognize signals that relate to other activities. As we mentioned, this is just one small demo that we could think of in the span of one day, but given more time and more data, we would be able to extend our interface and create a more intelligent system.