Inspiration

Drowsiness while driving is one of the biggest public health hazards. According to one study, drowsy driving is responsible for some _72,000 automobile accidents_ and up to _6,000 fatal crashes_ each year. We came up with the idea of measuring people's brainwaves to prevent drowsiness-related accidents after reading research on how brainwaves can indicate different stages of sleep and different levels of alertness. For example, a dominant alpha wave often indicates drowsiness or lowered attention, while a dominant beta wave indicates high alertness and strong focus.

Therefore, we figured we could use the Muse headband, which gives us the wearer's real-time brainwave data, to detect drowsiness and alert drivers before they fall asleep behind the wheel and cause irreparable damage.

The "Alpha" in the name of the app "Alpha Drive" comes from the alpha wave that we measures and plots in the mobile app, which is one of the most important metric in our brainwave analysis.

What it does

Alpha Drive is a mobile app that connects to the Muse headband via Bluetooth and alerts the driver when their brainwaves indicate they are not alert enough. The EEG (electroencephalographic) data is sent directly to a machine learning model we trained using the deep learning framework from this paper: link and this GitHub repo: link. The model predicts with 90% accuracy whether the driver is about to fall asleep, once every 5 seconds, so the chance of missing a drowsy driver across consecutive windows is far lower (two independent misses in a row would happen only about 1% of the time). If the driver is getting too drowsy, the app plays a loud 10-second alarm to alert them.
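In pseudocode, that monitoring loop looks roughly like the minimal Python sketch below; `muse_stream`, `model`, and `alarm` are hypothetical stand-ins for our actual Muse client, classifier, and mobile alert, not code from the project:

```python
import time

DROWSY_THRESHOLD = 0.5   # hypothetical cutoff on the model's drowsiness score
ALARM_SECONDS = 10       # length of the alert played to the driver

def monitor(muse_stream, model, alarm):
    """Classify one 5-second EEG window at a time and alarm on a drowsy result."""
    while muse_stream.is_connected():
        window = muse_stream.read_eeg(seconds=5)   # raw EEG samples for one window
        p_drowsy = model.predict(window)           # probability the driver is dozing off
        if p_drowsy >= DROWSY_THRESHOLD:
            alarm.play(duration=ALARM_SECONDS, volume=1.0)  # loud 10-second alarm
        time.sleep(5)                              # next prediction in 5 seconds
```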

The app also plots the alpha wave in real time on its screen, and produces an alpha-wave graph report at the end of the driving session.
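For the curious, an alpha metric like the one we plot can be computed from raw EEG with a simple FFT. This is a minimal sketch (assuming the Muse's 256 Hz sampling rate and the common 8-12 Hz alpha band), not our exact implementation:

```python
import numpy as np

def alpha_band_power(eeg_window, fs=256.0, band=(8.0, 12.0)):
    """Fraction of an EEG window's spectral power that falls in the alpha band."""
    samples = np.asarray(eeg_window, dtype=float)
    samples -= samples.mean()                        # remove the DC offset
    power = np.abs(np.fft.rfft(samples)) ** 2        # power spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return power[in_band].sum() / power.sum()        # relative alpha power
```

Each new value gets appended to the real-time chart; a sustained rise in relative alpha power is the drowsiness signal described above.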

How we built it

We used Swift to build the mobile front end, where our charts are displayed using iOS Charts (https://github.com/danielgindi/Charts). For the back end and database (hosted on DigitalOcean), we used Django for the most part, especially for communicating with the front end, plus Flask to communicate with the machine learning model on Google Cloud. The TensorFlow model is a recurrent neural network that we pre-trained on Google Cloud using the dataset provided by the paper. It receives the raw EEG data from the back end, makes a prediction, and sends back the result. If the result indicates any danger sign (the driver is sleepy), the back end tells the mobile front end, which alerts the driver immediately.
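The Flask piece is essentially a thin prediction endpoint sitting in front of the model. A minimal sketch of that bridge (the route name, payload shape, and model path are illustrative, not our production code):

```python
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)
# Illustrative path; in the real project the weights were parsed out of the
# paper's pre-trained checkpoint files.
model = tf.keras.models.load_model("drowsiness_rnn.h5")

@app.route("/predict", methods=["POST"])
def predict():
    # The Django back end forwards one 5-second window of raw EEG as JSON.
    eeg = np.asarray(request.get_json()["eeg"], dtype=np.float32)
    batch = eeg.reshape(1, -1, 1)                   # (batch, timesteps, channels)
    p_drowsy = float(model.predict(batch).ravel()[0])
    return jsonify({"drowsy": p_drowsy >= 0.5, "p_drowsy": p_drowsy})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```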


Challenges we ran into

  1. We had to deal with a lot of problems while training the machine learning model and getting predictions to work. The main reason is that the TensorFlow version the paper used is outdated and has serious dependency conflicts with more recent versions. We ended up having to figure out how and where the trained model is stored and parse the file out ourselves in order to make successful predictions (see the sketch after this list). The process was very challenging and new to us, since documentation related to this particular project is very limited, not to mention the enormous size of the data and the demanding speed requirements (which is why we chose Google Cloud to handle the training).

  2. The Muse headband documentation for iOS is written in Objective-C, whereas we needed it to be compatible with Swift 4, which is what iOS Charts uses. We had to deal with some bizarre bugs and migration issues while coding the mobile front end.

  3. We had to fill in our knowledge gap around brainwaves. Luckily, a mentor at LA Hacks pointed us in the right direction early on, so we were able to find the relevant academic resources fairly quickly. Understanding those resources and applying them in our app was another challenge we had to deal with.
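The checkpoint parsing mentioned in challenge 1 boiled down to inspecting the saved variables directly. A minimal sketch using TensorFlow's checkpoint reader (the path and variable name are illustrative; the actual names depend on the checkpoint):

```python
import tensorflow as tf

# Open the paper's pre-trained checkpoint without rebuilding the full graph.
reader = tf.train.load_checkpoint("pretrained/model.ckpt")

# List every stored variable to discover the naming scheme the old code used.
for name, shape in reader.get_variable_to_shape_map().items():
    print(name, shape)

# Pull out one weight matrix by name once the naming scheme is known.
kernel = reader.get_tensor("rnn/basic_lstm_cell/kernel")
```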

Accomplishments that we're proud of

  1. We successfully got the machine learning model working despite the enormous challenges we had to overcome.

  2. We had great back-end support with Django and Flask, which provided a solid foundation and held the project together.

  3. We migrated the Objective-C/Swift files to Swift 4 and were able to run the Muse API and iOS Charts together seamlessly.

What we learned

  • Knowledge about brainwaves, sleep stages, EEG, and alertness.
  • How to parse a trained TensorFlow deep learning model.
  • How to use iOS Charts and migrate between Objective-C and Swift.
  • For the newbies (which is most of us), a lot more about mobile development, reading and hunting down scarce documentation, machine learning, HTTP requests, etc.
  • How to use smart wearables to bring changes to people's lives.

What's next for Alpha Drive

Our vision for Alpha Drive is to apply brainwave-based attention detection to more fields such as education, medicine, and construction. Possible scenarios for our technology:

  1. Students: they can use the headband to pinpoint exactly when during a lecture their attention fell short, so that they can review or study more efficiently.

  2. Surgeons: the headband could alert them when they begin to lose focus during an operation, so they can take proper shifts and avoid medical accidents.

  3. Construction workers/engineers: people operating heavy machinery can track their attention levels to avoid over-exerting themselves and posing risks to themselves and others.

Built With

Swift · iOS Charts · Django · Flask · TensorFlow · Google Cloud · DigitalOcean · Muse
