Inspiration

Facebook announced that it's working on brain-computer interfaces for typing. I wanted to try the same thing at TechCrunch Disrupt!

https://techcrunch.com/2017/04/19/facebook-brain-interface/

Given the limited capabilities of the openEEG-smt device, I decided to limit myself to eye-blink detection rather than something purely brain driven.

What it does

An openEEG-smt headset detects blinks (using a simple thresholding algorithm I wrote). The blinks are used to determine keypresses. The keyboard can be used to send messages to a two-way chat powered by PubNub or to control a telepresence robot.

I wrote a Python Flask server which serves a local web user interface and connects to the openEEG USB device. It reads the openEEG serial protocol and looks for large voltage fluctuations indicative of a blink.
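
A minimal sketch of that reading loop, assuming the 17-byte ModularEEG "P2" packet format (sync bytes 0xA5 0x5A, version, counter, six 2-byte channels, switches byte) at 57600 baud and a placeholder serial port path; this is an illustration, not the exact code from the app:

```python
import serial  # pyserial

# Assumed P2 packet layout: 0xA5 0x5A, version, counter,
# six channels as 2 bytes each (high, low), then a switches byte = 17 bytes.
PACKET_LEN = 17

def read_samples(port="/dev/tty.usbserial", baud=57600):
    """Yield the first EEG channel value from each OpenEEG packet."""
    with serial.Serial(port, baud, timeout=1) as ser:
        buf = b""
        while True:
            buf += ser.read(PACKET_LEN)
            # Resynchronise on the 0xA5 0x5A header.
            start = buf.find(b"\xa5\x5a")
            if start < 0 or len(buf) - start < PACKET_LEN:
                continue
            pkt = buf[start:start + PACKET_LEN]
            buf = buf[start + PACKET_LEN:]
            channel_0 = (pkt[4] << 8) | pkt[5]  # 10-bit ADC value for channel 1
            yield channel_0
```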

In order to make the keyboard more efficient, a character prediction RNN (from this open source project: https://github.com/sherjilozair/char-rnn-tensorflow, trained on the HackerNews comment dump https://archive.org/details/HackerNewsStoriesAndCommentsDump) is used to provide predictive text input.
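
Roughly, the integration works like this: given the text typed so far, the RNN gives a probability for each possible next character, and the keyboard offers the most likely ones first. A tiny sketch of that last step (top_k_suggestions and the predict_next_char_probs call are illustrative stand-ins, not char-rnn-tensorflow's actual API):

```python
import numpy as np

def top_k_suggestions(probs, vocab, k=5):
    """Return the k most probable next characters from a softmax output.

    probs: 1-D array of probabilities, one entry per vocabulary character.
    vocab: list mapping index -> character.
    """
    best = np.argsort(probs)[::-1][:k]
    return [vocab[i] for i in best]

# Hypothetical usage: probs = model.predict_next_char_probs("the quick bro")
probs = np.array([0.05, 0.7, 0.2, 0.05])
vocab = ["a", "w", "o", "n"]
print(top_k_suggestions(probs, vocab, k=2))  # ['w', 'o']
```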

How I built it

The Python app consists of a Flask + flask_socketio server connected to several worker processes (via the multiprocessing module): one for collecting EEG data, another for processing it, and another for running the TensorFlow character-prediction model.
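
In outline the wiring looks something like this sketch (function and queue names are placeholders, and it's simplified compared to the real app):

```python
from multiprocessing import Process, Queue
from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__, static_folder="static")
socketio = SocketIO(app)

def eeg_worker(out_q):
    """Placeholder for the process that reads the openEEG device."""
    while True:
        out_q.put({"blink": True})  # pretend we detected a blink

def forward_events(q):
    """Background task: push worker events to the browser over Socket.IO."""
    while True:
        while not q.empty():
            socketio.emit("blink", q.get())
        socketio.sleep(0.05)

if __name__ == "__main__":
    q = Queue()
    Process(target=eeg_worker, args=(q,), daemon=True).start()
    socketio.start_background_task(forward_events, q)
    socketio.run(app, port=5000)
```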

The signal processing is a simple hardcoded threshold, comparing each sample both against zero and against a moving average.
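
In spirit the detector looks like the snippet below (the threshold and window values are illustrative, not the ones I actually tuned):

```python
from collections import deque

class BlinkDetector:
    """Flag a blink when a sample jumps far from both zero and its recent mean."""

    def __init__(self, window=128, zero_thresh=300, avg_thresh=200):
        self.history = deque(maxlen=window)   # recent samples for the moving average
        self.zero_thresh = zero_thresh        # required deviation from zero
        self.avg_thresh = avg_thresh          # required deviation from the moving average

    def update(self, sample):
        moving_avg = sum(self.history) / len(self.history) if self.history else 0
        is_blink = (abs(sample) > self.zero_thresh and
                    abs(sample - moving_avg) > self.avg_thresh)
        self.history.append(sample)
        return is_blink
```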

The TensorFlow character-prediction model is from https://github.com/sherjilozair/char-rnn-tensorflow, trained as a 4-layer RNN with 512-unit NAS cells.

The user interface is plain HTML/JavaScript styled with Pure.css.

The WebRTC video frame is an embedded https://appr.tc room.

How PubNub is used

PubNub is used in two ways: to send the text from the user's computer to the robot's display, and to share the name of the appr.tc room used for the video stream.
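
Publishing with the PubNub Python SDK looks roughly like the sketch below (keys and channel names are placeholders, not the ones used at the hackathon):

```python
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub

config = PNConfiguration()
config.publish_key = "demo"      # placeholder keys
config.subscribe_key = "demo"
config.uuid = "openeeg-enable"
pubnub = PubNub(config)

# Channel 1: text typed with the blink keyboard, shown on the robot's display.
pubnub.publish().channel("robot-display").message({"text": "hello"}).sync()

# Channel 2: the appr.tc room name, so both ends join the same video call.
pubnub.publish().channel("apprtc-room").message({"room": "blinky-robot-42"}).sync()
```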

Challenges I ran into

I would have liked to do a more traditional BCI paradigm such as SSVEP, but I didn't have time. And WiFi caused problems, again :-(

Accomplishments that I'm proud of

Getting something useful done in 20 hours. Combining text input and robot remote control in one interface was nice.

What I learned

OS X USB serial drivers require a bit more care than their Linux counterparts.

What's next for openeeg-enable

TBD.

Built With

python, flask, tensorflow, pubnub, webrtc, openeeg, html/javascript, pure-css