Brain-computer interfaces (BCIs) offer the hope of communication and technology control without physical movement: voluntary brain activity is interpreted as commands to control technology. For this reason, BCIs may soon help people with degenerative movement disorders such as Lou Gehrig's disease, which destroys motor neurons and over time can leave a person unable to move or communicate. For these people, the simple act of telling a spouse "I love you" or saying "thank you" to a caregiver has indescribable value, and seeing them experience a human connection again is truly moving. One of the hurdles for practical, everyday BCI use is the large, cumbersome display, which interferes with sight lines, limits portability, and must be adjusted every time the user is repositioned.
Google Glass can form the basis for a new generation of BCIs offering a high degree of portability and customization. Our plan is to integrate Google Glass as a BCI display, connecting it to BCI2000, an open-source platform for BCI development. In addition, we will use Glass's connectivity to Android-based smartphones to create context-specific BCI control options: for example, using the GPS libraries of the Android smartphone, we can provide location-dependent commands. We are also integrating an Arduino so that Glass can control external devices such as a wheelchair or home automation systems. Lastly, we plan to post a message to Twitter or Pinterest using only brain activity!
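The location-dependent command idea above can be sketched roughly as follows. This is a minimal illustration, not our actual implementation: the place names, coordinates, command menus, and the `commands_for` helper are all hypothetical, and in practice the GPS fix would come from the Android location APIs rather than hard-coded values.

```python
import math

# Hypothetical saved places mapped to context-specific BCI command menus.
# Coordinates and commands are illustrative placeholders only.
KNOWN_PLACES = {
    "kitchen": {"lat": 43.0731, "lon": -89.4012,
                "commands": ["lights on", "lights off", "call caregiver"]},
    "bedroom": {"lat": 43.0735, "lon": -89.4008,
               "commands": ["raise bed", "lower bed", "TV on"]},
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def commands_for(lat, lon, radius_m=25.0):
    """Return the command menu for the nearest known place within radius_m,
    falling back to a generic menu when no saved place is close enough."""
    best, best_d = None, float("inf")
    for name, place in KNOWN_PLACES.items():
        d = haversine_m(lat, lon, place["lat"], place["lon"])
        if d < best_d:
            best, best_d = name, d
    if best is not None and best_d <= radius_m:
        return KNOWN_PLACES[best]["commands"]
    return ["help", "yes", "no"]
```

A phone-side service along these lines could query the current fix, look up the matching menu, and push it to the Glass display, so the user only ever scans through commands that make sense where they are.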

We are currently in the process of obtaining approval from the institutional review board. Thus, soon after the hack we can begin testing with the populations who would benefit most from this revolutionary hack!
