Inspiration

The ability to control the movement of everyday objects through simple hand gestures or mental processes is an intriguing concept made popular in comic books and science fiction. It is a super power many have fantasized about as a child, or still wish they had at any age. Through electroencephalography (EEG), controlling physical components with the body and the brain is an emerging technology with many applications, especially in biomedical engineering and computing. The BioHack Bot is a simple proof of concept that illustrates EEG's ability to detect differences in brain activity. Our project applies the powerful concept of EEG to a simple and fun robot car that anyone can use and control with their mind.

What it does

In short, electrical data reflecting a user's brain activity and head movement is gathered from the Muse Headband. This data is wirelessly transmitted to a computer, which analyzes it to derive controls for the robot car. The driving instructions are then communicated over radio frequency to the robot car, which moves accordingly in real time. A live video feed was also set up so that the user can track the robot's location and line of sight without having to be physically next to the robot car.

Basic Robot Controls

Hard Blink: switches the robot between forward and reverse mode
Left Head Tilt: turn left
Right Head Tilt: turn right

How we built it

Initial steps included the disassembly of a simple remote-control car. The original motors were rewired to be controlled by an Anarduino, an Arduino-compatible board with RF capabilities. This later enabled us to relay control messages from our base station to the robot via radio signals.
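The robot-side logic (which in our project ran on the Anarduino) can be pictured as a small state machine that toggles between forward and reverse on a hard blink and turns on head tilts. The sketch below illustrates that behavior in Python for readability; the command names and motor-action labels are hypothetical, not our actual firmware.

```python
# Illustrative Python sketch of the robot-side control logic.
# In the real project this ran as Arduino code on the Anarduino;
# command names and action labels here are hypothetical.

class RobotState:
    def __init__(self):
        self.forward = True  # robot starts in forward mode

    def handle(self, command):
        """Apply one received command and return the resulting motor action."""
        if command == "TOGGLE_DIRECTION":
            # A hard blink flips the robot between forward and reverse mode.
            self.forward = not self.forward
            return "DRIVE_FORWARD" if self.forward else "DRIVE_REVERSE"
        if command == "TURN_LEFT":
            return "TURN_LEFT"
        if command == "TURN_RIGHT":
            return "TURN_RIGHT"
        return "IDLE"
```

In use, each radio message received by the robot would be fed to `handle`, and the returned action mapped onto the car's motor driver.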

The next step of the project was harnessing brain activity monitored by the Muse headband to move the robot. The Muse headband has four main sensors used for data collection: two on the forehead and two near the ears. After plotting the EEG data, we found that a "hard blink" (or "forehead scrunch") produces a distinct spike in the raw EEG signal averaged across the four sensors, making it a usable control variable. We assigned this hard-blink motion to toggle the robot between forward and reverse. Accelerometer data provided additional control variables: a left or right head tilt turns the car left or right, respectively.
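The detection logic described above boils down to simple thresholding. The sketch below shows the idea in Python (our actual implementation was a Processing script); the threshold values, units, and command names are illustrative assumptions, not the tuned values we used.

```python
# Illustrative sketch of the Muse-data-to-command logic (written in Python;
# the project itself used Processing). Thresholds and units are hypothetical.

BLINK_THRESHOLD = 900.0   # spike level in the averaged raw EEG (assumed units)
TILT_THRESHOLD = 0.35     # roll magnitude that counts as a deliberate head tilt

def detect_command(eeg_samples, roll):
    """Map one frame of Muse data to a robot command.

    eeg_samples: raw values from the four sensors (two forehead, two ear)
    roll: left/right head tilt from the accelerometer (negative = left)
    """
    avg_eeg = sum(eeg_samples) / len(eeg_samples)
    if avg_eeg > BLINK_THRESHOLD:
        return "TOGGLE_DIRECTION"   # hard blink / forehead scrunch detected
    if roll < -TILT_THRESHOLD:
        return "TURN_LEFT"
    if roll > TILT_THRESHOLD:
        return "TURN_RIGHT"
    return "NONE"
```

In practice the thresholds have to be tuned per user, since blink artifacts and resting EEG levels vary from person to person.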

The data from the Muse Headband was streamed over Open Sound Control (OSC) into Processing. The Processing script determines the movement of the robot based on the raw EEG and accelerometer data. Serial communication between Processing and the Arduino was set up so that this control information could be transferred to, and ultimately drive, the robot car.
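The serial link between the computer and the Arduino can be kept very simple, for example by sending one byte per command. The sketch below shows one possible encoding in Python; the specific byte values and port settings are assumptions for illustration, not the protocol we actually used.

```python
# Hypothetical single-byte command protocol for the computer -> Arduino
# serial link. The byte values and port settings are assumptions.

COMMAND_BYTES = {
    "TOGGLE_DIRECTION": b"T",
    "TURN_LEFT": b"L",
    "TURN_RIGHT": b"R",
    "NONE": b"N",
}

def encode_command(command):
    """Return the single byte to write to the serial port for a command."""
    return COMMAND_BYTES[command]

# With a serial library such as pyserial, the base station could then do:
#   import serial
#   port = serial.Serial("/dev/ttyUSB0", 9600)
#   port.write(encode_command("TURN_LEFT"))
```

A one-byte protocol keeps the Arduino-side parsing trivial: each `Serial.read()` yields a complete command with no framing needed.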

Challenges we ran into

This multifaceted project required many different components to communicate and work together, which created several challenges. We spent a great deal of time determining the best design and setup for the communication chain from the Muse Headband to Processing, from Processing to the Arduino IDE, and ultimately to the robot car itself. Once we could successfully transmit data between all the components, we also faced several control issues: it was sometimes difficult to get the robot to move the way we wanted. We worked through several software issues to the point where we believe most remaining problems were related to the hardware setup.

Throughout our work there were moments when the BioHack Bot appeared to be doing exactly what we wanted, and we felt confident the project was heading in the right direction. These moments brought their own challenges: after each breakthrough, continued work on the robot would occasionally cause it to misbehave again, leaving us with more problems to debug before we could make further progress. The best example is that at one point we were actually able to control the robot using the Muse Headband, and it worked! That is, until we broke it again!

Accomplishments that we're proud of

Throughout the project we faced countless roadblocks and unexpected issues that we were able to work our way through. We are proud of the progress we made given the time constraints, and we thoroughly enjoyed building the BioHack Bot. We had low moments when we had no idea whether we could steer the project in the direction we wanted, but for each of those we also had moments of achievement: for instance, when the data finally flowed correctly through all the components of our project, and the one instance when one of us could actually control the robot with their mind. The BioHack Bot is in no way perfect, but through our trials and errors we can confidently say this concept is achievable with more time and dedication.

What we learned

Through this experience we learned the importance of perseverance and teamwork when working on a project with strict time restrictions, especially when everything that can go wrong does go wrong. On the technical side, we learned a great deal about communication between different electrical components, and about the importance of robust design for minimizing hardware and software problems.

What's next for BioHackBot

During our time working together we got far enough along to prove that the concept of the BioHack Bot is possible. We could definitely spend more time refining what we have already created so that it works more reliably and has a better design.

Given more time, it would be interesting to see where this project could go and what other technologies we could incorporate. One of our original goals was to control the robot using both electroencephalography (EEG) and electromyography (EMG). EMG is a very similar concept to EEG but measures muscle activity rather than brain activity; combining the two in one final product would demonstrate full body-and-mind control of the robot. Due to time constraints, we simplified the project so that only the EEG measurements from the Muse Headband were used.
