Inspiration

According to the Christopher & Dana Reeve Foundation, an estimated 5.4 million individuals in the United States alone live with some degree of paralysis (roughly 1 in every 50 people), and an estimated 1 in every 4 Americans has some type of disability. Many aspects of life are taken for granted, yet there is a large population of individuals who cannot perform everyday tasks, one of which is playing video games. A video or arcade game almost always requires the use of one's arms. To accommodate individuals who are unable to do this, we built a solution that lets them overcome that barrier and enjoy what many of us take for granted. While this solution allows paralyzed or otherwise disabled individuals to play games, it is by no means restricted to them and is meant to be enjoyable for everyone. Our aim is to give the phrase 'use your brain' a whole new meaning.

What it does

Brain-Pong is a game that lets you play the classic Atari game of Pong using your mind and nothing else. Brain activity is recorded with electroencephalography (EEG), and a focus-feature-extraction algorithm classifies the EEG signal into a state of focus or relaxation. When the user is focused, the Pong paddle moves upwards; when the user is relaxed, the paddle moves downwards. The game offers a fun experience for individuals affected by conditions such as paralysis, as well as anyone who simply wishes to control a Pong paddle with their brain.
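The mapping from mental state to paddle motion is, at its core, very simple. The sketch below is illustrative only (the real game logic lives in the Unity2D project, not in Python); the speed constant and helper name are assumptions made for the example.

```python
PADDLE_SPEED = 5  # arbitrary units per game update; value chosen for illustration

def paddle_step(is_focused: bool) -> int:
    """Return the vertical displacement for one game update."""
    return PADDLE_SPEED if is_focused else -PADDLE_SPEED

# A focused window moves the paddle up, a relaxed window moves it down.
print(paddle_step(True))   #  5 -> move up
print(paddle_step(False))  # -5 -> move down
```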

How we built it

The creation of Brain-Pong involved both software and hardware components. The game itself was made with Unity 2D, which allowed convenient integration into the website we built with HTML and CSS. EEG signals are acquired through a data-acquisition system; in a full deployment this would be an OpenBCI board, with the recording exported as a BDF file and converted to a text file, but for this project we designed the front end around an Arduino shielded by an ADS1299 analog front-end for biopotential measurement, following the example PCB layout in its datasheet. Using Python's PySerial library, we wrote a script that reads the samples from the Arduino (which in turn receives them from the analog front-end over SPI). We then programmed an algorithm that parses each 2 s window of EEG input into a binary input, 1 or 0, representing an upward or downward movement. The algorithm performs a Fast Fourier Transform on the 2 s window and on the initial 10 s of data, takes the Fourier coefficients corresponding to alpha-band frequencies (8-12 Hz), and averages them to obtain a focus feature that determines the state of the input. These binary inputs are fed into the game and are the basis of control. The Unity WebGL build was then uploaded to the website.
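The following is a minimal sketch of that acquisition and feature-extraction pipeline. Several details are assumptions rather than part of the project as described: the serial port name and baud rate, a one-ASCII-sample-per-line serial format, a 250 Hz sampling rate, and the use of a simple comparison against the 10 s baseline (treating alpha suppression as "focused") to turn the averaged alpha-band magnitude into a 1/0 command.

```python
import numpy as np
import serial                       # PySerial
from scipy.fft import rfft, rfftfreq

FS = 250                  # assumed ADS1299 sampling rate, samples per second
WINDOW_S = 2              # 2 s sliding window, as described above
BASELINE_S = 10           # initial 10 s of data used as a baseline
ALPHA_BAND = (8.0, 12.0)  # alpha-band frequencies in Hz

def read_samples(port: serial.Serial, n: int) -> np.ndarray:
    """Read n single-channel EEG samples, assuming one ASCII value per line."""
    return np.array([float(port.readline().strip()) for _ in range(n)])

def alpha_feature(samples: np.ndarray) -> float:
    """Average magnitude of the Fourier coefficients that fall in the alpha band."""
    spectrum = np.abs(rfft(samples))
    freqs = rfftfreq(len(samples), d=1.0 / FS)
    mask = (freqs >= ALPHA_BAND[0]) & (freqs <= ALPHA_BAND[1])
    return float(spectrum[mask].mean())

def main() -> None:
    port = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # assumed port and baud

    # Baseline focus feature from the first 10 s of data.
    baseline = alpha_feature(read_samples(port, BASELINE_S * FS))

    while True:
        window = read_samples(port, WINDOW_S * FS)
        # Assumption: alpha magnitude below the baseline is treated as "focused".
        command = 1 if alpha_feature(window) < baseline else 0
        print(command)  # 1 -> paddle up, 0 -> paddle down (fed to the Unity game)

if __name__ == "__main__":
    main()
```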

Challenges we faced

It was challenging to emulate real-time input so that we could test the viability of the algorithm and integrate it with Unity. We ended up using the MNE sample dataset in Python to emulate the real-time signals that would be fed into the game (see the sketch below). Furthermore, given our inexperience in web development, integrating the Unity WebGL build was hard. We were also working completely remotely from three different countries, so coordination was definitely an issue. Finding an algorithm that was viable for real-time input was a challenge as well.
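As a rough illustration of that emulation step, the sketch below replays one EEG channel of MNE's sample dataset in consecutive 2 s windows. The file path, channel choice, and pacing are assumptions for the example; the sample recording is MNE's tutorial data, not data from our own hardware.

```python
import os
import time

import mne

def stream_windows(window_s: float = 2.0):
    """Yield consecutive 2 s single-channel windows, paced roughly in real time."""
    data_path = mne.datasets.sample.data_path()  # downloads the dataset on first use
    raw_file = os.path.join(data_path, "MEG", "sample", "sample_audvis_raw.fif")
    raw = mne.io.read_raw_fif(raw_file, preload=True, verbose=False)
    raw.pick_types(meg=False, eeg=True)          # keep only EEG channels

    sfreq = raw.info["sfreq"]
    signal = raw.get_data(picks=[0])[0]          # first EEG channel as a 1-D array
    step = int(window_s * sfreq)

    for start in range(0, len(signal) - step, step):
        time.sleep(window_s)                     # crude pacing to mimic live acquisition
        yield signal[start:start + step]

# Each yielded window can then be passed to the same alpha-band feature code
# that would handle live serial data.
```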

Accomplishments we're proud of

We are extremely proud that we were able to create an algorithm that parses brain activity into usable video-game input. We are also pleased with our design and integration of the website, which hosts the game along with additional information. Finally, we are delighted that our solution is accessible to a much larger population than the conventional video game.

What we learned

We learned many things: how ICs interact with each other (SPI interfacing), how an Arduino can be interfaced with Python (PySerial), and how to build signal-processing and feature-extraction algorithms using libraries such as SciPy. We also learned how to coordinate all of this with Unity (Python scripting for Unity), as well as a lot about building websites and how to take the project forward.

What's next for BrainPong

Future plans include more efficient and compact hardware (moving to a wearable EEG headset such as the OpenBCI), as well as more complex games. While our current algorithm only parses signals into two inputs, we would like to extract more inputs in the future to carry additional information, and perhaps even support games that surpass the complexity of Pong.
