Inspiration
UB Talker is an ongoing research project at UB headed by Dr. Kris Schindler. Its goal is to give people with ALS or similar disabilities the most effective method of communication currently possible. We see the ideal as thought to speech: using BCI (Brain Computer Interface) technology, a computer would recognize the letter or word a person is thinking and speak it aloud. We are taking the first steps with the BCI now, learning how the system and the brain work so we can reach that goal.
What it does
It allows people with ALS or similar disabilities to speak with minimal effort. Using the BCI to trigger a mouse click in our click-activated autoscan software, they can talk with their family and friends again.
How I built it
The autoscan software was created before the hackathon by members of the research team. The Python code for the BCI (Brain Computer Interface) is based on the Python SDK for OpenBCI. I extended the SDK to process the serial data live, detect eye blinks in the signal, and trigger a mouse click.
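The blink-detection idea can be sketched as a simple amplitude threshold with a debounce period. This is a minimal, self-contained illustration rather than the project's actual code: the real system reads live serial data through the OpenBCI SDK, and the threshold and refractory values here are hypothetical.

```python
THRESHOLD_UV = 100.0  # amplitude (in microvolts) above which we call it a blink (assumed value)
REFRACTORY = 50       # samples to ignore after a detection, so one blink counts once

def detect_blinks(samples):
    """Return the indices at which a blink is detected.

    A blink shows up as a large-amplitude artifact on frontal EEG
    channels, so a threshold with a refractory period is enough
    for a first pass.
    """
    blinks = []
    cooldown = 0
    for i, value in enumerate(samples):
        if cooldown > 0:
            cooldown -= 1
            continue
        if abs(value) > THRESHOLD_UV:
            blinks.append(i)
            cooldown = REFRACTORY
    return blinks
```

In the live system, each detected blink would fire a mouse click through an OS-level input library, which the autoscan software then registers as a selection.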
Challenges I ran into
Learning how the SDK works and what each script is for. I also ran into issues figuring out how the OpenBCI streams its data and how to filter it.
Accomplishments that I'm proud of
I'm proud that I was able to make this work. I have seen ALS firsthand and I know how difficult even the simplest tasks can be. Not having to move to communicate is much easier on people with ALS.
What I learned
I learned how to process basic muscle signals and how to use the OpenBCI.
What's next for UB Talker
UB Talker is always moving toward that goal. We completed the first step with the BCI today by successfully recognizing eye blinks and making our companion software respond to them. Next is the motor cortex: our hope is that we can turn the thought of movement (e.g. moving your arm up) into cursor motion on the screen. That will require deep research into the inner workings of the brain, more advanced signal processing techniques, and machine learning. Finally, we hope to turn cognitive thought (e.g. thinking the letter 'a' or the word 'cat') into the speech of that letter or word.
Built With
- openbci
- python