Inspiration
Two members of our group have focused on bioengineering this year in their classes and research, one has focused on machine learning, and another has extensive PCB design experience and knowledge of wearable patches. We decided to combine our team's skill sets to tackle a problem that touches both physical and mental health. There are many ways to control a wheelchair: joystick, switches, keyboard, chin control, sip & puff, etc. However, these methods can be strenuous and difficult for people who are paralyzed from the waist down or have cerebral palsy. That is how we formed the idea of a wearable patch that uses eye movement to control a wheelchair, with a focus on building a machine-learning model that can classify eye movement.
What it does
We created a wearable patch system that uses eye movement to control a wheelchair. Patches to the left and right of the eyes and on the forehead use electrodes to gather EOG and EMG signals. The left and right patches collect EOG signals, which control the left and right movement of the wheelchair: when the user looks left, the wheelchair moves left; when the user looks right, it moves right. The forehead patch collects EMG signals from the muscles around the eyes, such as blinking. When the user blinks a few times quickly, this toggles the wheelchair on or off. When it is on and the user is looking straight ahead, the wheelchair moves forward; when it is off, the wheelchair stops/brakes. We used an nRF microcontroller for the patches and an ADS1299 analog front end to collect the EMG and EOG signals. We also added Bluetooth to the patches so they can send the signal data to the wheelchair's motors.
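The control logic above can be sketched as a tiny state machine. This is an illustrative sketch, not our firmware: the class and method names are hypothetical, and it assumes the signal chain has already classified each window as a gaze direction or a blink burst.

```python
# Hedged sketch of the control rules described above: a blink burst toggles
# the system on/off; gaze direction maps to a drive command only while on.
class WheelchairController:
    def __init__(self):
        self.enabled = False  # starts off; toggled by a few quick blinks

    def on_blink_burst(self):
        # Rapid blinks (EMG from the forehead patch) toggle on/off.
        self.enabled = not self.enabled

    def command_for(self, gaze):
        # gaze is one of "left", "right", "center" (from the EOG classifier).
        if not self.enabled:
            return "stop"
        return {"left": "turn_left",
                "right": "turn_right",
                "center": "forward"}[gaze]
```

For example, `command_for("left")` returns `"stop"` until a blink burst enables the system, after which it returns `"turn_left"`.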
How we built it
We developed a machine learning model to classify left, right, and center eye movements. Because we did not have the time or means to test this on an actual wheelchair, we used an Arduino to read the eye movement data (left, right, center), which controlled corresponding LEDs on a breadboard as well as corresponding changes in wheel direction.
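A minimal Keras sketch of the kind of classifier described above is shown below. The window length, channel count, and layer sizes are assumptions for illustration, not the exact model we trained; the dummy data only demonstrates the shape of the training call.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

WINDOW = 250   # assumed: 1 s of samples at 250 Hz (an ADS1299 default rate)
CHANNELS = 3   # left-eye, right-eye, and forehead electrodes
CLASSES = 3    # left, right, center

# Small 1-D CNN over windows of raw EOG/EMG samples.
model = tf.keras.Sequential([
    layers.Input(shape=(WINDOW, CHANNELS)),
    layers.Conv1D(16, 7, activation="relu"),
    layers.MaxPooling1D(4),
    layers.Conv1D(32, 5, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy data just to show the fit/predict call shapes.
X = np.random.randn(32, WINDOW, CHANNELS).astype("float32")
y = np.random.randint(0, CLASSES, size=32)
model.fit(X, y, epochs=1, verbose=0)
```

The softmax output gives one probability per class, so the predicted direction for a window is simply the argmax over the three classes.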
Challenges we ran into
We had issues making our proof of concept work. In particular, we had trouble getting the motors to respond correctly through the Arduino while it was taking in the eye movement detection data.
Accomplishments that we're proud of
We are proud that we completed a basic machine learning model for eye movement detection. We are also proud that we were able to prove the feasibility of our idea with a proof of concept, and that we designed a basic eye patch in Altium that can collect EOG and EMG signals.
What we learned
We learned a lot from this project. We learned how to create a machine learning model to classify eye movements, and how to build a proof of concept of our wheelchair using an Arduino that reads the classified eye movements (left, right, center) and drives corresponding LEDs on a breadboard and changes in wheel direction. We also learned a lot about the physical and mental health of wheelchair users with limited mobility, as well as how to design the eye patches on a PCB and which specific components would go on them.
What's next for EZ-Travel
If we had more time, we would implement this idea in real life: order the flexible PCBs, fabricate the electrodes, and connect the whole system to an actual wheelchair. We would also train the eye-tracking model on a larger dataset. We mentioned using Bluetooth to send the signal data to the motors that control the wheelchair; we would have liked to get this working as well, especially the translation from the eye muscle signals to the actual movement of the wheelchair.
Built With
- altium
- keras
- machine-learning
- tensorflow