Quadriplegia is paralysis that impairs movement in all four limbs and often the torso, a condition frequently caused by spinal cord injury or muscular dystrophy. Making the condition more tragic, the eye-tracking systems that enable even the most basic requirement for independence, movement, are prohibitively expensive.
What it does
We built a wheelchair attachment that adds eye-tracking control to any motorized wheelchair. By tracking the user's gaze and mapping each gaze direction to joystick coordinates, the attachment physically moves the chair's joystick to produce the desired movement.
Specifically, the system is designed to move the wheelchair left and right when the user looks left and right respectively, move the wheelchair forward when the user looks down, and stop when the user looks up. Looking straight ahead has no effect on the movement of the wheelchair.
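The gaze-to-movement mapping above can be sketched as a small lookup. The direction labels and single-byte command codes here are illustrative placeholders, not our exact protocol:

```python
# Hypothetical mapping from a classified gaze direction to a one-byte
# wheelchair command. Labels and command bytes are illustrative only.
def gaze_to_command(direction: str) -> bytes:
    mapping = {
        "left": b"L",    # look left  -> steer left
        "right": b"R",   # look right -> steer right
        "down": b"F",    # look down  -> drive forward
        "up": b"S",      # look up    -> stop
        "center": b"N",  # look straight ahead -> no change
    }
    # Unknown directions fall back to "no change" for safety.
    return mapping.get(direction, b"N")
```

Falling back to "no change" on any unrecognized input keeps a misclassified frame from moving the chair unexpectedly.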
We also built a vehicle to model the wheelchair movement.
How we built it
We used a webcam and several computer vision techniques to detect where the user is looking. We then converted that data into a movement instruction and delivered it wirelessly over a TCP socket we programmed, from the laptop running the webcam to a microcontroller that drives micro servos to reposition the motorized wheelchair's joystick.
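The laptop-side sender can be sketched in a few lines of Python. The host and port here are placeholders for the microcontroller's address on the shared network, not our actual configuration:

```python
import socket

# Minimal sketch of the laptop-side TCP sender. The default host/port are
# hypothetical; in practice they point at the microcontroller's listener.
def send_command(cmd: bytes, host: str = "192.168.4.1", port: int = 8080) -> None:
    # Open a short-lived connection and push the command bytes.
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(cmd)
```

A per-command connection like this is simple to debug; a long-lived connection would cut latency but needs reconnect handling when the network drops.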
Challenges we ran into
We had to custom-design a mount with a joystick and three micro servos that move in accordance with the user's eye-tracking data. Additionally, we had to learn from scratch how to wire and use an H-bridge when wiring the electronics for the drivetrain (which simulates the motorized wheelchair).
Another big challenge was creating the eye-tracking algorithm itself. We drew inspiration from GitHub and YouTube tutorials, then developed various thresholds and normalization factors to reliably locate the center of the iris relative to the eye, which determines the user's gaze direction.
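The normalize-then-threshold step can be sketched as follows. The threshold values and the assumption of an axis-aligned eye bounding box are illustrative; our actual factors were tuned by trial:

```python
# Sketch of the gaze classification step: given the iris center and the
# eye's bounding box (e.g. from a face-mesh model), normalize the iris
# position and threshold it into a direction. Thresholds are illustrative.
def classify_gaze(iris_x: float, iris_y: float,
                  eye_left: float, eye_top: float,
                  eye_width: float, eye_height: float,
                  h_thresh: float = 0.35, v_thresh: float = 0.35) -> str:
    # Normalize the iris center to [-1, 1] within the eye region,
    # so thresholds are independent of face size and camera distance.
    nx = (iris_x - eye_left) / eye_width * 2.0 - 1.0
    ny = (iris_y - eye_top) / eye_height * 2.0 - 1.0
    if nx < -h_thresh:
        return "left"
    if nx > h_thresh:
        return "right"
    if ny > v_thresh:
        return "down"   # image y grows downward
    if ny < -v_thresh:
        return "up"
    return "center"
```

Normalizing before thresholding is what makes the classifier robust to the user sitting closer to or farther from the webcam.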
We also ran into issues integrating the eye tracking with our TCP stream. After multiple tests, a system restart, network changes, and tweaks to our basic socket code, our Arduino was able to read in byte strings generated from the eye-tracking data.
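While debugging the integration, it helps to emulate the microcontroller's receive loop on a laptop so the sender can be tested end to end without hardware. This is a hypothetical debug harness using the same one-byte-per-command framing assumed above:

```python
import socket

# Debug harness: emulate the microcontroller's receive loop so the
# eye-tracking sender can be tested without the Arduino. The single-byte
# command framing is an assumption for illustration.
def run_receiver(port: int = 8080, max_commands: int = 10) -> list:
    commands = []
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("0.0.0.0", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while len(commands) < max_commands:
                data = conn.recv(1)   # one command byte at a time
                if not data:          # sender closed the connection
                    break
                commands.append(data)
    return commands
```

Swapping this harness in for the Arduino made it much easier to tell whether a failure was in the tracking code, the socket code, or the hardware.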
Accomplishments that we're proud of
We're proud of the effectiveness of our eye-tracking algorithm (over 85% accuracy in our trials). Given the tight time constraints, we were also very happy with how our custom-cut cardboard controller turned out, along with the wiring of our drivetrain; both were difficult feats.
What we learned
From wiring an H-bridge to understanding facial meshing for iris instance segmentation, we learned a breadth of skills on both the hardware and software sides of computer engineering. More specifically, we learned the computer vision process behind facial meshing, how to use sockets to transmit data over TCP, and various other communication protocols.
What's next for Eye Tracking Wheelchair
Going forward, we want to integrate this concept with a real electric wheelchair that supports a person's weight so we can run real-world human trials and improve our design based on user feedback. We will work with experts in mobility aids and rehabilitation engineering to ensure the design meets the needs of individuals with mobility impairments. Ultimately, our goal is a reliable system that gives those individuals greater independence and freedom of movement.