Inspiration
With the recent iOS 18 update, Apple added an Eye Tracking accessibility feature that lets users with motor impairments navigate their devices by emulating inputs with eye movement. I thought this feature was awesome; it shows how impactful advanced technology can be in making tech accessible to everyone.
However, while Eye Tracking is available for phones and tablets, this amazing feature does not exist for computers and laptops. This project provides first-of-its-kind accessibility support for motor function impairments on desktop devices. While other software tools exist for visual or hearing impairments, software for motor function impairments is nearly unheard of: one would have to purchase extra hardware (like expanded keyboards or adaptive switches) just to use something we all take for granted. Not anymore!
What it does
Easy Ice Cream Vision runs in the background of the user's device. It records the webcam feed and tracks the user's eyes and pupils with a pretrained image processing model. Using the pynput library in Python, the program 1) emulates cursor movement in the direction of the user's gaze, 2) emulates clicks when the user blinks, and 3) emulates long presses and releases when the user winks.
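The three behaviors above amount to mapping per-frame eye measurements to input events. Here is a minimal pure-Python sketch of that mapping; the function name, the eye-openness ratios, and the 0.2 threshold are my own illustrative assumptions, not the project's actual values:

```python
def classify_frame(left_open, right_open, closed_thresh=0.2):
    """Map per-frame eye-openness ratios (0 = fully closed, 1 = fully open)
    to an input event. The threshold value here is illustrative only."""
    left_closed = left_open < closed_thresh
    right_closed = right_open < closed_thresh
    if left_closed and right_closed:
        return "click"  # both eyes closed: a blink triggers a click
    if left_closed or right_closed:
        return "hold"   # one eye closed: a wink triggers a long press
    return "move"       # both eyes open: keep following the gaze
```

In the actual program, each returned event would then be dispatched through pynput (for example, a left-button click for a blink).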
It is not necessary to have the window open while using your device. Easy Ice Cream Vision quietly responds to eye movement while you use your favorite desktop applications.
Note that the webcam recording is mirrored, so in the demo the cursor appears to move in the opposite direction from where I'm looking.
How I built it
This program was independently developed and is written entirely in Python 3. The core of the software is the image processing library GazeTracking, which I modified locally for my own purposes. Cursor and click emulation is performed with the pynput library.
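The overall loop might look something like the sketch below. The GazeTracking calls (`refresh`, `is_blinking`, `horizontal_ratio`, `vertical_ratio`) and the pynput `Controller` API are real, but the `gaze_to_delta` helper, its `gain` constant, and the mirroring fix are my own assumptions about how the pieces could fit together, not the project's actual code:

```python
def gaze_to_delta(h_ratio, v_ratio, gain=40.0):
    """Convert GazeTracking's normalized gaze ratios (0.0 to 1.0, with 0.5
    meaning centered) into a signed cursor displacement in pixels.
    `gain` is an assumed tuning constant, not a value from the project."""
    if h_ratio is None or v_ratio is None:
        return 0, 0  # pupils were not detected on this frame
    dx = (0.5 - h_ratio) * 2 * gain  # flipped: the webcam image is mirrored
    dy = (v_ratio - 0.5) * 2 * gain
    return int(dx), int(dy)


def main():
    # Third-party imports kept local so the helper above imports on its own.
    import cv2
    from gaze_tracking import GazeTracking
    from pynput.mouse import Button, Controller

    gaze = GazeTracking()
    mouse = Controller()
    webcam = cv2.VideoCapture(0)

    while True:
        ok, frame = webcam.read()
        if not ok:
            break
        gaze.refresh(frame)  # run pupil detection on this frame
        if gaze.is_blinking():
            mouse.click(Button.left)  # blink -> click
        else:
            dx, dy = gaze_to_delta(gaze.horizontal_ratio(),
                                   gaze.vertical_ratio())
            mouse.move(dx, dy)  # nudge the cursor toward the gaze


if __name__ == "__main__":
    main()
```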
Challenges I ran into
The hardest challenge in this project was tuning the sensitivity thresholds for detecting the different eye movements and actions. I did not have enough time to gather and label data frame by frame and train a supervised decision tree classifier to choose these thresholds for me, so I experimented with different values by hand.
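Had labeled frames been available, even a simple one-dimensional threshold sweep (a much cruder stand-in for the decision tree approach mentioned above) could have picked a threshold automatically. The helper and the sample data below are hypothetical:

```python
def best_threshold(samples, candidates):
    """Pick the eye-openness threshold that classifies the most labeled
    frames correctly. `samples` is a list of (openness_ratio, is_blink)
    pairs. Purely illustrative: the project tuned this value by hand."""
    def accuracy(t):
        correct = sum((ratio < t) == is_blink for ratio, is_blink in samples)
        return correct / len(samples)
    return max(candidates, key=accuracy)


# Hypothetical labeled frames: low openness ratios are blinks.
samples = [(0.1, True), (0.15, True), (0.6, False), (0.8, False)]
threshold = best_threshold(samples, [0.05, 0.3, 0.7])  # -> 0.3
```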
Accomplishments that I'm proud of
For the time and resources available to me, I am quite proud of how accurate the eye tracking aspect of the project is! I think this type of technology, although far from fully developed, could help a lot of people if more experienced engineers than myself were to build upon my idea.
What I learned
Learning, understanding, and implementing a complicated library like GazeTracking was definitely a challenge for me, especially since I chose to edit the source code of the library itself (locally) for my own application. Professionally developed products have so many nuances that I have yet to fully understand.
What's next for Easy Ice Cream Vision
Making the eye detection and behavioral classification more accurate!