Inspiration
For people with motor disorders, using a mouse can be difficult. Eye-tracking technology makes computers more accessible, but it can be expensive or require extensive calibration. We wanted to build an eye-controlled mouse that is easy to set up, can be calibrated in a couple of minutes, and is accurate enough to be usable.
What it does
The application takes webcam input, extracts the pupil position at several calibration points, and fits a function that converts pupil positions to pixel locations on the screen. The mouse is then connected so that its movement is directly controlled by eye movement.
How we built it
Gaze Tracking
We first implemented our own gaze tracking using Haar cascades, a simple and fast algorithm for detecting eyes. However, we did not have time to optimize it for accuracy, so we switched to the open-source GazeTracking library (https://github.com/antoinelame/GazeTracking.git), which uses dlib to map the face and track the pupils.
Gaze to Mouse Mapping
Gaze-to-mouse mapping is based on a simple calibration routine. The user is asked to look at several dots on the screen for a few seconds each. The position of their pupils in the webcam frame is recorded, and the pupil positions and dot locations are fed into a mapping function.
We tested several methods of mapping gaze to mouse position, including linear and polynomial regression models, but these were too slow for real-time use. We ended up separating the x and y inputs and mapping them to screen x and y independently using two quadratic equations. This made the mapping fast while remaining accurate enough to follow the user's gaze, as long as they kept their head steady.
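A minimal sketch of this per-axis quadratic fit, using NumPy's `polyfit`. The calibration numbers below are hypothetical, and the function names are ours, not from the project's code:

```python
import numpy as np

def fit_axis(pupil_coords, screen_coords):
    """Least-squares fit of screen = a*p**2 + b*p + c for one axis."""
    return np.polyfit(pupil_coords, screen_coords, 2)

def gaze_to_pixel(coeffs, p):
    """Map a pupil coordinate to a screen coordinate with the fitted quadratic."""
    return np.polyval(coeffs, p)

# Hypothetical calibration data: pupil x-coordinates in the webcam frame,
# recorded while the user looked at dots with known screen x-coordinates.
pupil_x = [30.0, 35.0, 40.0, 45.0, 50.0]
screen_x = [0.0, 480.0, 960.0, 1440.0, 1920.0]

coeffs_x = fit_axis(pupil_x, screen_x)
print(round(float(gaze_to_pixel(coeffs_x, 40.0))))  # centre dot -> 960
```

The same fit is repeated for the y axis with its own calibration points, giving two small polynomials that are cheap to evaluate every frame.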
Improving accuracy
We applied a moving average over a window of three frames to smooth the mouse's movement. We found that this smoothed the cursor without introducing significant lag. Additionally, we split the screen into a grid of 20x20-pixel cells. This decreased the resolution the eye data had to map to, reducing the jitter of the mouse. We found that cell sizes between 20 and 40 pixels improved mouse movement.
Due to the limits of using a webcam, the mouse has significant jitter, making it hard to hover over and click any element on the screen. To overcome this, we tried snapping the mouse toward elements on the screen that the user may be interested in. We did this by opening a webpage, using Selenium to extract clickable elements (buttons, links, and search bars), and calculating the distance from the current x/y pixel estimate to each element on screen. If an element is within 100 pixels, the estimate is weighted toward that element. This decreased jitter near clickable elements and increased the likelihood of successfully clicking an item. More testing is needed to improve this functionality, so it was not included in the main application; it is included as a separate file called helperfunctions.
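The distance check and weighting might look like the sketch below. The 100-pixel radius is from the write-up; the pull strength and function names are assumptions, since the exact weighting scheme is not described:

```python
import math

SNAP_RADIUS = 100   # pixels, as described in the write-up
SNAP_WEIGHT = 0.5   # assumed pull strength toward the element

def snap_to_elements(x, y, elements):
    """Pull a gaze estimate toward the nearest clickable element within SNAP_RADIUS.

    `elements` is a list of (cx, cy) element centres, e.g. extracted with Selenium.
    """
    nearest = min(elements, key=lambda e: math.dist((x, y), e), default=None)
    if nearest is None or math.dist((x, y), nearest) > SNAP_RADIUS:
        return x, y
    # Weighted average between the raw estimate and the element centre.
    return (x + SNAP_WEIGHT * (nearest[0] - x),
            y + SNAP_WEIGHT * (nearest[1] - y))

buttons = [(200, 300), (800, 300)]
print(snap_to_elements(240, 330, buttons))  # 50 px from (200, 300) -> (220.0, 315.0)
print(snap_to_elements(500, 500, buttons))  # no element in range -> unchanged
```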
Challenges we ran into
It was hard to implement a pupil tracker of our own that worked well enough. The mouse was also quite jittery, because our transformation from pupil location to pixel location sometimes produced strange polynomial fits during calibration.
Accomplishments that we're proud of
We got a semi-working pupil tracker (which we ultimately didn't use, for accuracy reasons) and a really nice-looking UI! We also implemented a lot of different functionality in a pretty short time frame: (1) pupil tracking, (2) the UI, and (3) converting pupil inputs into direct mouse movements.
What we learned
My partner and I attended this as our first hackathon, and we learned a lot about version control and collaborating on GitHub, since neither of us used it that often. It was also my first time making a UI, which was really fun! When we tried to implement our own pupil tracker with Haar cascades, we ran into a lot of issues, but found out a lot about thresholding and the incremental "zooming in" approach to detecting pupils (face > eyes > pupil region). I also learned a lot about threading, since we had to set up a UI that changed while another method took measurements during the changes.
What's next for Eye'm Watching You - Eye Tracking to Mousepad
This project is a basis for a free, simple gaze controlled mouse. Future directions of the project could include:
- Apply filtering to decrease jitter
- Improve accuracy by snapping the mouse toward elements of importance on the screen
- Use a more sophisticated mapping function (ML models, CNNs, etc.)
- Track head position and rotation and account for head movement in the mapping function, so that the user can move their head without affecting the cursor
- Integrate a virtual keyboard
- Set up the system so it launches automatically and calibrates without requiring any physical input (currently calibration requires pressing the space bar)