We started with a simple idea: controlling a computer game by blinking. We soon realized that adding gaze tracking to control a mouse would make the tool far more useful to those who are unable to interact with computers in the conventional way. While thinking the idea through, a teammate recalled a scene from The Theory of Everything in which Stephen Hawking uses an e-tran board to communicate, and we saw an opportunity to digitize that tool.

What it does

U-Eye combines eye, pupil, and gaze tracking to accurately move the mouse around the screen. We also incorporate a lip-separation classifier to add click functionality, along with speech-to-text for greater flexibility and a digital e-tran board for fully hands-free human-computer interaction.
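To illustrate the click mechanism, here is a minimal sketch of how a lip-separation classifier can work, assuming a facial-landmark detector (such as dlib or MediaPipe) has already produced (x, y) points around the mouth. The landmark names, coordinates, and threshold below are illustrative, not U-Eye's actual values.

```python
# Sketch of a lip-separation "click" classifier. Assumes an upstream
# facial-landmark detector supplies mouth keypoints in pixel coordinates;
# the names and threshold here are hypothetical.

def mouth_aspect_ratio(upper_lip, lower_lip, left_corner, right_corner):
    """Vertical lip separation normalized by mouth width."""
    vertical = abs(lower_lip[1] - upper_lip[1])
    horizontal = abs(right_corner[0] - left_corner[0])
    return vertical / horizontal if horizontal else 0.0

def is_click(landmarks, threshold=0.35):
    """Treat deliberately parted lips as a mouse click."""
    mar = mouth_aspect_ratio(landmarks["upper_lip"], landmarks["lower_lip"],
                             landmarks["left_corner"], landmarks["right_corner"])
    return mar > threshold

# Closed mouth: small vertical separation, so no click.
closed = {"upper_lip": (50, 60), "lower_lip": (50, 62),
          "left_corner": (40, 61), "right_corner": (60, 61)}
# Deliberately parted lips: large vertical separation, so click.
parted = {"upper_lip": (50, 55), "lower_lip": (50, 67),
          "left_corner": (40, 61), "right_corner": (60, 61)}
```

Normalizing by mouth width makes the signal roughly invariant to how far the user sits from the camera.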

Challenges we ran into

The resolution of eye images from the camera was much lower than expected. This reduced the precision of our gaze tracking algorithm and made some applications non-viable. Additionally, we realized that blinking to click the mouse while also using gaze to move it was both uncomfortable and difficult. Our largest difficulty was converting pupil movements into on-screen cursor movements - the final link connecting computer vision and mouse control. No established algorithm for this step was publicly available, forcing us to build it from scratch.
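One minimal way to approach that final pupil-to-screen step is a calibration fit: the user fixates on a few known screen targets while pupil positions are recorded, and a least-squares affine map is then used for all subsequent frames. This is a sketch of that general idea, not U-Eye's exact algorithm; the calibration points below are hypothetical.

```python
import numpy as np

# Pupil-to-screen mapping via affine calibration (illustrative sketch).
# The user looks at known screen targets; we record pupil positions and
# solve screen = A @ [px, py, 1] for a 2x3 affine matrix A.

def fit_affine(pupil_pts, screen_pts):
    """Least-squares fit of a 2x3 affine map from pupil to screen coords."""
    P = np.hstack([np.asarray(pupil_pts, float),
                   np.ones((len(pupil_pts), 1))])   # (n, 3) homogeneous
    S = np.asarray(screen_pts, float)               # (n, 2) targets
    A, *_ = np.linalg.lstsq(P, S, rcond=None)       # (3, 2) solution
    return A.T                                      # (2, 3) affine matrix

def pupil_to_screen(A, pupil_xy):
    """Map one pupil position (camera pixels) to screen pixels."""
    px, py = pupil_xy
    return A @ np.array([px, py, 1.0])

# Hypothetical 4-corner calibration: pupil coords vs. 1920x1080 screen.
pupil = [(10, 10), (30, 10), (10, 25), (30, 25)]
screen = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]
A = fit_affine(pupil, screen)
```

An affine fit handles offset, scale, and mild skew between camera and screen axes; with more calibration points the same least-squares setup extends to polynomial terms for lens and eyeball curvature.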

Accomplishments that we're proud of

We are proud that we can track eyes, pupils, and gaze accurately enough to control a mouse with cheap, off-the-shelf hardware. Our team used new APIs, including IBM Watson Speech to Text and AWS Rekognition, for added functionality, and learned jQuery to handle the interface between a Flask server and our JavaScript and Python scripts.
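The Flask side of that jQuery-to-Python bridge can be sketched roughly as follows; the route name and JSON fields are illustrative, not U-Eye's actual API.

```python
# Illustrative sketch of a Flask endpoint receiving gaze-derived cursor
# positions from front-end JavaScript. Route and field names are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/move", methods=["POST"])
def move():
    """Accept a cursor position posted by the front end as JSON."""
    data = request.get_json()
    # In the real tool this step would drive the OS cursor
    # (e.g. via a library such as pyautogui).
    return jsonify(status="ok", x=data["x"], y=data["y"])

# The jQuery caller would look roughly like:
#   $.ajax({url: "/move", type: "POST", contentType: "application/json",
#           data: JSON.stringify({x: 960, y: 540})});
```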

What we learned

Our group studied state-of-the-art eye-tracking algorithms to develop our tool. Additionally, none of us had written a Chrome extension before - a task that proved more difficult than expected.

What's next for U-Eye

Our main focus is improving gaze tracking by continuing to iterate on our tracking algorithms. We also plan to make our Chrome extension and its associated scripts more robust and user-friendly. Due to time constraints we were unable to integrate our facial-emotion classifier into the demo; however, it works independently, and we plan to include it in the tool's analytics features.
