Inspiration

The inspiration for our project comes from the inconvenience of having to use a mouse or trackpad. Time and time again, we've had our concentration broken by having to stop and fumble with the pointer to get it where we want. This made us stop and consider: are mice even necessary? The answer: no. We realized that through computer vision, we could move the pointer however we pleased. This project was also inspired by people who cannot use computers because they are unable to operate a mouse. These people, who may be affected by chronic conditions, should have the same access to technology as everyone else. Although this app does not give them complete control of their computers, we firmly believe it is a step in the right direction.

What it does

This program offers the core functionality of a computer mouse, but lets the user control it with just their eyes. It continuously monitors eye movement and tracks the relative position of the pupils, moving the cursor to match. When the user blinks, the mouse clicks.
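The write-up does not name the mouse-control library, so the following is only a minimal sketch assuming pyautogui for cursor movement and clicking; the normalized pupil position and the blink test (a simple eye-aspect-ratio threshold) are illustrative placeholders, not the project's actual code.

```python
# Sketch: turn a normalized pupil position (0..1 in each axis) into cursor
# movement, and fire a click when a blink is detected. The library choice and
# thresholds are assumptions, not taken from the uEye source.
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()

def update_cursor(pupil_x_norm, pupil_y_norm):
    """Map a normalized pupil position onto screen coordinates."""
    x = int(pupil_x_norm * SCREEN_W)
    y = int(pupil_y_norm * SCREEN_H)
    pyautogui.moveTo(x, y)

def eye_aspect_ratio(eye_points):
    """Ratio of eye height to width; it drops sharply when the eye closes."""
    p0, p1, p2, p3, p4, p5 = eye_points  # six landmark points around one eye
    vertical = abs(p1[1] - p5[1]) + abs(p2[1] - p4[1])
    horizontal = 2 * abs(p3[0] - p0[0])
    return vertical / horizontal if horizontal else 0.0

def maybe_click(eye_points, threshold=0.2):
    """Treat a closed eye (low aspect ratio) as a mouse click."""
    if eye_aspect_ratio(eye_points) < threshold:
        pyautogui.click()
```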

How we built it

The program is built primarily on three libraries: face_recognition, dlib, and OpenCV. face_recognition detects the face, dlib's facial landmarks are used to crop out the eyes, and the resulting frames are processed with OpenCV. The live video feed from the webcam is decomposed into individual frames, each of which is run through a Hough Circle Transform to find the most "pupil"-like circle. The program then takes a running average of the last 15 pupil positions and moves the mouse cursor to that point.
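As a rough illustration of that pipeline (a sketch only: the exact cropping, blurring, and HoughCircles parameters below are assumptions, not the values uEye uses):

```python
# Sketch of the pipeline: webcam frame -> eye landmarks -> eye crop ->
# Hough Circle Transform -> running average of the last 15 pupil centers.
# Parameter values are illustrative guesses, not uEye's actual settings.
from collections import deque

import cv2
import face_recognition
import numpy as np

recent_pupils = deque(maxlen=15)  # 15-frame running average from the write-up

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Locate eye landmarks (face_recognition wraps dlib's landmark model).
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    landmarks = face_recognition.face_landmarks(rgb)
    if not landmarks:
        continue
    eye = np.array(landmarks[0]["left_eye"], dtype=np.int32)

    # Crop a padded box around the eye and prepare it for circle detection.
    x, y, w, h = cv2.boundingRect(eye)
    pad = 5
    x0, y0 = max(x - pad, 0), max(y - pad, 0)
    eye_img = frame[y0:y + h + pad, x0:x + w + pad]
    gray = cv2.cvtColor(eye_img, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)

    # Hough Circle Transform: take the most "pupil"-like circle it finds.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=50, param2=15, minRadius=3, maxRadius=20)
    if circles is None:
        continue
    cx, cy, r = circles[0][0]

    # Smooth by averaging the last 15 detections before moving the cursor.
    recent_pupils.append((x0 + cx, y0 + cy))
    avg_x = sum(p[0] for p in recent_pupils) / len(recent_pupils)
    avg_y = sum(p[1] for p in recent_pupils) / len(recent_pupils)
    # ...map (avg_x, avg_y) to screen coordinates and move the cursor here...
```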

Challenges we ran into

The hardest part was identifying the pupil reliably, because the eye region is highly irregular. When we first attempted to detect circles, a frame containing just one eye produced far too many candidate circles. Once we settled on our final algorithm, we were able to pick out the pupil consistently.
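One common way to cut down the candidates (shown below as an assumption about the kind of filtering involved, not the project's exact fix) is to tighten the Hough parameters and then score the survivors by how dark their interior is, since the pupil is usually the darkest circle in the eye crop:

```python
# Sketch: reduce spurious Hough circles by tightening the radius bounds and
# accumulator threshold, then keep the darkest candidate (pupils are dark).
# All parameter values here are illustrative, not uEye's actual settings.
import cv2
import numpy as np

def most_pupil_like_circle(gray_eye):
    circles = cv2.HoughCircles(gray_eye, cv2.HOUGH_GRADIENT, dp=1, minDist=15,
                               param1=60, param2=20,  # higher param2 -> fewer, stronger circles
                               minRadius=4, maxRadius=18)
    if circles is None:
        return None

    best, best_darkness = None, float("inf")
    for cx, cy, r in circles[0]:
        # Mean intensity inside the circle: the pupil should be the darkest.
        mask = np.zeros_like(gray_eye)
        cv2.circle(mask, (int(cx), int(cy)), int(r), 255, -1)
        darkness = cv2.mean(gray_eye, mask=mask)[0]
        if darkness < best_darkness:
            best, best_darkness = (int(cx), int(cy), int(r)), darkness
    return best
```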

Accomplishments that we're proud of

We are happy to have made something that is both low-cost and sustainable. Only two things are needed to make this a reality: your eyes and a functioning webcam. Industrial and professional cameras use infrared rays that can be substantially damaging to the eyes. We are also proud that our algorithm is not discriminatory: eye color, pupil size, and everything else are irrelevant; all that matters is that there is a pupil.

What we learned

We learned how to use OpenCV and gained a lot of familiarity with it. OpenCV is a very powerful tool that was completely new to us at the start, but we were able to use it successfully. We also learned a lot about the eye. We use our eyes for most of the day, yet we rarely notice how they are a precise and imprecise tool at the same time. We often take for granted how quickly we can move our eyes back and forth without pausing to think about it.

What's next for uEye

In the future, we hope to make a deployable version of uEye that relies on fewer libraries and is therefore easier for the average person to set up. Alongside this, we hope to add new features such as scrolling and to smooth out the cursor motion driven by the eye; one possible smoothing approach is sketched below.
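One candidate for that smoothing (purely a suggestion, not something uEye currently implements) is an exponential moving average layered on top of the existing 15-frame running average, which damps jitter without adding much lag:

```python
# Sketch: exponential moving average to damp cursor jitter.
# The smoothing factor is a hypothetical starting point, not a tuned value.
class CursorSmoother:
    def __init__(self, alpha=0.25):
        self.alpha = alpha  # smaller alpha -> smoother but laggier
        self.x = None
        self.y = None

    def update(self, raw_x, raw_y):
        """Blend the newest pupil-derived position with the previous output."""
        if self.x is None:
            self.x, self.y = raw_x, raw_y
        else:
            self.x = self.alpha * raw_x + (1 - self.alpha) * self.x
            self.y = self.alpha * raw_y + (1 - self.alpha) * self.y
        return self.x, self.y
```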

Overall, we really feel that people will be able to use uEye as an easy alternative to the mouse, whether out of necessity or by choice.

Built With

face_recognition, dlib, opencv, python
