Inspiration
I was inspired to build this program while playing around with the various functions of the MediaPipe pipeline. I wanted to make computing more accessible to more people.
What it does
Goldeneye lets a person use their computer without their hands or mouth: it is controlled entirely by moving the head and eyes.
How we built it
Goldeneye uses iris tracking to detect a person's gaze, then uses pyautogui to move the mouse cursor. The movement math enforces a dead-zone, so small involuntary eye movements don't jitter the cursor, while still making it easy for the cursor to travel long distances.
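A dead-zone plus a super-linear gain curve is one common way to get both properties. The sketch below is illustrative only: the function name, thresholds, and gain values are assumptions, not the project's actual calibration constants.

```python
import math

def gaze_to_cursor_delta(dx, dy, dead_zone=0.05, gain=800.0, power=2.0):
    """Map a normalized gaze offset from screen center (dx, dy in [-1, 1])
    to a cursor movement in pixels.

    Offsets inside the dead-zone produce no movement, so eye jitter keeps
    the cursor still; beyond it, movement grows super-linearly so large
    offsets cover long distances quickly. All parameter values here are
    hypothetical, chosen only to demonstrate the idea.
    """
    magnitude = math.hypot(dx, dy)
    if magnitude < dead_zone:
        return 0.0, 0.0
    # Rescale so movement starts smoothly at the dead-zone edge,
    # then apply a power curve for fast long-range travel.
    scaled = ((magnitude - dead_zone) / (1.0 - dead_zone)) ** power
    step = gain * scaled
    # Preserve the gaze direction while applying the new magnitude.
    return step * dx / magnitude, step * dy / magnitude
```

In a tracking loop, the returned delta could then be fed to something like `pyautogui.moveRel(*gaze_to_cursor_delta(dx, dy))`.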
Challenges we ran into
The initial calibration of the program is rather tricky, and the limited zoom and resolution of my laptop webcam make it difficult to be precise.
Accomplishments that we're proud of
I learned how to persevere through an army of bugs and actually finish the product.
What we learned
I learned how to use OpenCV and pyautogui, the fundamentals behind MediaPipe, and various image-classification algorithms.
What's next for Goldeneye
A more sophisticated camera system, so the program can be controlled without moving the head at all, and a voice-assistant system to make it completely hands-free.