Inspiration
I got the idea from an Open Source Sunday challenge at slingshot, so I started learning OpenCV and built this project.
What it does
It detects your hand using the OpenCV and MediaPipe libraries, determines the hand gesture from the hand landmark data with an algorithm I wrote, and then executes the corresponding task.
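A minimal sketch of the detection loop, assuming a standard OpenCV webcam capture and MediaPipe's Hands solution (the classify_gesture call is a hypothetical placeholder for the gesture algorithm described below):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# Open the default webcam and run MediaPipe's hand landmark model on each frame.
cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            landmarks = results.multi_hand_landmarks[0].landmark  # 21 hand landmarks
            # gesture = classify_gesture(landmarks)  # placeholder for the gesture logic
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```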
How we built it
I wrote the basic pipeline using Python machine learning libraries. Then I sorted the landmark data, which is given as a JSON file, and determined for each finger whether it is open or closed. Finally, I used the pynput library to execute each task through keyboard shortcuts.
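One way to express that per-finger check is to compare each fingertip landmark with the joint below it; the landmark indices follow MediaPipe's hand model, but the gesture-to-shortcut mapping shown here is only illustrative, not my exact rules:

```python
from pynput.keyboard import Controller, Key

keyboard = Controller()

# MediaPipe hand landmark indices: (fingertip, joint below it) for each finger.
FINGERS = {"index": (8, 6), "middle": (12, 10), "ring": (16, 14), "pinky": (20, 18)}

def finger_states(landmarks):
    """Return which fingers are open, assuming an upright hand (tip above the joint)."""
    return {name: landmarks[tip].y < landmarks[pip].y
            for name, (tip, pip) in FINGERS.items()}

def run_task(states):
    # Illustrative mapping only: index finger extended alone triggers a Ctrl+C shortcut.
    if states["index"] and not any(states[f] for f in ("middle", "ring", "pinky")):
        with keyboard.pressed(Key.ctrl):
            keyboard.press("c")
            keyboard.release("c")
```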
Challenges we ran into
The biggest challenge was making it accurate and preventing the same task from executing multiple times. The code runs in a loop so that I can collect data from the live webcam, so a held gesture kept triggering its task repeatedly, and I had to add verification and confirmation checks before a task fires to solve the issue.
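One simple way to express that confirmation idea is to fire a task only after the same gesture has been seen for several consecutive frames, then lock until the gesture changes; this is a sketch of the approach, not my exact code:

```python
REQUIRED_FRAMES = 10  # consecutive frames that must agree before a task fires

last_gesture = None
stable_count = 0
fired = False

def confirm(gesture):
    """Return True at most once per sustained gesture, after it has been stable."""
    global last_gesture, stable_count, fired
    if gesture == last_gesture:
        stable_count += 1
    else:
        last_gesture, stable_count, fired = gesture, 1, False
    if stable_count >= REQUIRED_FRAMES and not fired:
        fired = True
        return True
    return False
```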
Accomplishments that we're proud of
It runs smoothly, and I made it so that anyone with the mentioned Python libraries and a Python editor (I used Sublime Text) can run my code on their device and it should work just fine.