Inspiration
I wanted to make computer interaction more natural and touch-free. The original idea was to turn any laptop screen into a touchscreen, but for logistical reasons it evolved into controlling the screen virtually with hand gestures.
What it does
It lets users control the device with natural hand gestures:
- Move the cursor with the index finger
- Take a screenshot by forming a fist
- Left-click by pinching the thumb and middle finger
- Right-click by pinching the thumb and ring finger
- Scroll with the index and middle fingers together
- Draw on the screen with the index finger
How we built it
I built it in Python using MediaPipe for real-time hand tracking, OpenCV for camera input and display, and PyAutoGUI to control mouse and keyboard actions. Gesture logic is handled through landmark distance checks and thresholds.
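The distance-and-threshold logic can be sketched as a small pure-Python function. The landmark indices below match MediaPipe's hand model (4 = thumb tip, 8 = index tip, 12 = middle tip, 16 = ring tip); the threshold value and the `detect_gesture` function name are illustrative, not the project's exact code:

```python
import math

# MediaPipe hand-landmark indices for the fingertips used by the gestures.
THUMB_TIP, INDEX_TIP, MIDDLE_TIP, RING_TIP = 4, 8, 12, 16

def distance(a, b):
    """Euclidean distance between two (x, y) landmarks in normalized [0, 1] coords."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def detect_gesture(landmarks, pinch_threshold=0.05):
    """Map a list of 21 (x, y) landmark tuples to a gesture name.

    The threshold is a placeholder; in practice it needs tuning per
    camera resolution and hand distance.
    """
    if distance(landmarks[THUMB_TIP], landmarks[MIDDLE_TIP]) < pinch_threshold:
        return "left_click"   # thumb pinched against middle finger
    if distance(landmarks[THUMB_TIP], landmarks[RING_TIP]) < pinch_threshold:
        return "right_click"  # thumb pinched against ring finger
    return "move_cursor"      # default: index finger drives the cursor
```

In the real pipeline the landmarks come from MediaPipe's per-frame detection, and the returned gesture name would dispatch to PyAutoGUI calls such as `pyautogui.click()` or `pyautogui.moveTo()`.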
Challenges we ran into
Drawing on the screen was a challenge because the index finger had to perform two tasks at once: moving the cursor and drawing.
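One generic way to resolve a finger doing double duty is an explicit mode flag flipped by an otherwise unused gesture (the toggle gesture and class name below are hypothetical, not the project's final rule), so the index finger drives only one task per frame:

```python
class PointerState:
    """Tracks whether the index finger is currently drawing or moving the cursor."""

    def __init__(self):
        self.drawing = False
        self._toggle_armed = True  # debounce: require release before re-toggling

    def update(self, toggle_pinched):
        """Call once per frame with whether the toggle gesture is held.

        Returns the current mode: 'draw' or 'cursor'. Holding the toggle
        gesture across frames flips the mode only once.
        """
        if toggle_pinched and self._toggle_armed:
            self.drawing = not self.drawing
            self._toggle_armed = False
        elif not toggle_pinched:
            self._toggle_armed = True
        return "draw" if self.drawing else "cursor"
```

The debounce flag matters because gestures persist for many camera frames; without it, a single pinch would flip the mode dozens of times.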
Accomplishments that we're proud of
We produced a working MVP that supports all of the gestures above in real time.
What we learned
We learned practical computer vision integration, landmark-based gesture detection, real-time input control, threshold tuning, and how to connect vision models with OS-level automation.
What's next for AirTouch
Better gesture accuracy, more gestures, and the ability for users to map custom actions to gestures.