Inspiration
We were inspired by elderly individuals who often face limitations when it comes to technology and physical activity. Many of them find traditional controllers difficult to use, and staying active can be a challenge. We wanted to create a fun, accessible way for seniors — and anyone — to stay engaged and move their bodies through games that require only a webcam and natural hand gestures.
What it does
Hands On transforms a simple webcam into a motion-sensing game controller. Players can interact with games like Vegetable Samurai (a Fruit Ninja-style game) and Flailing Goose (a Flappy Bird-style game) using only their hand gestures — no extra hardware or buttons required. This makes it not only accessible to people of all ages, especially seniors, but also eco-friendly by reducing the need for plastic gaming peripherals that often end up in landfills. It’s fun, intuitive, and encourages physical activity in a playful way.
How we built it
We built our games using Pygame and integrated real-time hand tracking using MediaPipe, a powerful computer vision library by Google. MediaPipe’s AI-based hand landmark detection enables the webcam to recognize and track hand gestures accurately. We used GitHub for version control and collaborative development, allowing us to experiment safely and track progress efficiently.
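As an illustrative sketch of the gesture-to-action mapping (not the project's exact code), the snippet below interprets MediaPipe-style hand landmarks as a simple "flap" gesture. MediaPipe Hands reports 21 landmarks per hand in normalized image coordinates, with index 0 as the wrist and index 8 as the index fingertip; the function name and threshold here are our own assumptions for the example.

```python
# MediaPipe Hands returns 21 (x, y) landmarks in normalized image
# coordinates (0..1, with y increasing downward).
# Landmark 0 is the wrist; landmark 8 is the index fingertip.
WRIST, INDEX_TIP = 0, 8

def detect_flap(landmarks, margin=0.15):
    """Return True when the index fingertip is raised well above the
    wrist -- a simple stand-in for a 'flap' gesture.

    `landmarks` is a list of 21 (x, y) tuples; `margin` is an
    illustrative sensitivity threshold.
    """
    wrist_y = landmarks[WRIST][1]
    tip_y = landmarks[INDEX_TIP][1]
    # Smaller y means higher on screen, so a raised fingertip
    # has tip_y noticeably less than wrist_y.
    return (wrist_y - tip_y) > margin
```

Inside the Pygame loop, the boolean result would then drive the player sprite (e.g. trigger a jump), keeping the vision code cleanly separated from the game logic.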
Challenges we ran into
One of the major challenges was optimization and performance — Python isn’t always the fastest for real-time processing, especially with video and AI workloads. Additionally, most of us had little to no experience with computer vision or AI models, so implementing MediaPipe and integrating it with a game engine like Pygame required a lot of learning on the go. Debugging gesture recognition alongside game logic was especially tricky.
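One common way to soften this kind of real-time bottleneck (a hedged sketch of the general technique, not necessarily the fix we shipped) is to run capture and detection on a worker thread and let the game loop poll only the most recent result, so a slow detection frame never stalls rendering:

```python
import queue
import threading

# Holds at most the single latest gesture result, so the game loop
# never blocks waiting on a slow detection frame.
latest = queue.Queue(maxsize=1)

def vision_worker(detect, stop):
    """Stand-in for the camera + detection loop.

    `detect()` would grab a webcam frame and return the recognized
    gesture (or None); here it is an injected placeholder.
    """
    while not stop.is_set():
        gesture = detect()
        # Drop any stale result before publishing the new one.
        try:
            latest.get_nowait()
        except queue.Empty:
            pass
        latest.put(gesture)

def poll_gesture():
    """Called once per game tick; non-blocking."""
    try:
        return latest.get_nowait()
    except queue.Empty:
        return None
```

With this split, the Pygame loop stays at a steady frame rate and simply acts on whatever gesture was most recently seen.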
Accomplishments that we're proud of
- We successfully built and demoed a fully working prototype.
- We went from zero knowledge of hand-tracking AI to implementing a real-time, camera-based control system.
- Our team collaborated effectively to learn, adapt, and deliver a complete solution in a short time frame.
What we learned
We learned how to integrate AI-powered computer vision into real-world applications, especially in the context of gaming and accessibility. We also developed a deeper understanding of gesture recognition, event-driven programming, and team-based software development using GitHub. Most importantly, we learned how to take an idea from concept to reality through rapid learning and experimentation.
What's next for Hands On
- Improve accuracy and responsiveness by switching to faster languages or optimizing our current Python code.
- Upgrade the AI models for better gesture recognition under varied lighting and background conditions.
- Expand the game library to include more fitness-focused or memory-enhancing games.
- Make it fully cross-platform so it can run smoothly on everything from laptops to smart TVs.
- Collaborate with accessibility experts to tailor the experience further for senior and differently-abled users.