Inspiration ✨

As a teen, I’ve noticed that some skills can’t be learned just by watching a video — things like knot-tying, hand exercises, or basic motor training. They rely on muscle memory, not just instructions. That inspired me to build HandScape: a tool where you learn by doing, guided by your own hand movements in real time.

What it does 🖐️🎮

- Tracks your hand in real time using computer vision.
- Lets you interact with a gesture-controlled menu.
- Lets you create maps by recording hand movements.
- Lets you play maps, where the system checks whether your hand matches saved positions.
- Includes preloaded maps (like "Hello" or knot basics) so it feels like game levels.

👉 It's part tool, part game: built for accessibility, interactivity, and fun.
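The "does your hand match the saved position?" check can be sketched as a simple distance test over normalized landmark coordinates. This is a minimal illustration under my own assumptions (function name, threshold, and sample data are hypothetical, not HandScape's actual code):

```python
import math

def hand_matches(current, saved, threshold=0.05):
    """Return True when the live hand pose is close enough to a saved map frame.

    `current` and `saved` are lists of (x, y) landmark coordinates normalized
    to [0, 1], the convention MediaPipe uses. The pose "matches" when the
    mean Euclidean distance across landmarks is under `threshold`.
    """
    total = 0.0
    for (cx, cy), (sx, sy) in zip(current, saved):
        total += math.hypot(cx - sx, cy - sy)
    return total / len(current) < threshold

# Identical poses match; a clearly shifted pose does not.
pose = [(0.1, 0.2), (0.3, 0.4), (0.5, 0.6)]
shifted = [(x + 0.2, y) for x, y in pose]
print(hand_matches(pose, pose))     # True
print(hand_matches(pose, shifted))  # False
```

Averaging over all landmarks keeps one noisy fingertip from failing the whole pose, which matters when tracking jitters frame to frame.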

How I built it 🛠️

- MediaPipe → real-time hand landmark detection.
- OpenCV → computer vision pipeline.
- Pygame → clean, interactive UI.
- Custom map format → saves & loads hand movements for practice or replay.
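The custom map format isn't specified above; one plausible shape is a JSON file holding a named sequence of landmark frames. A hedged sketch (the file layout, field names, and helpers here are my assumptions, not the project's actual format):

```python
import json

def save_map(path, name, frames):
    """Write a recorded map: a name plus a list of landmark frames.

    Each frame is a list of [x, y] pairs (one per hand landmark),
    so the whole recording round-trips cleanly through JSON.
    """
    with open(path, "w") as f:
        json.dump({"name": name, "frames": frames}, f)

def load_map(path):
    """Read a map back as (name, frames)."""
    with open(path) as f:
        data = json.load(f)
    return data["name"], data["frames"]
```

A plain-JSON format like this makes maps easy to share and inspect, at the cost of larger files than a binary encoding.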

Challenges I ran into ⚡

- Achieving real-time accuracy without lag.
- Designing a gesture-based menu that feels natural.
- Balancing the hackathon timeline while implementing map creation + playback.
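One common trick for making a gesture menu feel natural is hysteresis: a "pinch click" engages below one distance threshold and releases only above a larger one, so the selection doesn't flicker when the fingers hover near the boundary. A small sketch (class name and thresholds are hypothetical, not HandScape's implementation):

```python
import math

class PinchDetector:
    """Detect a thumb-tip/index-tip pinch with hysteresis to avoid flicker."""

    def __init__(self, press=0.04, release=0.07):
        self.press = press      # distance below which the pinch engages
        self.release = release  # distance above which it disengages
        self.active = False

    def update(self, thumb_tip, index_tip):
        """Feed one frame of (x, y) fingertip positions; return pinch state."""
        d = math.hypot(thumb_tip[0] - index_tip[0],
                       thumb_tip[1] - index_tip[1])
        if self.active and d > self.release:
            self.active = False
        elif not self.active and d < self.press:
            self.active = True
        return self.active

det = PinchDetector()
print(det.update((0.50, 0.50), (0.60, 0.50)))  # False: fingers apart
print(det.update((0.50, 0.50), (0.52, 0.50)))  # True: pinch engaged
print(det.update((0.50, 0.50), (0.55, 0.50)))  # True: still inside release band
print(det.update((0.50, 0.50), (0.60, 0.50)))  # False: released
```

The gap between `press` and `release` is what absorbs frame-to-frame tracking jitter.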

Accomplishments I am proud of 🏆

- Built a fully working hand-tracking game/tool in a short time.
- Created a scalable system where users can record their own maps.
- Designed a fun UI that feels polished and interactive.

What I learned 📚

- How to integrate computer vision with interactive design.
- That simple, creative projects can stand out as much as big AI systems.
- The importance of making tech accessible and enjoyable.

What’s next for HandScape 🚀

- Bringing HandScape to mobile (iOS/Android).
- Expanding the map library with more practical skills and fun mini-games.
- Allowing users to share maps with the community, like custom levels.

Built With

MediaPipe, OpenCV, Pygame