Inspiration

KinSnake was inspired by the idea of merging classic retro gaming with modern human–computer interaction — rekindling the intuitive puzzle-solving spirit we all had as kids playing Snake on early devices. Our team wanted to reimagine that nostalgic experience while introducing a natural, healthier way to play — removing the keyboard entirely and letting players control the snake using hand gestures through computer vision.

Beyond its fun gameplay, KinSnake encourages real physical engagement. The continuous hand movements required to guide the snake improve eye–hand coordination, reaction time, and spatial awareness. Repeated directional gestures subtly enhance finger dexterity and strength, while keeping users mentally sharp by combining quick decision-making with controlled motor responses. This makes KinSnake not only an innovative motion-controlled game, but also a light, health-positive activity that blends cognitive stimulation with gentle physical exercise — turning screen time into a more active and mindful experience.

How we built it

We built KinSnake during HackPSU to explore how AI, systems programming, and web development can work together. The project uses MediaPipe for real-time hand-tracking, translating gestures like pointing up, down, left, or right into snake movement. These signals are processed in Python, transmitted via WebSocket, and passed to a C-based motion controller using Windows named pipes for fast IPC.
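The gesture-to-direction step can be sketched roughly as follows. The landmark numbering follows MediaPipe's hand model (wrist is landmark 0, index fingertip is landmark 8, coordinates normalized with y growing downward); the dead-zone threshold and function name are illustrative assumptions, not the exact values we shipped:

```python
# Hypothetical sketch: map an index-finger pointing direction to a snake move.
# `wrist` and `index_tip` are (x, y) points from MediaPipe's hand landmarks,
# normalized to [0, 1] with the y-axis pointing down the image.

def gesture_to_direction(wrist, index_tip, dead_zone=0.05):
    """Return 'up' / 'down' / 'left' / 'right', or None inside the dead zone."""
    dx = index_tip[0] - wrist[0]
    dy = index_tip[1] - wrist[1]
    if max(abs(dx), abs(dy)) < dead_zone:  # hand roughly at rest: no move
        return None
    if abs(dx) > abs(dy):                  # horizontal pointing dominates
        return "right" if dx > 0 else "left"
    return "up" if dy < 0 else "down"      # image y-axis points down
```

The resulting direction string is what gets serialized over the WebSocket and forwarded to the C motion controller.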

On the frontend, a modern Next.js dashboard provides an immersive, purple-themed gaming interface with accessibility features such as high-contrast mode and screen-reader support. The 60-second challenge mode adds excitement and encourages players to beat their best score.

Throughout development, we learned how to combine low-level C programming with high-level web frameworks, integrate MediaPipe with Python backends, and synchronize real-time gesture input with frontend animation loops.

Challenges we ran into

Our biggest challenges included:

• Calibrating gesture detection to avoid false positives.
• Managing real-time latency between the camera feed, backend, and browser.
• Setting up inter-process communication between Python and C on Windows.
• Ensuring the UI stayed smooth and responsive while constantly updating frames and backend status.
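One way to tackle the first challenge can be sketched as a small debounce filter: a gesture only counts once it has been seen in several consecutive camera frames, which suppresses one-frame misdetections at the cost of a little latency. The class name and frame count below are illustrative, not the exact calibration we used:

```python
class GestureDebouncer:
    """Emit a gesture only after `required` consecutive frames agree on it.

    This filters one-frame misdetections from the hand tracker at the cost
    of a few frames of latency (e.g. 3 frames at 30 fps is roughly 100 ms).
    """

    def __init__(self, required=3):
        self.required = required
        self.candidate = None   # gesture currently being confirmed
        self.count = 0          # consecutive frames it has been seen
        self.current = None     # last confirmed gesture

    def update(self, gesture):
        if gesture == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = gesture, 1
        if self.count >= self.required and gesture != self.current:
            self.current = gesture
            return gesture      # newly confirmed gesture
        return None             # nothing new to report this frame
```

Raising `required` trades responsiveness for stability, which is exactly the calibration knob we found ourselves tuning.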

Accomplishments that we're proud of

We’re incredibly proud of building KinSnake — a complete, end-to-end motion-controlled game that bridges AI, systems programming, and modern web technologies. Within a short development window, we successfully integrated hand-gesture detection using MediaPipe with a real-time WebSocket pipeline that communicates seamlessly between Python, C, and Next.js. We also achieved low-latency gesture recognition, intuitive motion-based control, and a beautifully animated UI that balances performance and accessibility. Our biggest win was watching the first fully functional prototype respond to real-time gestures — seeing the snake move exactly as our hand did was the ultimate proof that we had combined creativity, AI, and systems logic effectively.

What we learned

KinSnake taught us how to build across multiple technology layers — from low-level C IPC mechanisms to high-level AI frameworks like MediaPipe. We learned how to synchronize real-time computer vision data with frontend rendering loops while managing latency, precision, and data integrity. It deepened our understanding of Inter-Process Communication (IPC) via Windows named pipes, FastAPI WebSocket architecture, and React state management for real-time gaming experiences. We also gained valuable experience in user experience design, particularly in accessibility, ensuring that the game is inclusive and responsive to all players.

What's next for KinSnake

Our next step is to expand KinSnake into a full gesture-based gaming platform. We plan to add:

• Multi-gesture support (fist = pause, open palm = restart)
• Leaderboard and multiplayer integration through cloud WebSocket servers
• Cross-platform compatibility for macOS and Linux
• Gesture-based menu navigation for a completely hands-free experience
• Integration with AR/VR devices for immersive motion gaming
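The planned fist and open-palm gestures could be classified with a simple heuristic on fingertip-to-wrist distances. The function name and the 0.5 threshold below are hypothetical assumptions for illustration, not part of the current build:

```python
import math

def classify_pose(wrist, fingertips, threshold=0.5):
    """Hypothetical sketch: label a hand 'fist' or 'open_palm'.

    `fingertips` holds (x, y) points for the non-thumb fingertips in
    normalized image coordinates. If every tip sits close to the wrist the
    hand is curled into a fist; if every tip is far away the palm is open.
    A mixed pose is ambiguous and returns None.
    """
    dists = [math.dist(wrist, tip) for tip in fingertips]
    if all(d < threshold for d in dists):
        return "fist"        # planned mapping: pause the game
    if all(d >= threshold for d in dists):
        return "open_palm"   # planned mapping: restart the game
    return None              # ambiguous pose: ignore it
```

Because it reuses the same landmark stream the directional gestures already consume, this would slot into the existing Python pipeline without new dependencies.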

In the long term, KinSnake will evolve into a framework for AI-powered motion-controlled games, enabling developers and students to easily build gesture-interactive experiences using the same Python-C-Next.js architecture.
