## 🎮 Project Overview
We designed two playful, movement-based games to help people with Parkinson’s disease improve their motor control and balance:
- Naruto Kunai Dodger – dodge incoming kunai, track collisions, and encourage steadiness.
- Wii Punch Trainer – punch virtual targets in a timed sequence to build strength and coordination: punch them when both targets glow green, hold off when they turn red, and wait for the next round when they display “0.”
## 🌟 Inspiration
- Naruto Kunai Dodger was inspired by the fast-paced, reflex-driven ninja training scenes in Naruto. We thought: “What if dodging kunai could strengthen fine motor control?”
- Wii Punch Trainer grew out of our love for the classic Wii Boxing mini-games. We realized that swapping pads for targets and focusing on precise punches could translate into meaningful hand-eye and upper-body exercise for Parkinson’s patients.
- My grandfather’s brother suffered from Parkinson’s disease, and we watched firsthand how this horrific illness slowly took him away.
## 🛠️ How We Built Them
- Platform & Language
  - Lens Studio for Snap Spectacles, with TypeScript for all game logic, asset management, and collision math.
- Core Mechanics
  - Naruto Kunai Dodger
    - Spawn 3D kunai “projectiles” in front of the player’s viewport on Spectacles.
    - Track real-time head and hand movements to detect “hits,” updating a collision counter.
  - Wii Punch Trainer (Spectacles Edition)
    - Render dual floating targets in AR space; each round the targets glow green (punch!), turn red (hold), or display “0” (wait).
    - Use TypeScript routines to measure punch velocity and accuracy only when both targets are green; red-target hits and “0” rounds are filtered out.
- Data & Feedback
  - Dynamic HUD overlays in the Spectacles view showing collisions and round status.
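The hit test behind the collision counter can be sketched in plain TypeScript. This is a minimal illustration, not our shipped code: `Vec3`, `countHits`, and the `HIT_RADIUS` value are our own assumed names and constants, and in the real game the positions come from Spectacles head and hand tracking each frame.

```typescript
type Vec3 = { x: number; y: number; z: number };

const HIT_RADIUS = 0.25; // metres; an assumed, hand-tuned value

function distance(a: Vec3, b: Vec3): number {
  const dx = a.x - b.x;
  const dy = a.y - b.y;
  const dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Count how many kunai are within HIT_RADIUS of the player's head this frame.
function countHits(headPos: Vec3, kunai: Vec3[]): number {
  return kunai.filter((k) => distance(headPos, k) <= HIT_RADIUS).length;
}
```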
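The Wii Punch Trainer round rules reduce to a small predicate: a punch counts only when both targets are green, and everything else is filtered out. The sketch below uses our own illustrative names (`TargetState`, `shouldScorePunch`) and an assumed accuracy formula, not a Lens Studio API.

```typescript
type TargetState = "green" | "red" | "zero";

// A punch only scores when BOTH targets are green; red-target hits and
// "0" (wait) rounds are filtered out, matching the game rules above.
function shouldScorePunch(left: TargetState, right: TargetState): boolean {
  return left === "green" && right === "green";
}

// Assumed accuracy metric: scored punches over total green rounds.
function accuracy(scoredPunches: number, greenRounds: number): number {
  return greenRounds === 0 ? 0 : scoredPunches / greenRounds;
}
```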
## ⚠️ Challenges
- Tracking Accuracy vs. Responsiveness
  - Early Spectacles prototypes mis-detected head and hand movements, causing false “hits” or misses. We tuned our collision filters and gesture thresholds in TypeScript to strike the right balance.
- AR Latency & Jitter
  - Environmental lighting and surface-detection variability introduced frame drops and jitter in kunai spawning and target rendering.
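The filtering we tuned for these challenges can be sketched with two standard techniques: an exponential moving average to damp per-frame tracking jitter, and enter/exit hysteresis so borderline distances do not flicker between “hit” and “miss.” All names and constants below are illustrative assumptions, not our shipped values.

```typescript
// Exponential moving average: blends each raw tracking sample toward the
// previous smoothed value, damping frame-to-frame jitter (alpha assumed).
function smooth(prev: number, raw: number, alpha: number = 0.3): number {
  return prev + alpha * (raw - prev);
}

// Hysteresis thresholds (metres; assumed values): a "hit" begins only
// below ENTER and clears only above EXIT, so noisy distances near the
// boundary cannot rapidly toggle the hit state.
const ENTER = 0.2;
const EXIT = 0.3;

function updateHit(wasHit: boolean, dist: number): boolean {
  if (!wasHit && dist < ENTER) return true;
  if (wasHit && dist > EXIT) return false;
  return wasHit;
}
```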
## 🏆 Accomplishments We’re Proud Of
- Harnessed Snap AI for Impact: Built immersive AR games with Snap Spectacles and AI-driven features, making rehabilitation engaging and effective.
- On-Time Delivery: Met every milestone and delivered both products on time.
## 🎓 What We Learned
- Teamwork & Collaboration: Coordinating across roles—from designers to engineers—to bring these games from concept to completion.
- Snap Lens Studio Mastery: Leveraging Lens Studio’s AR capabilities and TypeScript to build immersive, reliable experiences on Spectacles.
## Built With
- fetchai
- flask
- html5
- lenstudio
- python
- snap
- spectacles
- typescript