Inspiration

Motor rehabilitation depends on repetitive movement practice, yet research consistently shows that patients struggle with motivation and long-term adherence, especially outside clinical settings. Traditional exercises often lack immediate feedback, emotional engagement, or meaningful context, which can slow recovery. SineWave was inspired by our Biomedical Engineering (BME) design course, where we were asked to redesign a game for individuals with physical disabilities. While many approaches focus on accommodating impairments, we wanted to explore how interactive systems could actively promote rehabilitation. Research in motor rehabilitation shows that rhythm and music—often modeled as continuous signals such as sine waves—improve motor timing, coordination, and engagement. SineWave builds on these principles by transforming hand movement practice into a smooth, musical, and expressive experience using only a webcam.

What It Does

SineWave is a rhythm-based hand gesture game where users perform hand movements in time with music represented as continuous, wave-like cues rather than discrete button presses. Players follow visual sine wave prompts and match their hand gestures to the rhythm, receiving immediate visual and auditory feedback. Using real-time computer vision, SineWave tracks hand poses and movement quality, focusing on rehabilitation-relevant characteristics such as movement smoothness, timing, pose stability, and tremor-like variability. This turns repetitive motor practice into an engaging activity while generating readable data that can be used to monitor rehabilitation progress over time.
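The continuous cue-and-match idea can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the function names, the tolerance band, and the normalized-coordinate convention are all assumptions for the example.

```python
import math

def cue_height(t, freq_hz=0.5, amplitude=0.4, center=0.5):
    """Target vertical hand position (normalized 0..1) at time t seconds.

    The cue oscillates sinusoidally, so the player traces a smooth wave
    instead of hitting discrete beats.
    """
    return center + amplitude * math.sin(2 * math.pi * freq_hz * t)

def match_score(hand_y, t, tolerance=0.1):
    """Score in [0, 1]: 1.0 when the hand sits on the wave, falling
    linearly to 0 once the error leaves the tolerance band."""
    error = abs(hand_y - cue_height(t))
    return max(0.0, 1.0 - error / tolerance)
```

Sampling `match_score` every frame yields both the immediate feedback shown to the player and a per-frame error trace that can be logged for later analysis.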

How We Built It

  • Hand Tracking: MediaPipe is used for real-time hand landmark detection, enabling precise tracking of joint positions and motion.
  • Gesture Classification: A PyTorch-based neural network classifies hand poses and movement patterns.
  • Live Camera Feed: OpenCV powers the real-time camera interface and visual overlays.
  • Game Logic: Python handles pose smoothing, rhythm alignment, and core game mechanics.
  • Movement Analytics: Time-series movement data is recorded to analyze performance trends across sessions.
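The pose-smoothing step mentioned above can be approximated with an exponential moving average over landmark coordinates. This is a hedged sketch of one common approach, not SineWave's actual implementation; the class name, `alpha` default, and `(x, y)` tuple format are assumptions.

```python
class LandmarkSmoother:
    """Exponential moving average over (x, y) landmark coordinates.

    An alpha near 1.0 tracks fast motion closely; an alpha near 0.0
    damps per-frame jitter at the cost of added latency.
    """

    def __init__(self, alpha=0.4):
        self.alpha = alpha
        self._state = None  # last smoothed pose, or None before first frame

    def update(self, landmarks):
        # landmarks: list of (x, y) tuples in normalized image coordinates
        if self._state is None:
            self._state = [tuple(p) for p in landmarks]
        else:
            self._state = [
                (self.alpha * x + (1 - self.alpha) * sx,
                 self.alpha * y + (1 - self.alpha) * sy)
                for (x, y), (sx, sy) in zip(landmarks, self._state)
            ]
        return self._state
```

The same filter state can be reused across frames of the live OpenCV feed, one smoother per tracked hand.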

Challenges We Ran Into

Accurate real-time gesture recognition was challenging due to natural variability in hand positioning, lighting conditions, and movement consistency. Small variations in execution or tremor-like motion could impact classification accuracy, requiring normalization and smoothing of pose data. Another challenge was balancing engagement with rehabilitation value. The experience needed to feel fluid and musical while still capturing meaningful movement data aligned with rehabilitation research.
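One standard way to normalize pose data against hand size and camera distance, as the paragraph above describes, is to express landmarks relative to the wrist and divide by a reference bone length. The sketch below assumes MediaPipe's 21-landmark layout (index 0 is the wrist, index 9 the middle-finger MCP joint); the function name and defaults are illustrative.

```python
import math

def normalize_pose(landmarks, wrist_idx=0, ref_idx=9):
    """Make a 2-D hand pose translation- and scale-invariant.

    Subtracts the wrist position, then divides by the wrist-to-middle-MCP
    distance, so hand size and distance from the camera stop affecting
    the classifier's input.
    """
    wx, wy = landmarks[wrist_idx]
    rx, ry = landmarks[ref_idx]
    scale = math.hypot(rx - wx, ry - wy) or 1.0  # guard against zero length
    return [((x - wx) / scale, (y - wy) / scale) for x, y in landmarks]
```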

Accomplishments We’re Proud Of

  • Successfully integrating MediaPipe, PyTorch, and OpenCV into a real-time interactive system
  • Designing a rhythm-based experience centered on continuous movement, not discrete inputs
  • Capturing clinically relevant movement metrics rather than simple game scores
  • Creating an accessible system that works with only a webcam
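A simple example of a clinically oriented metric in this spirit is a jerk-based smoothness measure, computed from a recorded position trace with finite differences. This is a generic sketch under assumed names and units, not the project's exact analytics code.

```python
def mean_squared_jerk(positions, dt):
    """Mean squared third finite difference of a 1-D position trace.

    Lower values indicate smoother movement; tremor-like oscillation
    sharply inflates the jerk, so the trend across sessions can serve
    as a rough rehabilitation-progress signal.
    """
    if len(positions) < 4:
        raise ValueError("need at least 4 samples")
    jerks = [
        (positions[i + 3] - 3 * positions[i + 2]
         + 3 * positions[i + 1] - positions[i]) / dt ** 3
        for i in range(len(positions) - 3)
    ]
    return sum(j * j for j in jerks) / len(jerks)
```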

What We Learned

  • How to integrate MediaPipe for real-time hand tracking and gesture recognition.
  • Leveraging PyTorch to train and deploy a neural network for robust gesture classification.
  • Using OpenCV to create a seamless live camera feed for real-time interaction.
  • Developing a data pipeline to collect, process, and analyze movement data for rehabilitation insights.
  • Debugging and optimizing the interplay between multiple technologies to ensure smooth performance.

What’s Next for SineWave

  • Live Analytics: Real-time feedback on movement quality and joint-level difficulty
  • Additional Instruments: Expanding musical mappings to support a wider range of movements
  • Progress Modeling: Training models to estimate rehabilitation progress using movement trends
  • Clinician Dashboards: Clear, readable data views for doctors, therapists, and supervisors

Built With

  • MediaPipe
  • OpenCV
  • Python
  • PyTorch
