Inspiration

We wanted to blur the lines between movement and music—giving dancers, therapists, fitness coaches and curious minds a playful tool to “hear” their bodies in motion. By turning everyday gestures into soundscapes, Somatic AI sparks new insights into posture, flow and coordination.

What it does

  • Uses your webcam to detect key joints (hips, shoulders, head, legs) in real time via MediaPipe Pose
  • Maps joint angles, velocities or distances to musical parameters (pitch, volume, filters) with Tone.js
  • Displays live graphs of how each joint moves relative to others
  • Records sessions locally so you can review, export or layer your movement–music loops
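The joint-angle-to-pitch mapping described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the landmark objects stand in for MediaPipe Pose output, and `angleToFrequency` is a hypothetical mapping whose result would drive a Tone.js synth.

```javascript
// Compute the angle (degrees) at joint B formed by points A-B-C,
// e.g. an elbow angle from shoulder, elbow, and wrist landmarks.
function jointAngle(a, b, c) {
  const ab = { x: a.x - b.x, y: a.y - b.y };
  const cb = { x: c.x - b.x, y: c.y - b.y };
  const dot = ab.x * cb.x + ab.y * cb.y;
  const mag = Math.hypot(ab.x, ab.y) * Math.hypot(cb.x, cb.y);
  const cos = Math.min(1, Math.max(-1, dot / mag)); // guard rounding error
  return (Math.acos(cos) * 180) / Math.PI;
}

// Map an angle in [0, 180] onto a frequency range (here two octaves
// above 110 Hz). Exponential interpolation gives perceptually even
// pitch steps; in the app this value would feed a Tone.js synth.
function angleToFrequency(angleDeg, fMin = 110, fMax = 440) {
  const t = Math.min(1, Math.max(0, angleDeg / 180));
  return fMin * Math.pow(fMax / fMin, t);
}
```

With this mapping, a fully bent joint (near 0°) sits at `fMin` and a fully extended one (180°) at `fMax`; a right angle lands exactly one octave up, at 220 Hz.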

How we built it

  • Front end in vanilla JavaScript + modern ES modules
  • MediaPipe’s Pose model for ultra-fast, on-device joint detection
  • Tone.js synthesizers and effects to generate audio directly in the browser
  • LocalStorage and IndexedDB for privacy-first recording and data persistence
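The privacy-first recording layer can be sketched as a small in-memory buffer of timestamped pose frames that serializes to JSON for local storage. The names here are illustrative, not the app's actual API: IndexedDB can store the structured object directly, while localStorage needs the string form.

```javascript
// Hypothetical sketch of a local session recorder. Frames stay in the
// browser; the serialized blob would be written to IndexedDB (or
// localStorage for small sessions), never to a server.
class SessionRecorder {
  constructor() {
    this.frames = [];
  }

  // Store one pose sample, e.g. the landmark array MediaPipe emits
  // on each video frame.
  addFrame(timestampMs, landmarks) {
    this.frames.push({ t: timestampMs, landmarks });
  }

  // Serialize for persistence or export.
  toJSON() {
    return JSON.stringify({ version: 1, frames: this.frames });
  }

  // Rebuild a session from stored JSON for review or layering.
  static fromJSON(json) {
    const rec = new SessionRecorder();
    rec.frames = JSON.parse(json).frames;
    return rec;
  }
}
```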

Challenges we ran into

  • Balancing pose-model smoothing vs. snappy responsiveness to keep audio musical
  • Designing intuitive mappings so users can predict how their moves affect sound
  • Managing audio latency across different browsers and hardware
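The smoothing-versus-responsiveness tension above comes down to a single coefficient in the simplest case. A one-pole exponential moving average like the sketch below (our own illustration, not necessarily the filter the project uses) makes the trade-off concrete: alpha near 1 tracks fast but lets landmark jitter through to the audio, alpha near 0 is smooth but audibly laggy.

```javascript
// One-pole exponential smoother for a single landmark coordinate.
// alpha close to 1 -> responsive but jittery control signal;
// alpha close to 0 -> smooth but laggy. (Illustrative sketch only.)
class ExpSmoother {
  constructor(alpha) {
    this.alpha = alpha;
    this.value = null;
  }

  update(x) {
    this.value =
      this.value === null ? x : this.alpha * x + (1 - this.alpha) * this.value;
    return this.value;
  }
}
```

More adaptive filters (e.g. ones that raise alpha when the joint moves quickly and lower it when it is nearly still) are a common refinement of this idea, trading a little complexity for both stability at rest and snappiness in motion.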

Accomplishments that we're proud of

  • Real-time, sub-100ms motion-to-sound feedback with 90%+ joint-tracking accuracy
  • Adoption by movement specialists for posture analysis, teaching demos and rehab exercises
  • A completely client-side architecture—no servers, no data leaks, instant startup

What we learned

  • Even small tweaks in smoothing algorithms drastically change the musical feel
  • Clear visual feedback (graphs + overlays) is critical for user confidence and experimentation
  • Respecting privacy by keeping all processing and storage local builds trust

What's next for Somatic AI

  • Customizable sound-mapping presets for dance styles, physical therapy and fitness routines
  • ML-driven movement pattern recognition to trigger complex sequences or adaptive scores
  • Mobile-optimized and WebAR support for untethered, immersive motion-music jams
  • Collaborative sessions so multiple movers can jam and compose together in real time
