Inspiration

After turning 25, I realized I wanted to develop a new hobby: something creative that would also get my body moving. I chose dance.
However, even in beginner classes, I struggled to keep up. The instructor would move on before I could fully process the combination, and subtle timing mistakes (being slightly late on a beat, missing a transition) quickly snowballed.
After class, I tried to practice on my own using recorded videos. That’s when a new frustration appeared: I spent more than half of my practice time just trying to align myself with the video (pausing, rewinding, guessing whether I was early or late) rather than actually improving my movement. I knew something was off, but I couldn’t tell what or when.
As a software engineer, this felt like a solvable problem. Large multimodal models can already understand video, motion, and timing—why not use them as a personal dance coach? Instead of manually syncing frames and relying on subjective judgment, I wanted an app that could objectively analyze movement, detect timing errors at the millisecond level, and explain exactly how my performance differed from the teacher’s.
That motivation led to AI Dance Coach: a tool built to turn frustrating solo practice into precise, actionable feedback, so learning dance becomes about improving skill, not fighting the video player.

What it does

AI Dance Coach analyzes dance performances with millisecond precision:

  1. Auto-sync by music
    Upload teacher and student videos—Gemini automatically syncs them by detecting music beats and movement patterns. No manual alignment needed.
  2. Motion analysis
    Extracts frames with MediaPipe pose detection, calculates joint velocities and accelerations, then detects movement events (impacts, peaks, direction changes).
  3. Intelligent comparison
    Matches corresponding events between performances, identifies timing offsets, missed movements, and postural errors.
  4. Visual timeline
    Color-coded sync lines show timing accuracy—green (<100ms), yellow (100–250ms), red (>250ms). Click any moment for side-by-side skeleton comparison with problem joints highlighted.

How we built it

Mostly vibe coding in Google AI Studio, with light debugging in an IDE.

Challenges we ran into

One of the biggest challenges was synchronizing videos that start at different times. In real dance classes, recordings rarely begin together, and frame-by-frame comparison breaks easily when timing is off. Another challenge was handling multi-person videos. Dance classes often include many students, so the system must reliably track only the selected teacher and student without identity switching. We also had to balance technical accuracy with human usability. Raw pose data and numeric errors are hard for learners to understand, so we needed a way to translate motion differences into meaningful, coach-like feedback.
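For the synchronization challenge, the app relies on Gemini's audio understanding, but the underlying idea can be shown with a classical baseline: cross-correlate the two recordings' onset-strength envelopes and take the lag at the correlation peak. The function name and envelope inputs here are illustrative, not the project's actual API.

```python
import numpy as np

def estimate_offset_ms(env_a: np.ndarray, env_b: np.ndarray, hop_ms: float) -> float:
    """Estimate the time offset between two recordings from their
    onset-strength envelopes (one value every `hop_ms` milliseconds).
    A positive result means the music in A appears later within its
    file than in B, so trim that much from the start of A to align."""
    a = env_a - env_a.mean()
    b = env_b - env_b.mean()
    corr = np.correlate(a, b, mode="full")
    # Index of the correlation peak, expressed relative to zero lag.
    lag = int(np.argmax(corr)) - (len(b) - 1)
    return lag * hop_ms
```

A beat-aware model handles tempo drift and noisy class audio better than raw cross-correlation, but the alignment target is the same: find the lag that best overlays the two music tracks.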

Accomplishments that we're proud of

We built an end-to-end dance analysis system that aligns videos by music instead of timestamps, making comparisons reliable even when recordings are imperfect. The app supports explicit person selection in crowded class videos, which reflects real-world usage rather than idealized demo conditions. Instead of frame-level pose comparison, we implemented motion-based analysis, providing higher-level insights such as rhythm alignment, coordination, and overall movement quality.

What we learned

We learned that timing matters more than precision in human movement analysis. Perfect pose matching is less valuable than aligning motion to music and rhythm. We also learned that good AI products need strong constraints. Explicit rules for person selection, synchronization, and feedback generation helped prevent unstable or misleading results. From a tooling perspective, we learned that Google AI Studio enables rapid iteration when exploring ideas, especially for combining reasoning, audio understanding, and user-facing explanations.

What's next for AI Dance Coach

Next, we want to expand the system to support longer practice sessions with automatic segmentation, so users can review specific combinations or choreography sections. We plan to improve personalized feedback, adapting suggestions based on a dancer’s history and recurring patterns rather than treating each session independently. Another direction is real-time or near-real-time feedback, enabling users to receive guidance immediately after practice instead of only post-session analysis.

Built With

  • gemini3
  • google-ai-studio