Inspiration 🎵💃

We were inspired by the growing intersection of music, dance, and technology. Dance is inherently rhythmic, visual, and expressive, and we wanted to create a tool that could help anyone learn choreography interactively. Traditional dance tutorials often rely on watching videos or attending classes, which can feel static or intimidating. We imagined a system that could analyze music, guide movements, and provide instant feedback, combining the fun of gaming 🎮 with the creativity of dance ✨. Our goal was to make learning choreography more accessible, engaging, and personalized, allowing users to match their moves to the rhythm of any song.

What it does 🕺

BeatJam is an interactive dance application that allows users to follow along with choreographed moves in real time. Users can upload a song or provide a YouTube link, and the system analyzes the music to identify beats, intensity, and song structure. For each segment of the song, BeatJam selects a dance move that matches the music's energy and vibe.

On the main interface, users see:

  • A 3D avatar performing the current move 💃
  • A live webcam feed with MediaPipe pose landmarks overlaid 🎥
  • A similarity percentage showing how closely their pose matches the expected choreography ✅

The app provides real-time feedback through score ratings (OK, GOOD, EXCELLENT) and calculates an overall score at the end of the song. Users can preview upcoming moves, monitor timing with beat detection, and earn badges 🏅 for high performance.

How we built it 🛠️

Building BeatJam involved combining music analysis, pose estimation, frontend development, and real-time scoring into a cohesive system.

  1. Music Analysis 🎶
    Using Librosa, we extracted musical features such as BPM, intensity, energy, and segment classifications (verse, chorus). This allowed us to align moves with natural transitions in songs, making choreography feel musically intuitive.

  2. Capturing Dance Moves 🤸
    We recorded team members performing dance moves and used MediaPipe Pose to extract 33 3D landmarks per frame. Key joints included shoulders, elbows, wrists, hips, knees, and ankles. Each move was exported as JSON, creating a structured dataset with metadata like movement style, energy, and pose sequences.
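
The per-move JSON can be sketched like this (the field names are illustrative, not our exact schema); each frame holds the 33 MediaPipe Pose landmarks as [x, y, z, visibility] values:

```python
import json

def export_move(name: str, style: str, energy: float, frames: list) -> str:
    """Serialize one captured dance move to JSON.

    `frames` is a list of frames; each frame is the 33 MediaPipe Pose
    landmarks as [x, y, z, visibility] quadruples.
    """
    for f in frames:
        assert len(f) == 33, "MediaPipe Pose emits exactly 33 landmarks per frame"
    move = {
        "name": name,
        "style": style,    # movement style label, e.g. "groove"
        "energy": energy,  # 0.0-1.0, matched later against song intensity
        "frames": [[list(map(float, lm)) for lm in f] for f in frames],
    }
    return json.dumps(move)

# Example: two frames of dummy landmarks.
dummy = [[[0.5, 0.5, 0.0, 1.0]] * 33 for _ in range(2)]
doc = export_move("arm_wave", "groove", 0.6, dummy)
```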

  3. Organizing the Moves 📂
    Moves were sorted by vibe and intensity, ensuring transitions felt smooth and consistent across similar song sections.
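
Selection by vibe and intensity can be sketched as a nearest-energy lookup within the matching vibe (the move list and fields here are illustrative):

```python
def pick_move(moves, vibe, segment_energy):
    """Pick the move whose energy is closest to the segment's, preferring
    moves tagged with the segment's vibe so transitions stay consistent."""
    candidates = [m for m in moves if m["vibe"] == vibe] or moves
    return min(candidates, key=lambda m: abs(m["energy"] - segment_energy))

moves = [
    {"name": "sway",      "vibe": "chill",  "energy": 0.3},
    {"name": "arm_wave",  "vibe": "groove", "energy": 0.5},
    {"name": "jump_kick", "vibe": "hype",   "energy": 0.9},
]
chorus_move = pick_move(moves, "hype", 0.8)   # a high-energy chorus segment
```

Falling back to all moves when a vibe has no entries keeps the choreography from stalling on unusual song sections.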

  4. Frontend Development 💻
    Built with React and TypeScript, the interface displays song details, music analysis, the current and next dance moves, and a 3D avatar. A webcam feed shows the user's real-time pose with a similarity score for instant feedback.

  5. 3D Avatar Visualization 🕴️
    The avatar mirrors expected dance moves using MediaPipe landmarks. A smaller preview shows upcoming moves to improve anticipation.

  6. Real-Time Scoring 🏆
    The scoring system evaluates joint alignment, movement direction, and presence in frame, rating each move OK, GOOD, or EXCELLENT against tuned thresholds; high ratings earn badges. At the end of the song, it computes an overall performance score.
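
The joint-alignment part can be sketched as a visibility-weighted mean distance over key joints, mapped onto the OK/GOOD/EXCELLENT thresholds (the threshold values below are placeholders, not our tuned ones):

```python
import math

# Thresholds on mean joint distance (placeholder values, smallest first).
RATINGS = [(0.05, "EXCELLENT"), (0.12, "GOOD"), (0.25, "OK")]

def score_pose(user, target, min_visibility=0.5):
    """Compare two landmark lists ([x, y, visibility] per joint).

    Joints that are occluded or out of frame (low visibility) are skipped,
    so missing landmarks don't tank the score."""
    dists = []
    for (ux, uy, uv), (tx, ty, tv) in zip(user, target):
        if uv < min_visibility or tv < min_visibility:
            continue  # gracefully ignore occluded joints
        dists.append(math.hypot(ux - tx, uy - ty))
    if not dists:
        return None, "NO POSE"  # user not in frame at all
    mean = sum(dists) / len(dists)
    for limit, label in RATINGS:
        if mean <= limit:
            return mean, label
    return mean, "MISS"

target = [[0.5, 0.5, 1.0], [0.6, 0.4, 1.0]]
user   = [[0.52, 0.5, 1.0], [0.6, 0.42, 0.9]]
mean, rating = score_pose(user, target)   # small offsets -> "EXCELLENT"
```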

  7. Integration 🔗
    Backend APIs, Librosa analysis, frontend rendering, avatar animation, and real-time pose comparison were synchronized for a responsive and immersive dance experience.
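
Much of the synchronization reduces to finding, at each render tick, which beat interval the playback time falls in; a sketch using binary search (the function name is ours):

```python
import bisect

def current_beat(beat_times, playback_s):
    """Index of the beat interval containing playback_s (-1 before the first
    beat), so the frontend can swap moves exactly on beat boundaries."""
    return bisect.bisect_right(beat_times, playback_s) - 1

beats = [0.0, 0.5, 1.0, 1.5, 2.0]   # beat times from the music analysis
idx = current_beat(beats, 1.25)     # between the beats at 1.0s and 1.5s
```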

Challenges we ran into ⚡

  • Pose visibility: Users sometimes moved out of frame or occluded joints, which produced distorted 3D renders and unreliable landmark data.
  • Synchronization: Aligning moves with variations in tempo and tone across a song required heavy computation and precise timing.
  • Accurate scoring: Matching the user's pose to the model was incredibly difficult; we wanted users to be able to stand anywhere in the camera frame and still score well as long as they attempted the correct motions.
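
One way to get that position independence is to normalize landmarks before comparing: translate to the hip midpoint and scale by torso length, so the same motion scores identically anywhere in frame. A sketch, using MediaPipe Pose's fixed landmark indices for the shoulders and hips:

```python
# MediaPipe Pose landmark indices (fixed by the library).
LEFT_SHOULDER, RIGHT_SHOULDER, LEFT_HIP, RIGHT_HIP = 11, 12, 23, 24

def normalize(landmarks):
    """Translate landmarks ([x, y] per joint) to the hip midpoint and scale
    by shoulder-to-hip distance, removing position and body-size effects."""
    hip_x = (landmarks[LEFT_HIP][0] + landmarks[RIGHT_HIP][0]) / 2
    hip_y = (landmarks[LEFT_HIP][1] + landmarks[RIGHT_HIP][1]) / 2
    sh_x = (landmarks[LEFT_SHOULDER][0] + landmarks[RIGHT_SHOULDER][0]) / 2
    sh_y = (landmarks[LEFT_SHOULDER][1] + landmarks[RIGHT_SHOULDER][1]) / 2
    torso = ((sh_x - hip_x) ** 2 + (sh_y - hip_y) ** 2) ** 0.5 or 1.0
    return [[(x - hip_x) / torso, (y - hip_y) / torso] for x, y in landmarks]
```

After normalization, a dancer in the corner of the frame produces the same coordinates as one in the center, so the distance-based score only reflects the motion itself.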

Accomplishments we're proud of 🌟

  • Created a fully interactive dance system combining music analysis, avatar animation, and pose scoring
  • Implemented a scoring algorithm that handles missing or occluded joints gracefully
  • Built an intuitive frontend with live webcam overlay, avatar visualization, and badges
  • Designed a system that works with any audio source, including YouTube, and adapts choreography automatically

What we learned 📚

  • Pose estimation is nuanced: small joint detection errors can affect scoring
  • Music analysis requires flexibility: songs vary widely in tempo, energy, and structure
  • Immediate feedback, visual references, and gamification enhance engagement
  • Integration of frontend, backend, real-time computation, and 3D rendering is challenging but rewarding

What's next for BeatJam 🚀

  • Improve scoring using temporal motion patterns for more accurate feedback
  • Add social features like leaderboards, challenges, and collaborative sessions
  • Incorporate advanced avatar customization and AR overlays for an even more immersive experience
