Inspiration 🎵🕺
We were inspired by the growing intersection of music, dance, and technology. Dance is inherently rhythmic, visual, and expressive, and we wanted to create a tool that could help anyone learn choreography interactively. Traditional dance tutorials often rely on watching videos or attending classes, which can feel static or intimidating. We imagined a system that could analyze music, guide movements, and provide instant feedback, combining the fun of gaming 🎮 with the creativity of dance ✨. Our goal was to make learning choreography more accessible, engaging, and personalized, allowing users to match their moves to the rhythm of any song.
What it does 🕺
BeatJam is an interactive dance application that allows users to follow along with choreographed moves in real time. Users can upload a song or provide a YouTube link, and the system analyzes the music to identify beats, intensity, and song structure. For each segment of the song, BeatJam selects a dance move that matches the music's energy and vibe.
On the main interface, users see:
- A 3D avatar performing the current move 🕺
- A live webcam feed with MediaPipe pose landmarks overlaid 🎥
- A similarity percentage showing how closely their pose matches the expected choreography ✅
The app provides real-time feedback through score ratings (OK, GOOD, EXCELLENT) and calculates an overall score at the end of the song. Users can preview upcoming moves, monitor timing with beat detection, and earn badges 🏅 for high performance.
How we built it 🛠️
Building BeatJam involved combining music analysis, pose estimation, frontend development, and real-time scoring into a cohesive system.
Music Analysis 🎶
Using Librosa, we extracted musical features such as BPM, intensity, energy, and segment classifications (verse, chorus). This allowed us to align moves with natural transitions in songs, making choreography feel musically intuitive.
Capturing Dance Moves 🤸
We recorded team members performing dance moves and used MediaPipe Pose to extract 33 3D landmarks per frame. Key joints included shoulders, elbows, wrists, hips, knees, and ankles. Each move was exported as JSON, creating a structured dataset with metadata like movement style, energy, and pose sequences.
Organizing the Moves 📁
Moves were sorted by vibe and intensity, ensuring transitions felt smooth and consistent across similar song sections.
Frontend Development 💻
Built with React and TypeScript, the interface displays song details, music analysis, the current and next dance moves, and a 3D avatar. A webcam feed shows the user's real-time pose with a similarity score for instant feedback.
3D Avatar Visualization 🕴️
The avatar mirrors expected dance moves using MediaPipe landmarks. A smaller preview shows upcoming moves to improve anticipation.
Real-Time Scoring 📊
The scoring system evaluates joint alignment, movement direction, and presence in frame. Users receive badges for OK, GOOD, or EXCELLENT thresholds. At the end, the system computes an overall performance score.
Integration 🔗
Backend APIs, Librosa analysis, frontend rendering, avatar animation, and real-time pose comparison were synchronized for a responsive and immersive dance experience.
Challenges we ran into ⚡
- Pose visibility: Users sometimes moved out of frame or occluded joints, which produced distorted 3D renders.
- Synchronization: Aligning moves to variations and tones in the music required heavy computation and precise timing.
- Accurate scoring: Matching the user's pose to the model was surprisingly difficult; we wanted users to be able to stand anywhere in the camera frame and still score well as long as they attempted the correct motions.
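One way to get that position independence is to center each pose on the hip midpoint and scale by torso length before comparing joints. A minimal sketch, assuming MediaPipe's 33-landmark indexing (hips at 23/24, shoulders at 11/12); the scoring curve and badge thresholds are illustrative, not BeatJam's exact values:

```python
# Sketch of a translation- and scale-invariant pose similarity score.
# Poses are (33, 3) arrays of MediaPipe landmark coordinates.
import numpy as np

def normalize(pose: np.ndarray) -> np.ndarray:
    """Center on the hip midpoint and scale by torso length, so the user
    can stand anywhere in frame, at any distance from the camera."""
    hips = (pose[23] + pose[24]) / 2
    shoulders = (pose[11] + pose[12]) / 2
    torso = np.linalg.norm(shoulders - hips)
    return (pose - hips) / max(torso, 1e-6)

def similarity(user: np.ndarray, target: np.ndarray) -> float:
    """Return a 0-100 score from the mean per-joint distance."""
    d = np.linalg.norm(normalize(user) - normalize(target), axis=1).mean()
    return float(100 * np.exp(-d))  # 100 for a perfect match, decaying with error

def rating(score: float) -> str:
    # Illustrative badge thresholds.
    if score >= 85:
        return "EXCELLENT"
    if score >= 70:
        return "GOOD"
    return "OK"
```

Because both poses are normalized the same way, a user standing off-center or farther from the camera is not penalized: an identical pose shifted by any constant offset still scores 100.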
Accomplishments we're proud of 🏆
- Created a fully interactive dance system combining music analysis, avatar animation, and pose scoring
- Implemented a scoring algorithm that handles missing or occluded joints gracefully
- Built an intuitive frontend with live webcam overlay, avatar visualization, and badges
- Designed a system that works with any audio source, including YouTube, and adapts choreography automatically
What we learned 📚
- Pose estimation is nuanced: small joint detection errors can affect scoring
- Music analysis requires flexibility: songs vary widely in tempo, energy, and structure
- Immediate feedback, visual references, and gamification enhance engagement
- Integration of frontend, backend, real-time computation, and 3D rendering is challenging but rewarding
What's next for BeatJam 🚀
- Improve scoring using temporal motion patterns for more accurate feedback
- Add social features like leaderboards, challenges, and collaborative sessions
- Incorporate advanced avatar customization and AR overlays for an even more immersive experience