Inspiration

Staying on time is one of the most important aspects of dancing. New dancers often struggle to follow the beat of a song, particularly a waltz, which is in 3/4 time rather than the more common 4/4 time.

What it does

DanceBeat uses a beat-detection ML model to find the beats (and downbeats) of a song. The mobile app estimates the tempo (in BPM) at which you are dancing from accelerometer data. Audio feedback on your dancing is delivered through Sony's LinkBuds, AR audio earbuds that use an open-ring driver for transparency, so you can still hear ambient sound.
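To make the feedback concrete, here is a minimal sketch of the core comparison: given the song's beat times from the beat-detection model and the dancer's step times estimated from the accelerometer, it computes both tempos and how far each step lands from the nearest beat. The function names, the 70 ms tolerance, and the example numbers are illustrative assumptions, not the app's actual code.

```python
# Illustrative sketch of the step-vs-beat comparison; names, the 70 ms
# tolerance, and the example data are assumptions, not the app's code.

def tempo_bpm(event_times):
    """Estimate tempo in BPM from a list of event timestamps (seconds)."""
    if len(event_times) < 2:
        return 0.0
    intervals = [b - a for a, b in zip(event_times, event_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

def off_beat_errors(step_times, beat_times):
    """Distance (in seconds) from each step to the nearest song beat."""
    return [min(abs(step - beat) for beat in beat_times) for step in step_times]

# Example: a 3/4 waltz at 90 BPM has a beat every ~0.667 s.
beats = [i * 60.0 / 90.0 for i in range(12)]
steps = [0.02, 0.70, 1.31, 2.12, 2.66, 3.38]  # hypothetical dancer steps

print(f"song tempo:  {tempo_bpm(beats):.1f} BPM")
print(f"dance tempo: {tempo_bpm(steps):.1f} BPM")
off = [e for e in off_beat_errors(steps, beats) if e > 0.07]  # > 70 ms off the beat
print(f"{len(off)} of {len(steps)} steps were noticeably off the beat")
```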

How we built it

Python/FastAPI, TensorFlow/BeatNet, Kotlin/Android/Jetpack Compose, Sony LinkBuds
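The backend's shape is simple: a FastAPI service wraps BeatNet and returns beat and downbeat times to the phone. Below is a minimal sketch of that idea, assuming BeatNet's offline interface as described in its README; the `/beats` route name, the temp-file handling, and the interpretation of the output rows are our illustrative assumptions rather than a guaranteed match for the deployed server.

```python
import os
import tempfile

from fastapi import FastAPI, UploadFile
from BeatNet.BeatNet import BeatNet

app = FastAPI()

# Offline mode with DBN inference, following the BeatNet README.
estimator = BeatNet(1, mode="offline", inference_model="DBN", plot=[], thread=False)

@app.post("/beats")
async def detect_beats(file: UploadFile):
    # BeatNet expects a file path, so write the upload to a temp file first.
    with tempfile.NamedTemporaryFile(suffix=".wav", delete=False) as tmp:
        tmp.write(await file.read())
        path = tmp.name
    try:
        # Each output row is assumed to be (time_in_seconds, beat_position_in_bar),
        # where position 1 marks a downbeat.
        output = estimator.process(path)
    finally:
        os.remove(path)
    return {
        "beats": [float(t) for t, _ in output],
        "downbeats": [float(t) for t, pos in output if int(pos) == 1],
    }
```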

Challenges we ran into

Raw accelerometer data is noisy and hard to work with

Madmom is poorly documented

BeatNet has dependency conflicts that need to be manually resolved

Google's Voice App Actions backend is currently broken

Accomplishments that we're proud of

What we learned

Filtering accelerometer data (see the sketch below)

TensorFlow
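A minimal sketch of what we mean by filtering, written in Python for illustration (the app itself does this in Kotlin): smooth the accelerometer magnitude with an exponential moving average, then treat local maxima above a threshold as steps. The smoothing factor, sample rate, and threshold here are assumptions, not the values the app ships with.

```python
import math

def low_pass(values, alpha=0.2):
    """Exponential moving average to smooth out sensor jitter."""
    smoothed, prev = [], values[0]
    for v in values:
        prev = alpha * v + (1 - alpha) * prev
        smoothed.append(prev)
    return smoothed

def step_times(readings, sample_rate_hz=50, threshold=10.5):
    """readings: (x, y, z) accelerometer samples in m/s^2 -> step timestamps (seconds)."""
    if not readings:
        return []
    magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in readings]
    smooth = low_pass(magnitudes)
    steps = []
    for i in range(1, len(smooth) - 1):
        # A local maximum above the threshold counts as one step/bounce.
        if smooth[i] > threshold and smooth[i] >= smooth[i - 1] and smooth[i] > smooth[i + 1]:
            steps.append(i / sample_rate_hz)
    return steps
```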

What's next for DanceBeat

Giving more effective feedback

Built With

Python, FastAPI, TensorFlow, BeatNet, Kotlin, Android, Jetpack Compose, Sony LinkBuds
