The main focus of our project is creating opportunities for people to interact virtually and pursue their interests while remaining active. We hoped to accomplish this by working through a medium people are already interested in and giving them a tool to take that interest to the next level. From these intentions came our project: TikTok Dance Trainer.

In our previous hackathon, we gained experience with computer vision using OpenCV in Python, and we wanted to explore this field further. Drawing inspiration from other projects we saw, we set out to build something that could recognize not only hand movements but full-body motion as well.

What it does

TikTok Dance Trainer is a web app that lets users learn and replicate popular dances from TikTok. While dancing, users receive a real-time score that tells them how closely their movements match the original video. The app is an encouraging way for beginners to hone their dance skills and improve their TikTok content, as well as a fun way for advanced users to compete against one another in perfecting dances.

How we built it

To create this project, we split into two teams. One experimented with comparison metrics for body poses while the other built the UI with HTML, CSS, and JavaScript.

The pose estimation is implemented with PoseNet, an open-source pre-trained neural network for TensorFlow.js. The model pinpoints 17 key points on the human body, including the wrists, elbows, hips, knees, and landmarks on the head. For each frame, the two dancers' sets of 17 keypoints are compared to each other. To compare these arrays of coordinates, we researched several distance metrics: Euclidean distance, cosine similarity, weighted Manhattan distance, and Procrustes analysis (affine transformation). Through data collection and trial and error, cosine similarity gave the best results. The resulting distances are then fed into a function that maps them to viable player scores.
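The per-frame comparison can be sketched roughly as follows. This is a minimal illustration, assuming each pose is an array of 17 `{x, y}` keypoints in the shape PoseNet returns; the function names and the exact score mapping here are our own, not necessarily those used in the app.

```javascript
// Flatten an array of {x, y} keypoints into one vector [x0, y0, x1, y1, ...].
function toVector(keypoints) {
  return keypoints.flatMap((kp) => [kp.x, kp.y]);
}

// Cosine similarity between two equal-length vectors:
// dot(a, b) / (|a| * |b|), giving 1 for identical directions.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Map a similarity in [-1, 1] to a player score in [0, 100]
// (one possible mapping; the real scoring curve could differ).
function toScore(similarity) {
  return Math.max(0, Math.round(similarity * 100));
}
```

Running this once per frame on the reference pose and the webcam pose produces the live score shown to the user.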

The UI is built in HTML with CSS styling, with JavaScript running its functions. It has a hand-drawn background and an easy-to-use design packed with functionality. The menu bar has a file selector for choosing and uploading a dance video to compare against. The three main cards of the UI show the reference video and live cam side by side, with the pose-estimated skeleton of each in the middle to aid in matching the reference dance. Throughout, the UI was designed with ease of use, simplicity, and visual appeal in mind.

Challenges we ran into

Because we split into two teams for different parts of the project, one challenge we faced was merging the two halves: combining the code and connecting the pieces so that the outputs of one part were acceptable inputs for the other. Through perseverance and a lot of communication, we managed to merge the two parts effectively.

Accomplishments that we're proud of

We managed to create a clean-looking app that runs the comparison algorithm well despite the time pressure and complexity of the project. In addition, we were able to set aside time to make a presentation with a skit that tied everything together.

What we learned

Coming into this hackathon, only one of our members was experienced in web development, but coming out, all four of us felt we had gained valuable experience and insight into the ins and outs of webpages. We learned how to use Node.js effectively to create a backend and connect it to our frontend. Along the way, we gained experience with npm and JavaScript's potpourri of packages, such as Browserify.

What's next for TikTok Dance Trainer

We looked into using Dynamic Time Warping (DTW) to help with the comparison. DTW would help primarily when the videos are different lengths or the dancers are slightly out of sync. However, we realized it is not needed when the user dances against the TikTok video in their own live feed. In the future, we would like to add the ability to compare two pre-recorded videos, which would use Dynamic Time Warping.
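A classic DTW recurrence would find the cheapest alignment between the two videos' frame sequences. Below is a minimal sketch; `frameCost` stands in for a per-frame pose comparison (such as the cosine distance described earlier), and the names are illustrative rather than taken from our codebase.

```javascript
// Dynamic Time Warping: minimal cumulative cost of aligning seqA with seqB,
// allowing frames to stretch (repeat) so videos of different lengths or
// slightly mismatched timing can still be compared.
function dtw(seqA, seqB, frameCost) {
  const n = seqA.length, m = seqB.length;
  // dp[i][j] = cheapest alignment of the first i frames of A
  // with the first j frames of B.
  const dp = Array.from({ length: n + 1 }, () => new Array(m + 1).fill(Infinity));
  dp[0][0] = 0;
  for (let i = 1; i <= n; i++) {
    for (let j = 1; j <= m; j++) {
      const cost = frameCost(seqA[i - 1], seqB[j - 1]);
      // A frame may match, or either sequence may "wait" on the other.
      dp[i][j] = cost + Math.min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1]);
    }
  }
  return dp[n][m];
}
```

With pose vectors as sequence elements and cosine distance as `frameCost`, this would let two pre-recorded dances be scored even when their timing does not line up frame for frame.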

All open source repositories/packages that were used: link link link link link
