Inspiration

The inspiration for this project came from one member's longing to get better at dancing, combined with the event's theme: wearable technology that solves real problems we face in our lives and makes them that much better.

As a result, after the opening ceremony, our team spent the next hour listing every idea that could solve a problem one of our team members faces in real life. Each team member then scored the ideas against the judging rubric using a weighted decision matrix to determine which project the team would work on, and the haptic dance application came out ahead.

What it does

Dantic is a wearable that gives novice dancers visual and tactile feedback on their dance moves. Users find dance choreography videos online and paste them into Dantic, where they are analyzed and converted into dance routines. Users then put on the Dantic wearable and play a routine: the computer tracks the user's movements and compares them to those in the video. The wearable (a wrist strap) reflects how closely the user matches the video by providing haptic and visual feedback. Haptic feedback comes from the servo motors; visual feedback comes from an RGB LED (blue = out of frame, red = bad dance, yellow = okay dance, green = perfect dance).
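As a rough sketch, the score-to-colour mapping described above might look like the following in Python; the numeric thresholds here are illustrative assumptions, not Dantic's actual tuning:

```python
# Illustrative sketch: map a pose-accuracy score in [0, 1] to the
# wristband's LED colour. Threshold values are assumptions.

def feedback_color(score):
    """Return the LED colour for an accuracy score.

    score is None when the dancer is out of the camera frame.
    """
    if score is None:
        return "blue"    # out of frame
    if score >= 0.9:
        return "green"   # perfect dance
    if score >= 0.6:
        return "yellow"  # okay dance
    return "red"         # bad dance

print(feedback_color(0.95))  # green
```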

How we built it

We took advantage of our team's wide range of skills, which let us divide and allocate tasks based on everyone's greatest strengths. Through this process, we built several crucial components:

YouTube to mp4 conversion. This is vital for starting the analysis process, which is detailed in the next step.

Pose landmark integration. We used MediaPipe's pose landmark detection to detect and record where a dancer's arms and legs are in the mp4 video converted from YouTube.
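A minimal sketch of this extraction step, assuming MediaPipe's `solutions.pose` API and OpenCV for decoding, might look like the following; the video filename in the comment is a placeholder, and the heavy imports are deferred so the function can be defined without those libraries installed:

```python
def extract_landmarks(video_path):
    """Record normalised (x, y) pose landmarks for each frame of a video.

    Sketch of the analysis step: MediaPipe's Pose solution and OpenCV
    are assumed and imported lazily inside the function.
    """
    import cv2
    import mediapipe as mp

    frames = []
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.pose.Pose(static_image_mode=False) as pose:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV decodes frames as BGR.
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:
                frames.append([(lm.x, lm.y)
                               for lm in result.pose_landmarks.landmark])
    cap.release()
    return frames

# routine = extract_landmarks("routine.mp4")  # placeholder filename
```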

Live comparison. MediaPipe was used again to detect the user's arms and legs in real time. Dantic then compares the user's positions to the video's positions and calculates an error value based on the distance between the two. This and all previous components were built on Streamlit, which provides the web interface for the application.
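The comparison step above can be sketched as a mean Euclidean distance over matched landmarks; treating every landmark equally is an assumption here, since the real scoring may weight limbs differently:

```python
import math

def pose_error(reference, live):
    """Mean Euclidean distance between two equal-length (x, y) landmark lists."""
    if len(reference) != len(live):
        raise ValueError("landmark lists must be the same length")
    return sum(math.dist(r, u) for r, u in zip(reference, live)) / len(reference)

# Identical poses give zero error:
print(pose_error([(0.1, 0.2), (0.5, 0.5)], [(0.1, 0.2), (0.5, 0.5)]))  # 0.0
```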

Hardware component. We used a NodeMCU together with servo motors for haptic feedback and LEDs for visual feedback. We then set up a UDP communication protocol on the NodeMCU, which enables the software to control the wristband modules.
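On the software side, the UDP link might be driven by a sketch like this; the address, port, and comma-separated message format are assumptions about the firmware, not the actual protocol:

```python
import socket

# Hypothetical address: replace with the NodeMCU's IP and port on your network.
NODEMCU_ADDR = ("127.0.0.1", 4210)

def send_feedback(sock, color, vibrate):
    """Send one feedback datagram, e.g. b"green,1", to the wristband."""
    message = f"{color},{int(vibrate)}".encode()
    sock.sendto(message, NODEMCU_ADDR)
    return message

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
print(send_feedback(sock, "green", True))  # b'green,1'
```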

Challenges we ran into

Over the course of the hackathon, the team ran into a few challenges that taught us the nuances of integrating hardware, software, and mechanical components within a short period of time, and that pushed us to communicate effectively so everyone always knew about any blocking points. Some of the challenges the team faced included:

  1. Hackathon constraints - Given the relatively short timeline for a computer vision project that integrates serial communication for haptic feedback, we were unable to physically solder the wiring for the vibration motors and the LEDs that deliver the haptic feedback, resulting in a slightly smaller-scoped solution than we had hoped. Moreover, the credit system forced us to be more conservative with our spending, which further limited our scope. We also missed the soldering time slot, which became a massive time sink to resolve once we needed to solder.

  2. Short time frame - At 24 hours, this was the shortest hackathon any of us had participated in, and the reduced time proved to be a challenge. However, we were very proud and satisfied to end up with a solid, functional project that met the scope we established at the start.

Accomplishments that we're proud of

We are proud of what we were able to achieve in such a short amount of time, because the project was very ambitious and involved several technologies we had never worked with before.

Our github repository can be found at the following link - link

What we learned

We learned how to use MediaPipe's Python package to analyze images, videos, and live footage for people, approximating their poses as a node graph.

Using the NodeMCU, we learned how to set up a UDP communication protocol over Wi-Fi.

Additionally, we learned how to use Streamlit to build web apps that let us easily visualize and deploy our algorithms and see results in real time.

What's next for Dantic: The Ultimate Dance Feedback Experience

Add haptic and visual feedback modules for the legs too

Calculate and record scores to easily gauge improvement

Provide specific advice to the user on what to improve on

Calories burnt calculator

Gamify the entire experience with an Android app
