Inspiration

Movement instructors often face considerable challenges when creating tutorials, particularly in capturing the essence and intricacy of movements. Traditional motion-capture (mocap) technology, while effective, presents numerous difficulties, including expensive hardware, time-consuming setups, and extensive post-processing. It also demands a solid grasp of complex technical tools such as Blender and Unity, which can be a significant barrier for many movement educators.

Problems Identified

  • High Costs and Accessibility: Mocap hardware is expensive, putting it out of reach for many body-movement instructors.
  • Time-Consuming Setup: Setting up mocap systems takes time and does not suit the dynamic needs of body-movement tutorials.
  • Technical Complexity: Processing mocap recordings requires advanced software skills and knowledge of tools like Blender and Unity.
  • Social Pressure and Social Anxiety: Traditional learning environments can induce social pressure and anxiety, deterring individuals from fully engaging in body movement activities.

What it does

Move with Me is an interactive way to practice body movements such as dance and martial arts in a welcoming environment free from social pressure. The platform lets users express themselves freely, enhancing learning and personal expression without the usual constraints.

How we built it

Our project introduces a solution that records and plays back body movements in real time. The system simplifies data capture by storing body-tracking data in JSON format and supports a broad spectrum of movement forms, including those that involve props or elements such as fire, as seen in the Fire Dance prototype.
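To make the recording format concrete, here is a minimal sketch of how per-frame body-tracking data could be serialized to JSON. The field names (`fps`, `frames`, `joints`, `pos`, `rot`) are illustrative assumptions, not the project's actual export schema.

```python
import json

# Hypothetical recording schema -- field names are illustrative,
# not the project's actual export format.
def export_recording(frames, path, fps=60):
    """Serialize a list of per-frame joint poses to a JSON file."""
    recording = {
        "fps": fps,
        "frame_count": len(frames),
        "frames": [
            {
                # Timestamp of this frame, derived from the capture rate
                "t": i / fps,
                "joints": {
                    name: {"pos": list(pos), "rot": list(rot)}
                    for name, (pos, rot) in frame.items()
                },
            }
            for i, frame in enumerate(frames)
        ],
    }
    with open(path, "w") as f:
        json.dump(recording, f)
    return recording

# Example: one frame with a single tracked joint
frames = [{"head": ((0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0))}]
rec = export_recording(frames, "clip.json", fps=30)
```

Keeping the format plain JSON like this is what makes it easy for developers to consume recordings outside the app.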

Key Features

  • Mirroring Avatar: Mirrors and records the user's movements with body tracking.
  • Music Library: Provides a diverse collection of tunes suited to various movement forms.
  • Interactive Replay and Review: Lets users record, replay, and review performances with adjustable duration and frame rates.
  • Step-by-Step Learning: Breaks body movements into manageable steps, with the ability to freeze frames and check movements against a threshold pose value for accuracy.
  • Movement Accuracy Check: Detects body poses in sequence, adding a layer of fun interactivity.
  • Mocap Data Export: Exports recordings as JSON, making it easy to expand the tutorial library to more diverse practice types.
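The threshold-based accuracy check mentioned above can be sketched roughly as follows: compare each tracked joint of the user's pose against a target pose and pass only if every joint is within the threshold. Joint names, coordinates, and the threshold value here are assumptions for illustration, not the project's actual tuning.

```python
import math

# Illustrative threshold pose check -- joint names and the 0.15 m
# threshold are assumptions, not the project's actual values.
def pose_matches(user_pose, target_pose, threshold=0.15):
    """Return True if every tracked joint is within `threshold`
    meters of the corresponding target joint."""
    for joint, (tx, ty, tz) in target_pose.items():
        ux, uy, uz = user_pose[joint]
        dist = math.sqrt((ux - tx) ** 2 + (uy - ty) ** 2 + (uz - tz) ** 2)
        if dist > threshold:
            return False
    return True

target = {"left_hand": (0.3, 1.2, 0.1), "right_hand": (-0.3, 1.2, 0.1)}
user = {"left_hand": (0.32, 1.18, 0.1), "right_hand": (-0.3, 1.2, 0.1)}
result = pose_matches(user, target)  # within threshold -> True
```

A single scalar threshold keeps the check forgiving for beginners; a per-joint threshold would be a natural refinement.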

Implementation

During the development and usage phases, the system allows users to engage with a rich music library, record body movements, and adjust playback settings such as duration and frame rates. The collected data is then stored in a JSON format for ease of use and accessibility.

Challenges we ran into

  • Rigging the mirroring avatar for body tracking: getting the proportions right was difficult.
  • Link Cable.

Accomplishments that we're proud of

  • We built movement-detection checks that run in sequence, which we hadn't seen done before.
  • JSON export: beyond end users, developers can use it to demonstrate movements across platforms.
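The sequential movement-detection idea can be sketched as a simple state machine: the user must hit each target pose in order, and the routine only completes when the whole sequence has been matched. `pose_close` here is a stand-in for whatever per-pose comparison the app uses; the pose labels are hypothetical.

```python
# Minimal sketch of a sequential movement check: targets must be
# matched in order. `pose_close` stands in for the real comparison.
def run_sequence(pose_stream, targets, pose_close):
    """Advance through `targets` as matching poses arrive; return True
    once every target has been hit in order."""
    idx = 0
    for pose in pose_stream:
        if idx < len(targets) and pose_close(pose, targets[idx]):
            idx += 1  # this step passed; wait for the next target
    return idx == len(targets)

targets = ["arms_up", "squat", "arms_up"]
stream = ["idle", "arms_up", "idle", "squat", "arms_up"]
done = run_sequence(stream, targets, lambda a, b: a == b)
```

Because the index only ever moves forward, out-of-order poses are simply ignored rather than penalized.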

What we learned

  • How to use the Movement SDK.
  • How to make a custom avatar work with body tracking.

What's next for Move with Me

  1. Expansion of Body-Movement Library: Enhancing the diversity of body-movement practices by collaborating with professional dance artists from different backgrounds, e.g., K-pop, Tai chi.
  2. Social Platform: Allowing dancers of various levels to upload and share their dance moves to connect and collaborate with people with similar interests globally.
  3. Community and Events: Organizing events to reach new users and get feedback.
  4. Incorporation of a Scoring System: Introducing a score counter to provide feedback and incentivize improvement for modular learning.
  5. Enhancement and Polishing: Refining the interface and user experience to ensure seamless interaction.

Built With
