Inspiration

The Choreosome team came together during Hacking Arts '17 around our common interest in learning and experiencing dance with emerging technologies. Our initial idea of building a shared platform for learning how to dance evolved into its current form when we realized the importance of preserving the ethnic and heritage dances of the world. The interface limitations that make it hard to learn a dance from a shaky YouTube video have even more critical consequences when we consider the cultural diversity of the world. Folk dance is an intrinsically human activity that is as universal as humanity itself. We view motion capture technology in this application as being as critical to the preservation of human motion as writing was to oral tradition. It is central to being human.

What it does

Choreosome is a service that uses Perception Neuron motion capture inputs to run spatial analysis and generate beautiful, immersive experiences that can be viewed in any VR headset. The spatial analysis generates geometry that has artistic merit in itself, but it also lets a user interpolate between the motions of a dance and learn more easily than with existing video-based tools. As we record more pieces, future iterations can use an artificial neural network to identify and codify a database of cultural dance moves, capturing not only the technical methods but also their cultural contexts and geographic spread. This data will be available to lay users through a mobile app that can help bring this advanced motion capture technology to as many people as possible.
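
To give a sense of the interpolation idea, here is a minimal Python sketch. The data layout (arrays of per-frame joint positions) is an assumption for illustration; a production version would more likely blend joint rotations (for example with quaternion slerp) rather than raw positions.

```python
import numpy as np

def interpolate_pose(pose_a: np.ndarray, pose_b: np.ndarray, t: float) -> np.ndarray:
    """Blend two poses, each an (n_joints, 3) array of joint positions, at 0 <= t <= 1."""
    return (1.0 - t) * pose_a + t * pose_b

# Example: a learner scrubs between a reference pose and their own attempt.
# The joint count is a placeholder; it depends on the capture rig's skeleton.
n_joints = 21
reference = np.random.rand(n_joints, 3)
attempt = np.random.rand(n_joints, 3)
halfway = interpolate_pose(reference, attempt, 0.5)
```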

How we built it

Building our product for Hacking Arts drew on a diverse set of skills and workflows. Starting from the Perception Neuron assets of the motion-captured traditional Chinese dance, we rigged and animated the movements in Blender. Traces of the kinematics were exported to Rhino and processed with Grasshopper, an algorithmic modeling tool. In Grasshopper, we used parametric tools to generate meshes from the traces, which produced the forms we brought into Unity for the viewing experience. On the consumer side, we prototyped a mobile app for the generated content using Sketch and Principle to demonstrate how future dance genomes ("choreosomes") can be traced through their cultural and geographic history.
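
As a rough sketch of the trace-export step, the following script (run inside Blender's Python console) samples each bone's world-space head position per frame and writes it to a CSV that Rhino/Grasshopper can rebuild into curves. The rig name and output path are hypothetical, and the `@` matrix multiply assumes Blender 2.8+.

```python
import bpy
import csv

armature = bpy.data.objects["PerceptionNeuronRig"]  # assumed name of the imported FBX rig
scene = bpy.context.scene

with open("/tmp/joint_traces.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "bone", "x", "y", "z"])
    for frame in range(scene.frame_start, scene.frame_end + 1):
        scene.frame_set(frame)
        for bone in armature.pose.bones:
            # world-space position of the bone's head for this frame
            head_world = armature.matrix_world @ bone.head
            writer.writerow([frame, bone.name, head_world.x, head_world.y, head_world.z])
```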

Challenges we ran into

After successfully converting the Perception Neuron data into an FBX file for Blender, we had difficulties exporting the animation from Blender and importing it into Unity. We wanted it in Unity so the user could view the dance poses in 3D space through a VR headset. Due to limited headset availability and the importing difficulties, we exported the animation as a video instead for Demo Day. For the pose illustrations, we tried a number of ways to abstract each pose so it reads as universal yet still identifiable as a distinct unit; the hard part was figuring out how to visually separate the moving parts of a pose from the rest of the body.
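
For reference, this is a minimal sketch of the Blender-to-Unity export step that gave us trouble. Flag names can vary across Blender versions, and the output path is a placeholder; these are simply the options we would expect to matter for getting baked armature animation into Unity.

```python
import bpy

bpy.ops.export_scene.fbx(
    filepath="/tmp/choreosome_dance.fbx",  # hypothetical output path
    use_selection=True,                    # export only the selected armature and mesh
    object_types={'ARMATURE', 'MESH'},
    bake_anim=True,                        # bake the keyframed dance into the FBX
    add_leaf_bones=False,                  # extra leaf bones confuse Unity's rig import
)
```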

Accomplishments that we're proud of

We're proud of the vision of this product and the many future directions Choreosome can take. We're especially proud of its potential to help preserve endangered cultures. Motion capture technology is currently reserved for high-end motion graphics and big-budget productions, but we've thought through the delivery of the idea to make sure it's accessible to consumers as well as high-end users.

What we learned

We learned that dance captures movement that reflects the context of our culture, legacy, and humanity. From our cursory research, tracing the roots of folk dance has already drawn the attention of research groups in Cyprus, Croatia, and Austria.

Technically, we learned how to work with motion capture data to extract curve paths and forms using Blender. Our team had mostly worked with static 3D modeling before, so working with such rich animation data was a first for us.

What's next for Choreosome

Choreosome hopes to capture these movements and train an artificial neural network to catalog discrete dance moves and trace their cultural lineage in a shared database. By focusing our business model on contracts with large-scale institutions like universities and museums, we hope to offer our services as a valuable partnership to existing groups interested in cultural preservation. The private options we will offer could be used by dance or martial arts instructors, businesses, and even the military to store private motion data. We hope to catalog and preserve these motions so the human movement of the past can invigorate the human movement of tomorrow.
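
A rough sketch of what that future move classifier could look like is below (PyTorch; the architecture, joint count, and catalog size are placeholders, not a finished design): an LSTM reads a sequence of flattened joint positions and predicts which cataloged move the clip contains.

```python
import torch
import torch.nn as nn

class MoveClassifier(nn.Module):
    def __init__(self, n_joints: int = 21, hidden: int = 128, n_moves: int = 50):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_joints * 3, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_moves)

    def forward(self, poses: torch.Tensor) -> torch.Tensor:
        # poses: (batch, frames, n_joints * 3) flattened joint positions per frame
        _, (final_hidden, _) = self.lstm(poses)
        return self.head(final_hidden[-1])  # logits over the move catalog

# Example: classify a 120-frame clip from one dancer.
model = MoveClassifier()
clip = torch.randn(1, 120, 21 * 3)
move_logits = model(clip)
```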
