Inspiration

Agastya, a longtime devotee of the Disney Princesses, has always wanted a way to express his passion for Disney films. What better way to feel closer to your favourite princesses than to animate them over your face in real time using a variety of interesting technologies?

What it does

Disneyfication is an animation tool designed to speed up the animation and storyboarding process. The character you choose follows your facial motions at 10 frames per second. Streaming over a Skype webcam feed is also supported.
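As a rough sketch of what the 10 fps capture loop might look like with OpenCV (the device index and the processing hooks are placeholders, not the project's actual code):

```python
import time
import cv2

TARGET_FPS = 10
FRAME_BUDGET = 1.0 / TARGET_FPS  # 100 ms per frame

cap = cv2.VideoCapture(0)  # 0 = default webcam; a Skype feed would need its own source
while True:
    start = time.time()
    ok, frame = cap.read()
    if not ok:
        break
    # ... run the animation and sentiment passes on `frame` here ...
    # sleep off whatever remains of the frame budget to hold ~10 fps
    leftover = FRAME_BUDGET - (time.time() - start)
    if leftover > 0:
        time.sleep(leftover)
cap.release()
```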

How I built it

There are two primary components to Disneyfication. The first is the "Animation" pass, where we extract crucial animation data from the face in order to get the overall position of the various facial features. The second is the "Sentiment" analysis pass, where we extract emotional data from the picture via a deep recurrent neural belief network and attempt to have the character mimic the user's emotional expressions.
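A minimal sketch of how the two passes could fit together, assuming OpenCV's stock Haar cascade for the animation pass; the emotion model's interface and the 48x48 crop size are illustrative assumptions:

```python
import cv2

# stock Haar cascade bundled with OpenCV, used here for the animation pass
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def animation_pass(frame):
    """Return face bounding boxes; these drive the character's on-screen position."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def sentiment_pass(frame, box, model):
    """Crop one detected face and ask the trained network for an emotion label."""
    x, y, w, h = box
    face = cv2.resize(frame[y:y + h, x:x + w], (48, 48))  # assumed input size
    return model.predict_emotion(face)  # hypothetical interface to the network
```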

Challenges I ran into

The main challenge we ran into was training, and later loading, our deep neural network. This problem took nearly 20 hours of around-the-clock work to solve. In the end we got it working, but with only a few hours to spare before the deadline.
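The write-up doesn't name the framework, but assuming a Keras-style workflow, this kind of fix usually comes down to saving and restoring the architecture and weights together rather than separately:

```python
from tensorflow import keras

# stand-in architecture; the real network was a deep recurrent belief network
model = keras.Sequential([
    keras.layers.Input(shape=(48, 48, 1)),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(7, activation="softmax"),  # 7 emotion classes, assumed
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

# after training on the cluster: persist architecture + weights in one file
model.save("emotion_net.h5")

# on the demo machine: one call restores the whole model, sidestepping the
# mismatch bugs that come from loading raw weights into a rebuilt network
restored = keras.models.load_model("emotion_net.h5")
```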

Accomplishments that I'm proud of

When training the network, we had to use a few hundred concurrent Linode cloud virtual machines to ensure the network would be trained by the time the hackathon finished. Setting up this makeshift compute cluster was one of the strangest things I've ever had to do at a hackathon, and it was definitely interesting.
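The write-up doesn't detail how work was divided, but one plausible sketch is embarrassingly parallel data sharding: split the training set into one shard per VM and let each machine work through its own slice. Hostnames, paths, and the VM count below are placeholders:

```python
def shard(items, n):
    """Split items into n roughly equal contiguous shards."""
    size, extra = divmod(len(items), n)
    shards, start = [], 0
    for i in range(n):
        end = start + size + (1 if i < extra else 0)
        shards.append(items[start:end])
        start = end
    return shards

hosts = [f"linode-{i}" for i in range(200)]  # "a few hundred" VMs, number assumed
dataset = [f"faces/img_{i:06d}.png" for i in range(100_000)]  # placeholder paths

# write one manifest per VM; each machine trains only on its own slice
for host, files in zip(hosts, shard(dataset, len(hosts))):
    with open(f"manifest_{host}.txt", "w") as f:
        f.write("\n".join(files))
```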

What I learned

Various OpenCV methods for detecting faces and motion.
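For the motion half, one such OpenCV method is simple frame differencing; a minimal sketch (the threshold value is a guess that would need tuning):

```python
import cv2

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
if ok:
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)                             # per-pixel change
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)  # binary motion mask
    moving_pixels = cv2.countNonZero(mask)                     # crude motion score
    prev = gray
cap.release()
```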

What's next for Disneyfication

We could greatly improve our emotion-classification model by adding more classes, or perhaps by procedurally generating new classes as interpolations between existing ones. Training time was also quite long, so a dimensionality-reduction step before training might help.
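As an illustration of that reduction step, PCA before training can shrink each sample considerably; the data below is random filler standing in for real face crops:

```python
import numpy as np
from sklearn.decomposition import PCA

# random filler standing in for flattened 48x48 grayscale face crops
X = np.random.rand(10_000, 48 * 48).astype(np.float32)

# keep just enough components to explain 95% of the variance,
# shrinking every sample before it reaches the network
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)
print(X.shape, "->", X_reduced.shape)
```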
