What it does

Note: If you want to see it in action, come to the right side of the third room (right when looking towards the screen) and talk to the guy with the crazy keyboard.

The current demo replaces Rick Astley's face in the "Never Gonna Give You Up" music video with my face. The warping works properly, so my mouth opens and closes with his mouth and my face turns as his head turns, though it's still a bit glitchy.

How I built it

It processes a video, tracking the faces and a bunch of key points in both a sample image and every frame of the video. It then outputs a file for that video detailing where the faces are in each frame and where the face is in the sample image. You can then visit a page and have the video play, but with the face from your sample image warped over the faces in the video.
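
The per-frame tracking step looks roughly like the sketch below. It assumes a dlib-style 68-point landmark predictor and OpenCV for decoding the video, which is just one way to do it; the file names and output format here are placeholders, not necessarily what the demo actually uses.

```cpp
// Sketch: detect faces and landmarks in every frame and dump them to a text file.
// Assumes dlib's 68-point shape predictor and OpenCV for video decoding; the real
// pipeline may use a different tracker, but the overall shape is the same.
#include <dlib/image_processing/frontal_face_detector.h>
#include <dlib/image_processing.h>
#include <dlib/opencv.h>
#include <opencv2/opencv.hpp>
#include <fstream>

int main() {
    dlib::frontal_face_detector detector = dlib::get_frontal_face_detector();
    dlib::shape_predictor predictor;
    dlib::deserialize("shape_predictor_68_face_landmarks.dat") >> predictor;

    cv::VideoCapture video("never_gonna_give_you_up.mp4");  // placeholder path
    std::ofstream out("landmarks.txt");                     // placeholder output

    cv::Mat frame;
    int frame_idx = 0;
    while (video.read(frame)) {
        dlib::cv_image<dlib::bgr_pixel> img(frame);
        // One line per detected face: frame index followed by x y pairs.
        for (const dlib::rectangle& face : detector(img)) {
            dlib::full_object_detection shape = predictor(img, face);
            out << frame_idx;
            for (unsigned long i = 0; i < shape.num_parts(); ++i)
                out << ' ' << shape.part(i).x() << ' ' << shape.part(i).y();
            out << '\n';
        }
        ++frame_idx;
    }
    return 0;
}
```

Each line of that output then maps a frame to one set of landmark coordinates, which is roughly what the browser page reads when it warps the sample face over the video.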

Inspiration

I wanted to do something using face tracking, and it had to be terrible because it's for the TerribleHack hackathon.

Challenges I ran into

I had to wrangle C++ libraries into parsing video and outputting data as well as learn how to use Three.JS to display the result.

Accomplishments that I'm proud of

It actually kinda works!

What I learned

I ended up reading a ton of computer vision papers before the hackathon so that I knew which facial feature tracking system to use and how to make it work. I had to create a Delaunay triangulation of the facial points so that I could properly texture-warp the face. I also had to learn Three.JS.
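
The triangulation itself can be done with something like OpenCV's Subdiv2D, as in this rough sketch (the `triangulate` helper and its filtering are illustrative, not the exact code from the demo):

```cpp
// Sketch: build a Delaunay triangulation over a set of facial landmark points
// using OpenCV's Subdiv2D. The resulting triangles define the mesh that the
// sample face texture gets warped onto.
#include <opencv2/imgproc.hpp>
#include <vector>

std::vector<cv::Vec6f> triangulate(const std::vector<cv::Point2f>& landmarks,
                                   const cv::Rect& bounds) {
    // `bounds` must contain every landmark (e.g. the face's bounding box).
    cv::Subdiv2D subdiv(bounds);
    for (const cv::Point2f& p : landmarks)
        subdiv.insert(p);

    // Each Vec6f holds the three (x, y) vertices of one triangle.
    std::vector<cv::Vec6f> triangles;
    subdiv.getTriangleList(triangles);

    // Subdiv2D adds virtual corner points, so drop triangles with a vertex
    // outside the bounding rectangle.
    std::vector<cv::Vec6f> kept;
    for (const cv::Vec6f& t : triangles) {
        bool inside = true;
        for (int i = 0; i < 3; ++i) {
            float x = t[2 * i], y = t[2 * i + 1];
            if (x < bounds.x || x > bounds.x + bounds.width ||
                y < bounds.y || y > bounds.y + bounds.height)
                inside = false;
        }
        if (inside)
            kept.push_back(t);
    }
    return kept;
}
```

The idea is that the same triangles index into both the sample image's landmarks and each frame's landmarks, so every texture triangle knows where it should land in the video frame.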

What's next for FaceHack

An interface for uploading your own face image and having it processed by the server instead of only working with my face.
