Wednesday, September 21, 2011

Facial Puppetry: Face tracking and replacement

[embedded video]

I've been casually doing some research into face tracking and face substitution techniques for a film I'm currently directing and producing. I'm exploring the idea of blending conventional "Bunraku-style" puppetry techniques with real-time animation. What I'm hoping to do is replace the face of a physical puppet with an animated one (preferably in real-time, although I might have to settle for some kind of post-production process).

The basic concept is to take a person's expressions and, in real-time, either map them onto a digital model of a face or match them against photos of another face in an image database. There are a number of people doing interesting work in this area, especially Jason Saragih, who created the FaceTracker library; you can see an example of what it can do in the video embedded above (I mentioned this briefly in my previous post).
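
To make that concrete, here's a minimal sketch of the tracking half of the pipeline, loosely based on the demo code that ships with FaceTracker: grab webcam frames, fit the landmark model, and boil the landmarks down to a single expression parameter (mouth openness) that could drive a digital puppet's jaw. The model path and the landmark indices are assumptions based on the 66-point face2.tracker model in the library's distribution, so treat them as illustrative rather than gospel.

```cpp
// Sketch: track facial landmarks with Jason Saragih's FaceTracker and
// reduce them to one expression parameter a digital puppet could use.
#include <cstdio>
#include <vector>
#include <opencv2/opencv.hpp>
#include <Tracker.h>  // FaceTracker

int main() {
  // Load the tracker model that ships with the FaceTracker demo
  // (path is an assumption; adjust to wherever the model lives).
  FACETRACKER::Tracker model("model/face2.tracker");
  std::vector<int> wSize(1, 7);  // search window size(s) for model fitting

  cv::VideoCapture cam(0);
  cv::Mat frame, gray;

  while (cam.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

    // Track() fits the landmark model to the frame; 0 means success.
    // The arguments after wSize mirror the library's demo: redetect on
    // demand (-1), 10 fitting iterations, clamp 3.0, tolerance 0.01,
    // no failure checking.
    if (model.Track(gray, wSize, -1, 10, 3.0, 0.01, false) == 0) {
      // _shape is a (2n x 1) matrix: n x-coordinates, then n y-coordinates.
      int n = model._shape.rows / 2;
      double upperLipY = model._shape.at<double>(61 + n, 0);  // inner upper lip
      double lowerLipY = model._shape.at<double>(64 + n, 0);  // inner lower lip
      double browY     = model._shape.at<double>(27 + n, 0);  // top of nose bridge
      double chinY     = model._shape.at<double>( 8 + n, 0);  // chin tip

      // Normalize the lip gap by face height so the parameter doesn't
      // drift as the performer moves toward or away from the camera.
      double mouthOpen = (lowerLipY - upperLipY) / (chinY - browY);
      std::printf("mouth openness: %.2f\n", mouthOpen);
    } else {
      model.FrameReset();  // lost the face: force a fresh detection
    }

    cv::imshow("face", frame);
    if (cv::waitKey(1) == 27) break;  // Esc to quit
  }
  return 0;
}
```

The same handful of normalized features (mouth openness, eyebrow raise, eye openness) would also work as a lookup key for the second approach: a nearest-neighbor search over an image database of another face, pulling out whichever photo best matches the performer's current expression.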

This isn't really a new idea. Companies like ILM and Rhythm and Hues pioneered similar post-production techniques back in the 1990s, and they've been used for years to create talking animals in commercials, movies and TV shows. Commercial software like CrazyTalk has also been around for several years. What is relatively new is that the technology now works in real-time and is accessible to anyone with a decent computer and some basic programming knowledge.

Technology like this further blurs the line between puppetry and animation (something that has been happening for a while now) and offers artists the chance to have the best of both worlds. I really love the idea of being able to endow conventional puppets with a level of expression that just isn't physically possible. Exciting stuff.
