Saturday, January 20, 2007

How to Puppeteer a Head in Machinima

Lip sync and effective control of a character's head on screen have been a difficult challenge for Machinima creators. Unless you count the high-end real-time animation being done by Disney and Henson, I haven't seen any Machinima with effective animation of a character's head. There have been some noble attempts, but all of them look pretty stiff and boring to me.

An interesting solution to this problem that I have been toying with for a while is to have the computer visually track the movement of a puppeteer's hand, much like the way that the Nintendo Wii tracks the movements of players holding a Wii remote. Michael Nitsche, an Assistant Professor at Georgia Tech, recently emailed me about PuppetShow, a very cool looking system some of his students have put together to control the heads of digital puppets in the Unreal Engine.

[Embedded video: PuppetShow demonstration]
As you can see in the video, PuppetShow tracks the coloured object and the digital puppet moves on screen accordingly. This isn't new technology, of course, but what I think is really cool about it is how accessible it is: all you need is a coloured piece of paper, a USB webcam and some free software. It's a limited system right now, but I like the way it puts head movement back (literally) in the hand of the puppeteer.
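For the curious, here's a rough sketch of how this kind of colour tracking can be done with free software, in Python with the OpenCV library. This is not PuppetShow's actual code; the green colour range and the mapping to head angles are assumptions I've made up for illustration.

    import cv2
    import numpy as np

    # Hypothetical HSV range for a bright green piece of paper;
    # tune these bounds to match your own coloured marker.
    LOWER = np.array([45, 100, 100])
    UPPER = np.array([75, 255, 255])

    cap = cv2.VideoCapture(0)  # the default USB webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        m = cv2.moments(mask)
        if m["m00"] > 0:
            # Centroid of the coloured region, in pixels.
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            # Map screen position to head rotation; the +/-45 degree
            # range is an arbitrary choice, not PuppetShow's.
            h, w = mask.shape
            yaw = (cx / w - 0.5) * 90.0
            pitch = (cy / h - 0.5) * 90.0
            print("yaw=%.1f pitch=%.1f" % (yaw, pitch))
        cv2.imshow("mask", mask)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()

Those yaw and pitch values would then be fed to the game engine each frame to rotate the puppet's head.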

Basic lip sync can be achieved in a system like this through the movement of the puppeteer's thumb, but the puppet's mouth can't enunciate and the results are still pretty stiff. Many Machinima creators have suggested using voice analysis or lip sync software like Magpie to generate enunciation automatically, but I like the idea of a puppeteer having direct control over all aspects of the puppet's head.
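As a sketch of the thumb idea: if the tracker also reported the distance between a thumb marker and a palm marker, that distance could drive how far the mouth opens. The pixel thresholds below are made-up calibration values, not from any real system.

    def mouth_openness(thumb_gap_px):
        """Map thumb-to-palm distance (pixels) to a 0.0-1.0 mouth value."""
        closed, fully_open = 5.0, 40.0  # assumed calibration values
        t = (thumb_gap_px - closed) / (fully_open - closed)
        return max(0.0, min(1.0, t))  # clamp so the jaw never over-rotates

    # e.g. mouth_openness(22.5) -> roughly a half-open mouth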

In conventional puppetry, puppeteers do not actually articulate each word that a character says. The puppet's mouth usually only opens once for each syllable that the character speaks; the mouth is open on vowel sounds (a-e-i-o-u) and closed on consonants. Therefore, to achieve realistic lip sync, a puppet doesn't have to be able to enunciate perfectly; it just has to be rigged to open and close its mouth and make these four different shapes:

[Image: the four basic mouth shapes]
Each of these mouth shapes could be pre-established as a keypose on a digital puppet and triggered by the puppeteer pressing a button (possibly located inside some kind of data glove or control mitt) or making a specific gesture.
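Here's a quick sketch of how those keyposes might be wired up, continuing the Python examples above. The shape names, blend-shape weights and key bindings are all invented for illustration; a real rig would push the chosen weights to the game engine every frame.

    # Blend-shape weights for the four basic mouth shapes. The shape
    # names and weight values here are illustrative assumptions.
    MOUTH_KEYPOSES = {
        "closed": {"jaw_open": 0.0, "lips_wide": 0.0, "lips_round": 0.0},
        "open":   {"jaw_open": 1.0, "lips_wide": 0.3, "lips_round": 0.0},
        "wide":   {"jaw_open": 0.5, "lips_wide": 1.0, "lips_round": 0.0},
        "round":  {"jaw_open": 0.6, "lips_wide": 0.0, "lips_round": 1.0},
    }

    # Keyboard keys standing in for buttons inside a data glove or
    # control mitt; no button pressed means the mouth stays closed.
    BUTTON_MAP = {"a": "open", "s": "wide", "d": "round"}

    def mouth_pose(pressed_buttons):
        """Return the blend-shape weights to apply this frame."""
        for button, pose in BUTTON_MAP.items():
            if button in pressed_buttons:
                return MOUTH_KEYPOSES[pose]
        return MOUTH_KEYPOSES["closed"]

    # e.g. mouth_pose({"s"}) -> the "wide" keypose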

By combining this approach to lip sync with the visual tracking of PuppetShow, Machinima creators could really, finally have a great system for puppeteering their characters' heads in real time.
