Monday, January 29, 2007

Making Love Not Warcraft


South Park's Stan, Kyle, Cartman, and Kenny as World of Warcraft characters in the South Park episode Make Love Not Warcraft.

One of the biggest things to happen to Machinima last year was its use to create Make Love Not Warcraft, an episode of South Park that parodied the popular massively multiplayer online role-playing game (MMORPG) World of Warcraft and gamers in general. Machinima.com has an interview with South Park's Frank Agnone, J.J. Franzen, and Eric Stough explaining how Machinima was used to create the in-game sequences in the episode.

One of the things that impressed me about Make Love Not Warcraft was the relatively high quality of the lip sync in the Machinima sequences, which I didn't realize could be done in a game like Warcraft. Turns out it can't; Warcraft's publisher Blizzard helped the show cheat a little bit, providing South Park's animators with Maya models of the characters to animate conventionally for close-ups. Even if it's not technically 100% "true" Machinima, Make Love Not Warcraft is a great example of how Machinima and real-time animation can be incorporated into a professional production pipeline.

Saturday, January 27, 2007

Bigger (and more expensive) isn't better

While discussing my work thus far on the Panda Puppet system earlier this week, a colleague whom I respect and love dearly scoffed at the notion that you could build a decent, high-end digital puppetry system using a free, open source program with a relatively tiny 7 MB installer.

"You need Motionbuilder," he told me. "You need Maya. Don't waste your time on this open source stuff." So we had to agree to disagree, but to underscore my position that bigger (and more expensive) isn't better I direct you to the story of the spoon and the jackhammer.

Note that the point of the story isn't that the spoon is the easier tool to use, just that it's ultimately the better one.

Wednesday, January 24, 2007

Disruptive Animation Technology

The Oscar nominations were announced today, and the fact that two of the three nominees for Best Animated Feature relied heavily on motion capture has not escaped the attention of bloggers. Some animators think this is the end of an era.

This isn't really anything new. Any time a new technique or technology comes along and disrupts an existing art form, the old guard gets bent out of shape about it for a decade or two until things settle down and a new equilibrium is established. In the 1990s, puppeteers griped about the popularity of then-new computer graphics in special effects work. Traditional animators griped about the rise in popularity of 3D animation. Now computer animators are complaining about the rise of motion capture.

This seems silly to me. Photography wasn't the end of painting, electronic devices won't replace books anytime soon, and despite the popularity of computer animation, old-fashioned stop-motion animation is undergoing something of a renaissance. The tools and processes of animation may change, but the art form will never go away any more than puppetry will.

I have no doubt that when (and I believe it's a matter of when, not if) real-time animation/digital puppetry/Machinima/whatever you want to call it really takes off, there will be a whole new round of moaning and groaning about how it will be the end of this art form or that technique, which of course is rubbish. There is always opportunity for smart and skilled artists in these kinds of transitions. For example, any decent real-time animation system is virtually impossible to put together without good traditional animation skills. Ditto for motion capture, which looks awful in its raw form.

BTW, work on Panda Puppet this week is coming along very nicely, and much more quickly than I expected. I hope to have some demos or screenshots up sometime next week.

Monday, January 22, 2007

Panda Puppet: Day One

Panda Puppet Digital Puppetry Studio

This morning I am sitting down at the computer for my first official day working on Panda Puppet. If you've followed Machin-X for a long time, you'll know that Panda Puppet has been in the works for a while. It grew out of some work I did about a year and a half ago developing a system called Flash Puppet, which was used to produce a demo for a Flash-animated children's series in the U.S. When it's finished, Panda Puppet will be a "studio in a box" that will let you produce real-time animation through a puppetry interface. It's being built using Blender, the leading open source 3D modelling and animation package.
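
To make "real-time animation via a puppetry interface" a little more concrete: the heart of any system like this is a loop that samples an input device every frame and maps the values straight onto the character rig, with no keyframes in between. Here's a deliberately generic Python sketch of that loop (the device, rig, and control names are all stand-ins for illustration, not actual Panda Puppet code):

    # A generic sketch of a digital puppetry main loop. Everything here is a
    # stand-in for illustration; it is not Panda Puppet code.

    import math
    import time

    FRAME_RATE = 30  # rig updates per second

    def read_input_device():
        """Stand-in for polling a glove, joystick or tracker.
        Returns normalised control values in [-1.0, 1.0]."""
        t = time.time()
        return {"head_yaw": math.sin(t), "head_pitch": 0.5 * math.cos(t)}

    def apply_to_rig(controls):
        """Stand-in for pushing values onto the character rig. In Blender this
        would set bone rotations or shape-key weights instead of printing."""
        print("head yaw={head_yaw:+.2f} pitch={head_pitch:+.2f}".format(**controls))

    while True:
        frame_start = time.time()
        apply_to_rig(read_input_device())  # the performer drives the rig live
        # Sleep off the rest of the frame to hold a steady update rate.
        time.sleep(max(0.0, 1.0 / FRAME_RATE - (time.time() - frame_start)))

In a real system the input would come from actual hardware (gloves, joysticks, trackers) and the rig update would happen inside Blender rather than a print statement, but the shape of the loop stays the same: the performer drives the character directly, frame by frame.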

Work on Panda Puppet has been going on part-time, off and on, for about a year, a lot of it focused on experimenting with different ways to allow puppeteers to control characters in a digital 3D environment. I wasn't expecting to start full-time work on Panda Puppet until sometime in the summer, but as luck would have it, some basic real-time animation would be very useful for an upcoming project, so getting the system up and running is now a priority. The specs for everything are pretty much in place and I think I've got the hardware issues I mentioned last year licked. I'm committed to delivering a proof of concept in about eight weeks, so I'm going to be very, very busy writing and testing code for the next two months.

I'll be blogging my progress on Panda Puppet and explaining more about the system and how it works in the coming days and weeks. Assuming I can get everything working properly, this is going to be really, really nice to play with.

Saturday, January 20, 2007

How to Puppeteer a Head in Machinima

Lip sync and effective control of a character's head on screen have been a difficult challenge for Machinima creators. Unless you count the high-end real-time animation being done by Disney and Henson, I haven't seen any Machinima with effective animation of a character's head. There have been some noble attempts, but all of them look pretty stiff and boring to me.

An interesting solution to this problem that I've been toying with for a while is to have the computer visually track the movement of a puppeteer's hand, much like the way the Nintendo Wii tracks the movements of players holding a Wii Remote. Michael Nitsche, an Assistant Professor at Georgia Tech, recently emailed me about PuppetShow, a very cool-looking system some of his students have put together to control the heads of digital puppets in the Unreal game engine.



As you can see in the video, PuppetShow tracks the coloured object and the digital puppet's head moves on screen accordingly. This isn't new technology, of course, but what I think is really cool about it is how accessible it is: all you need is a coloured piece of paper, a USB webcam and some free software. It's a limited system right now, but I like the way it puts head movement back (literally) in the hand of the puppeteer.
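
If you want to experiment with the basic idea yourself, the tracking half is surprisingly little code these days. This isn't PuppetShow's code, just a minimal sketch of the same technique in Python using the free OpenCV library: threshold the camera image for a strongly coloured object, find the biggest blob, and turn its position into normalised values that a rig could map to head yaw and pitch. The colour range below assumes a bright green piece of paper and will need tuning for your lighting.

    # Not PuppetShow's code: a minimal colour-tracking sketch using OpenCV.
    # Requires: pip install opencv-python numpy

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)  # default USB webcam

    # HSV range for a saturated green; tune for your paper and lighting.
    LOWER = np.array([40, 80, 80])
    UPPER = np.array([80, 255, 255])

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)  # isolate the coloured paper
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            blob = max(contours, key=cv2.contourArea)  # largest coloured region
            m = cv2.moments(blob)
            if m["m00"] > 0:
                cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
                h, w = frame.shape[:2]
                # Normalise to [-1, 1]; a rig would map these to head yaw/pitch.
                yaw, pitch = 2 * cx / w - 1, 2 * cy / h - 1
                print(f"yaw={yaw:+.2f} pitch={pitch:+.2f}")

        cv2.imshow("tracking", mask)
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()

Run it, hold up the paper, and the printed yaw/pitch values are exactly the kind of signal you would wire into a puppet's head controls.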

Basic lip sync can be achieved in a system like this through the movement of the puppeteer's thumb, but the puppet's mouth can't enunciate and the results are still pretty stiff. Many Machinima creators have suggested using voice analysis or lip sync software like Magpie to handle lip sync and enunciation automatically, but I like the idea of the puppeteer having direct control over all aspects of the puppet's head.

In conventional puppetry, puppeteers do not actually articulate every word a character says. The puppet's mouth usually opens just once for each syllable the character speaks; the mouth is open on vowel sounds (a-e-i-o-u) and closed on consonants. Therefore, to achieve convincing lip sync a puppet doesn't have to be able to enunciate perfectly; it just has to be rigged to open and close its mouth and to make these four different shapes:



Each of these mouth shapes could be pre-established as a keypose on a digital puppet and triggered by the puppeteer pressing a button (possibly located inside some kind of data glove or control mitt) or making a specific gesture.
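
Here's a rough sketch of how that button-to-keypose mapping might work. The shape names, blend-shape weights and button numbers are all hypothetical placeholders (a real rig would define its own), but the core is just a lookup table plus a little easing so the mouth moves between poses instead of popping:

    # A sketch of button-triggered mouth keyposes. All names, weights and
    # button numbers are hypothetical placeholders, not from any real system.

    # Each keypose is a set of blend-shape weights the rig would apply.
    MOUTH_KEYPOSES = {
        "closed": {"jaw_open": 0.0, "lips_wide": 0.0, "lips_round": 0.0},
        "open":   {"jaw_open": 1.0, "lips_wide": 0.0, "lips_round": 0.0},
        "wide":   {"jaw_open": 0.4, "lips_wide": 1.0, "lips_round": 0.0},
        "round":  {"jaw_open": 0.5, "lips_wide": 0.0, "lips_round": 1.0},
    }

    # Hypothetical mapping from control-mitt buttons to keyposes.
    BUTTON_TO_POSE = {0: "open", 1: "wide", 2: "round", 3: "closed"}

    def step_toward(current, target, alpha=0.3):
        """Ease each weight part of the way toward the target pose every
        frame, so transitions read as movement rather than pops."""
        return {k: current[k] + alpha * (target[k] - current[k]) for k in current}

    # Simulate a few frames of the puppeteer pressing buttons.
    weights = dict(MOUTH_KEYPOSES["closed"])
    for button in [0, 3, 1, 3]:  # open, closed, wide, closed
        target = MOUTH_KEYPOSES[BUTTON_TO_POSE[button]]
        for _ in range(5):  # ease over five frames per press
            weights = step_toward(weights, target)
        print(BUTTON_TO_POSE[button], {k: round(v, 2) for k, v in weights.items()})

Because each button press only selects a target pose, the puppeteer stays free to concentrate on timing: opening on syllables and closing on consonants, exactly as they would with a hand puppet.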

By combining this approach to lip sync with the visual tracking of PuppetShow, Machinima creators could finally have a really great system for puppeteering their characters' heads in real time.

More on Jeff Han's Multi-Touch Interface

Following up on my last post about Jeff Han's incredible multi-touch computer interface, Fast Company Magazine has an article about Jeff in their latest issue. You can read it online at fastcompany.com.

Tuesday, January 16, 2007

Updated: Futuristic Multi-Touch Interfaces



This is old (from last summer), but extremely cool: a prototype multi-touch user interface demonstrated by Jeff Han. At about eight minutes in, he demonstrates some potential puppetry applications for it. Think of the possibilities!

Come to think of it, the new iPhone will have a multi-touch interface too. Hmmm...

Wednesday, January 10, 2007

Henson debuts The Skrumps


The Skrumps is a new digital puppetry series from The Jim Henson Company.

The Jim Henson Company is developing a new animated property called The Skrumps, based on a line of colourful, monster-like toy characters created by artist John Chandler. The series is being produced using the Henson Digital Performance System and looks very promising. John Chandler, Skrumps puppeteer Victor Yerrid and Henson's head of Children's Programming Halle Stanford discuss The Skrumps and their development in the latest episode of the Henson.com podcast.

The Skrumps debuted today in four new videos available at Yahoo! Kids for a limited time. Be sure to take a look!

Via The Muppet Newsflash.

--

13/01/07 Update: A behind-the-scenes video about the making of The Skrumps is now available on Henson.com. It's fairly short, but gives a basic overview of how their HDPS works and shows the performers in action.