Thursday, May 16, 2013
Faceshift is software that promises "markerless motion capture at every desk". It works with consumer-level cameras like the Kinect to track and analyze the facial expressions of a performer, using them to animate a virtual character in real time. It also offers the option of recording a performance so that it can be edited and polished in post-production.
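Face trackers like this typically describe each captured frame as a set of blendshape weights that deform a neutral face mesh. As a rough illustration of that idea (this is not Faceshift's actual API; the shape names and data are invented), applying a frame of weights might look like:

```python
def apply_blendshapes(neutral, deltas, weights):
    """Blend a neutral mesh with per-expression vertex offsets.

    neutral: list of (x, y, z) vertex positions
    deltas:  {shape_name: list of (dx, dy, dz) offsets per vertex}
    weights: {shape_name: float in [0, 1]} from the face tracker
    """
    result = []
    for i, (x, y, z) in enumerate(neutral):
        # Each expression contributes its offset, scaled by its weight.
        for name, w in weights.items():
            dx, dy, dz = deltas[name][i]
            x, y, z = x + w * dx, y + w * dy, z + w * dz
        result.append((x, y, z))
    return result

# Toy example: a one-vertex "mesh" with two hypothetical expressions.
neutral = [(0.0, 0.0, 0.0)]
deltas = {"smile": [(1.0, 0.0, 0.0)], "jaw_open": [(0.0, -2.0, 0.0)]}
frame = {"smile": 0.5, "jaw_open": 0.25}
print(apply_blendshapes(neutral, deltas, frame))  # [(0.5, -0.5, 0.0)]
```

In a real pipeline the tracker streams a new weight dictionary every frame, which is what makes the puppet feel live rather than keyframed.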
There are lots of potential applications for this kind of software in game and film production and, of course, digital puppetry applications!
You can learn more at www.faceshift.com.
Tuesday, May 14, 2013
Hakanaï is one of the more unconventional examples of a digital puppetry performance I've discovered (although, is there anything truly "conventional" about any form of digital puppetry?). Its creators describe it as a "haiku dance performance taking place in a cube of moving images projected live by a digital performer".
The performance involves a dancer performing live, whose movements are tracked in real-time and used as the basis for an interactive, digitally animated environment that is projected around them:
It was created by the French Company Adrien M / Claire B using their proprietary software eMotion. Here's more from their description of the project:
...Performed by an artist as a “digital score”, it is generated and interpreted live. The dancer’s body enters into a dialogue with the moving images in motion. These simple and abstract black and white shapes behave according to physical rules that the senses recognise and to mathematical models created from the observation of nature.
The audience experiences the performance in several stages. They first discover the exterior of the installation. As the dancer arrives, they gather around to watch the performance. When the choreography has ended, the audience can then take some time to wander amongst the moving images.
Through a minimalist transposition, this piece is based on images drawn from the imaginary realm of dreams, their structure and their substance. The box in turns represents: the bedroom where, once the barrier of sleep is passed, walls dissolve and a whole new inner space unfolds; the cage, of which one must relentlessly test the limits; the radical otherness, as a place of combat with an intangible enemy; the space where impossible has become possible, where all the physical points of reference and certitudes have been shaken.
Through the encounter of gesture and image, two worlds intertwine. The synchronicity between the real and the virtual dissolves and the boundary that was keeping them separate disappears, forming a unique space filled with a high oneiric charge.
Very cool, no? You can learn more from the video's description on Vimeo.
Sunday, March 31, 2013
Activision unveiled some new real-time rendering technology for human characters at the Game Developers Conference last week. This is the result of several years of research into creating photorealistic human characters for video games. Although the animation itself is a bit off and suffers from the infamous "Uncanny Valley" effect, on a purely technical level this is pretty impressive.
From the video's description on YouTube:
This animated character is being rendered in real-time on current video card hardware, using standard bone animation. The rendering techniques, as well as the animation pipeline are being presented at GDC 2013, "Next Generation Character Rendering" on March 27.
The original high resolution data was acquired from Light Stage Facial Scanning and Performance Capture by USC Institute for Creative Technologies, then converted to a 70 bones rig, while preserving the high frequency detail in diffuse, normal and displacement composite maps.
It is being rendered in a DirectX11 environment, using advanced techniques to faithfully represent the character's skin and eyes.
More technical details can be found here.
Via Cartoon Brew.
Wednesday, March 27, 2013
A nice example of a digital shadow puppet, made by Luis Leite using Kinect and Unity 3D. To animate the puppet, a human body is tracked in real-time using the Kinect sensor, with one hand controlling the head and the other controlling the tail. The physical movement of the performer's body is remapped onto the virtual shadow puppet using Inverse Kinematics via Unity's Mecanim animation system.
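The core of remapping a tracked hand onto a jointed puppet is an IK solve: given a target position, work out the joint angles that put the tip of a bone chain there. Here's a textbook two-link planar solve as a sketch of that idea (an illustration only, not Luis's actual Unity/Mecanim setup; the bone lengths are invented):

```python
import math

def two_link_ik(target_x, target_y, len1, len2):
    """Solve a planar two-bone chain (e.g. two neck segments of a
    shadow puppet) so its tip reaches the tracked hand position,
    using the law of cosines. Returns (base_angle, elbow_angle)
    in radians."""
    dist = math.hypot(target_x, target_y)
    # Clamp to the reachable range so the solve never fails.
    dist = max(abs(len1 - len2), min(len1 + len2, dist))
    cos_elbow = (dist**2 - len1**2 - len2**2) / (2 * len1 * len2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    base = math.atan2(target_y, target_x)
    offset = math.atan2(len2 * math.sin(elbow), len1 + len2 * math.cos(elbow))
    return base - offset, elbow

# A fully stretched reach along the x-axis: both joints straighten out.
a1, a2 = two_link_ik(2.0, 0.0, 1.0, 1.0)
print(round(a1, 6), round(a2, 6))  # 0.0 0.0
```

Mecanim handles this (and full-body weighting) for you; the point of the sketch is just how a tracked point becomes joint rotations every frame.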
Luis was also responsible for a Kinect-based digital puppet that was mentioned in a post about Kinect-based digital puppetry on Machin-X two years ago.
Friday, February 22, 2013
Earlier this week game development studio Media Molecule gave an R&D presentation at the launch event for Sony's new PlayStation 4 (PS4) video game console. The console appears to have some amazing new capabilities, like the ability to sculpt, create and animate in real time, that offer phenomenal potential for digital puppetry applications. Media Molecule won't say too much about what they're working on (yet), but their demo utilizing the PS4 and the often-derided PlayStation Move controller looks amazing (skip ahead to the 5:15 mark to see all the digital puppetry goodness).
Via Puppeteers Unite.
Wednesday, January 30, 2013
Indonesian shadow puppetry has gone digital, so why not Turkish shadow puppetry too?
iKaragoz is an app for iPhone and Android developed by a Turkish firm called Anakule. They are promoting it as the first puppet application for mobile devices, but it's definitely not (several others including iPuppeteer and Pollock's Toy Theatre app have been on the market for years).
The app allows the user to control the characters onscreen intuitively by simply moving their smartphone or tablet. In addition to the traditional Turkish Karagoz puppets, packs with characters from Cambodian, Chinese, Indian, Indonesian, Thai and Greek puppetry traditions are also available.
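Tilt-based control schemes like this usually derive an angle from gravity in the device's accelerometer reading and map it onto a puppet joint. A rough sketch of that mapping (my own guess at how it might work, not Anakule's implementation; the swing limit is invented):

```python
import math

def tilt_to_arm_angle(accel_x, accel_z, max_swing_deg=45.0):
    """Map device roll (recovered from gravity's direction in the
    accelerometer reading) to a shadow-puppet arm angle in degrees,
    clamped to a plausible swing range."""
    roll = math.degrees(math.atan2(accel_x, accel_z))  # 0 when held flat
    return max(-max_swing_deg, min(max_swing_deg, roll))

# Held flat: gravity is all on the z-axis, so no swing.
print(tilt_to_arm_angle(0.0, 9.81))   # 0.0
# Tilted 90 degrees to the side: clamped to the 45-degree limit.
print(tilt_to_arm_angle(9.81, 0.0))   # 45.0
```

Clamping matters for puppetry: real rod puppets have a limited range of motion, and mirroring that keeps the digital character from flopping into impossible poses.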
Here's the Wayang Kulit version in action:
iKaragoz was designed by Uğur Doğan with the assistance of Turkish puppeteer Mehmet Saylan. It's available to download from the iTunes Store and Google Play. I haven't had a chance to try it out myself yet, but if someone does please let me know what you think!
Via Puppetry News.
Friday, January 25, 2013
Bryn Oh is the avatar and pseudonym of a professional oil painter based here in Toronto who has been creating mesmerizing and challenging multilayered installations inside the virtual world of Second Life for several years (see previous post). I'm especially impressed by Bryn's latest work Imogen and the pigeons, an "immersive narrative exhibited in the virtual world called Second Life...a layered story told through poetry."
I find it difficult to classify work like this. Is it Machinima, immersive interactive art, digital puppetry, all of the above, or something else entirely? While I'm not entirely sure what the answer to that question is, I do know that I like it. A lot. It's inspiring to see the innovative ways that Bryn Oh is exploring and expanding how this still new medium can be used.
You can explore Imogen and the pigeons inside Second Life (click here and follow the instructions to join Second Life) and/or see more of Bryn Oh's previous Second Life builds on YouTube.