Intelligent Machines

Automating Animation

Digital screen stars are becoming more believable, and more affordable

Today’s digital movie characters are more realistic, interactive, and endearing than ever before. But creating them is still expensive and labor-intensive. Most are either hand-drawn frame by frame with digital pen and ink (with computers filling in some of the gaps) or based on “motion capture” techniques, which record and digitally mimic the movements of live actors to create on-screen characters. Yet as movies and video games pack more and more digital creatures onto the screen, animators are turning to physics and new artificial-intelligence methods for faster, more efficient ways to bring their stories to life.


Physics equations guide NaturalMotion’s characters. (Image courtesy of NaturalMotion)

NaturalMotion, a spinoff of the University of Oxford in the United Kingdom, has developed artificial-intelligence software that gives digital characters the power to animate themselves. The approach: a programmer specifies a character’s physical shape and properties and adds equations to govern the movement of its body parts. When animators apply a simulated force like gravity or a push from behind, the character responds realistically without further programming. “What you see on the screen is not a computer graphic of the character” dumbly mimicking recorded motions, says NaturalMotion’s chief executive, Torsten Reil. “It is the actual character.” The company’s goal is to generate interactive animations in real time, allowing animators to use the technology in video games as well as films, Reil says.

At the Computer Graphics Laboratory of Stanford University, researchers Katherine Pullen and Christoph Bregler have combined automation techniques with the traditional motion-capture approach. Using their system, animators can manually draw the most active parts of a character (its legs, say, during a walking sequence) and use archival motion-capture data to automatically complete the body and give it extra texture. “Within five years, all high-end games will use methods like these,” says Casey Muratori, lead developer at RAD Game Tools in Kirkland, WA.

Making characters not only move but also emote convincingly requires a different level of technology: software that generates “believable facial expressions and body language,” says Ken Perlin, director of the Media Research Laboratory at New York University. Perlin’s group is developing tools that will make it easy for animators to add layers of subtle gestures, for instance a raised eyebrow or drooping shoulders. Such effects must be drawn by hand today. Though farther from commercialization, Perlin’s software is a step closer to animation’s larger goal: to dazzle our hearts as well as our eyes.
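NaturalMotion has not published its engine, but the basic idea behind physics-driven animation can be sketched in a few lines: describe a body part by its physical properties, accumulate the forces applied to it each frame, and integrate Newton's second law to produce motion instead of playing back recorded frames. The sketch below is a minimal illustration of that idea; the Limb class, its methods, and all the numbers are assumptions for this example, not NaturalMotion's actual software.

```python
# Minimal sketch of physics-driven animation: a single "limb" is described
# by mass and state, and every frame its motion follows Newtonian equations
# rather than a pre-recorded clip. Class and function names are illustrative,
# not NaturalMotion's API.

GRAVITY = -9.81  # m/s^2, applied along the y axis

class Limb:
    def __init__(self, mass, position, velocity=(0.0, 0.0)):
        self.mass = mass
        self.x, self.y = position
        self.vx, self.vy = velocity
        self.fx, self.fy = 0.0, 0.0  # external forces accumulated this frame

    def apply_force(self, fx, fy):
        """Add an external force (e.g., a push from behind) for this frame."""
        self.fx += fx
        self.fy += fy

    def step(self, dt):
        """Advance one frame: integrate F = m*a, then clear the force accumulator."""
        ax = self.fx / self.mass
        ay = self.fy / self.mass + GRAVITY
        self.vx += ax * dt
        self.vy += ay * dt
        self.x += self.vx * dt
        self.y += self.vy * dt
        self.fx, self.fy = 0.0, 0.0

# The animator applies a simulated push; the limb's subsequent motion falls
# out of the equations, with no hand-drawn frames.
torso = Limb(mass=40.0, position=(0.0, 1.0))
torso.apply_force(fx=200.0, fy=0.0)       # push from behind on frame 0
for frame in range(5):
    torso.step(dt=1.0 / 30.0)             # 30 frames per second
    print(f"frame {frame}: x={torso.x:.3f}  y={torso.y:.3f}")
```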
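Pullen and Bregler's published system matches fragments of the animator's keyframed motion against archival motion capture rather than single frames, so the following is only a loose stand-in: it completes a hand-keyed walking frame by borrowing upper-body joints from whichever mocap frame has the closest leg angles. The joint names and the tiny database are invented for illustration.

```python
# Sketch of completing a hand-keyed motion from archival motion capture:
# the animator specifies only the leg angles; for each frame we pick the
# mocap frame whose legs match best and borrow its upper-body joints.
# The "database" and joint names are made up; the published method matches
# fragments of motion, not isolated frames as done here.

MOCAP_DB = [
    # (hip_angle, knee_angle, spine_angle, arm_swing) per recorded frame
    (10.0, 20.0, 2.0, 15.0),
    (20.0, 35.0, 4.0, 25.0),
    (30.0, 50.0, 6.0, 35.0),
]

def complete_frame(hip, knee):
    """Fill in spine and arm angles from the closest-matching mocap frame."""
    best = min(MOCAP_DB, key=lambda f: (f[0] - hip) ** 2 + (f[1] - knee) ** 2)
    return {"hip": hip, "knee": knee, "spine": best[2], "arm_swing": best[3]}

# The animator keyframes just the legs; the rest of the body is borrowed.
print(complete_frame(hip=18.0, knee=32.0))
```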
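The article does not describe Perlin's tools in detail, so this last sketch shows only the generic idea of layered gestures: a base pose stays untouched while each gesture layer adds a small, smoothly varying offset on top of it. The layer names, joint names, and sine-based sway are assumptions for illustration, not NYU's software.

```python
import math

# Sketch of layered gesture animation: a base pose (joint angles in degrees)
# is kept intact, and each gesture layer adds a small, smoothly varying
# offset on top of it. Layers can be stacked or removed independently.

BASE_POSE = {"eyebrow": 0.0, "shoulder": 0.0, "head_tilt": 0.0}

def raised_eyebrow(t):
    """Ease the eyebrow up over half a second, then hold."""
    return {"eyebrow": 8.0 * min(t / 0.5, 1.0)}

def drooping_shoulders(t):
    """Let the shoulders sag slowly, with a faint idle sway."""
    sag = -5.0 * min(t / 2.0, 1.0)
    sway = 0.3 * math.sin(2.0 * math.pi * 0.5 * t)
    return {"shoulder": sag + sway}

def compose(base, layers, t):
    """Sum every active layer's offsets onto the base pose at time t."""
    pose = dict(base)
    for layer in layers:
        for joint, offset in layer(t).items():
            pose[joint] += offset
    return pose

for frame in range(4):
    t = frame / 30.0  # 30 fps
    print(compose(BASE_POSE, [raised_eyebrow, drooping_shoulders], t))
```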
