Recently, a number of filmmakers converged on the campus of the California Institute of Technology. They were there to show off the motion-capture tech behind Rise of the Planet of the Apes, the reboot/prequel set to hit theaters on August 5th. And what they had to show was remarkable.
The original Planet of the Apes films, and the subsequent Tim Burton misstep, were all notable for their use of makeup and prosthetics to portray the eponymous apes. But Rise is a very different story–instead of taking place well in the future, it takes place in present-day San Francisco. And instead of featuring the decidedly humanoid apes who rule the planet ages hence, the apes in question for this film are, well, actual apes–apes subjected to cruel experiments who then lead a revolution against their human oppressors.
As the film’s director Rupert Wyatt recently explained, “There was no way we could put actors in… simian suits and pull it off.” That left two choices: real apes, or motion-capture performances. For a story about the cruel misuse of apes, the irony and hypocrisy of the former would have been unbearable. And fortunately for Wyatt, in the wake of Lord of the Rings and Avatar, motion-capture technology has ushered in a new era of digitally-assisted acting. Wyatt’s team contracted WETA Digital, the group co-founded by Rings director Peter Jackson, to handle the special effects on the new Apes movie. And, in another smart move, he tapped Andy Serkis, who knows a thing or two about embodying simians (he played the title role in Jackson’s King Kong), to play Caesar, the beleaguered ape who leads the primate rebellion in the new film.
After years of playing CGI characters–Serkis was also the man behind the Rings series’ Gollum–Serkis has become quite adept at explaining the technology behind his performances. A few months ago, he gave a tour of the Vancouver set to Popular Mechanics’s Erin McCarthy, explaining how the technology relates to the markers placed all over his body as he lopes around the stage:
“The performance-capture cameras track these markers, like coordinates for your joints. You’re basically puppeteering an electronic skeleton in real time—you can see the simplified CG ape puppets moving in absolute synchronicity with our performances, and we use the real-time playback to check what we’re doing. The markers on our faces are tracked by these head-cams, which have tiny little LED lights all around them, in great detail.”
WETA Digital’s technology has evolved over the years to enable a few “firsts” for the new Apes film. Rings brought us the first fully emotive digital character, and Avatar marked the first time those characters could be rendered in real time for the director to see. But with Apes, for the first time, according to The Seven Sees, “the performance capture and live-action sequences [are filmed] at the same time.” What’s more, as several outlets point out, it’s the first time mo-cap has been sophisticated enough to confidently move outside, for massive exterior shoots beyond the tightly-controlled soundstage. As Serkis told Total Film, “Basically this film represents one of the first and biggest examples of having multiple [performance capture] actors on a live-action set…. The Golden Gate sequence must be a world record in terms of the size of the capture area.”
The pre-release buzz, the conference at Caltech, and the few moody clips that have emerged all point to one question: will this be the year an actor finally gets an Oscar nomination for breathing life into a digital creature? It’s a debate that Serkis’s Gollum kicked off a decade ago–and it’s a debate The Hollywood Reporter thinks we could be in for again soon. “It’s no different than live action acting,” Serkis told the paper. “And I never considered [it] anything else but live action acting.”
If the Academy finally agrees, it will be the technology, in the end, that helped Serkis make his case.