Two men who invented game-changing 3D computer graphics techniques now widely used in the film industry have won the highest distinction in computer science: the Turing Award. If you enjoyed Toy Story, The Lord of the Rings, Finding Nemo, Titanic, Avatar, or Jurassic Park, you have them to thank.
Who are they? Edwin Catmull and Patrick Hanrahan. Catmull cofounded Pixar and hired biophysics PhD Hanrahan as one of the first employees in 1986. Hanrahan spent much of his time modeling materials and lighting to help animations look closer to real life. “Physicists generally don’t study hair or skin, and why they look the way they do. I did, and spent years thinking about how to get things like lighting right,” he told MIT Technology Review.
Their work: Hanrahan was the lead architect of the team that created the complex software known as RenderMan, which lets filmmakers turn images into photorealistic animations that can be blended with real-life scenes. Rendering determines which computer-generated images are visible on the screen for every frame, assigns them colors, and draws them. It brings an animation “to life.” In 2001 RenderMan became the first piece of software to win an Oscar. It has been used in 44 of the last 47 films nominated for an Academy Award in the Visual Effects category. Hanrahan later returned to academia and is now a professor at Stanford University.
Catmull’s contribution: From 1970, Catmull was part of the University of Utah’s ARPA program, where he came up with the first method to display curved surfaces on a computer. Up to that point, computer-generated images were all straight lines and polygons. While at Utah in 1972, Catmull created a short film called “A Computer Animated Hand,” which is one of the earliest examples of computer animation.
It took a long time for the industry to fully wake up to the potential of what he’d invented. The world's first computer-animated feature film, Pixar’s Toy Story, didn’t come out until over two decades later, in 1995. Catmull invented other groundbreaking animation techniques, too: Z-buffering, used to determine which parts of an object are and aren’t visible on screen, and texture mapping, which adds realism to computer-generated graphics.
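The idea behind Z-buffering can be shown in a few lines. This is a minimal sketch, not RenderMan's or Catmull's actual implementation: for every pixel, a depth buffer remembers how close the nearest surface drawn so far is, and a new surface only overwrites a pixel if it is closer. All names and the toy flat "surfaces" here are illustrative.

```python
# Minimal Z-buffer sketch: keep, at each pixel, the color of the
# nearest surface seen so far. Toy example with flat axis-aligned shapes.

WIDTH, HEIGHT = 4, 3

# Depth buffer starts at infinity; frame buffer starts as background.
depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
frame = [["bg"] * WIDTH for _ in range(HEIGHT)]

def draw(pixels, z, color):
    """Rasterize a flat surface: a set of (x, y) pixels all at depth z."""
    for x, y in pixels:
        if z < depth[y][x]:        # closer than anything drawn here so far?
            depth[y][x] = z        # remember the new nearest depth
            frame[y][x] = color    # this surface is visible at this pixel

# A far red rectangle, then a nearer blue rectangle overlapping it.
draw([(x, y) for x in range(3) for y in range(2)], z=5.0, color="red")
draw([(x, y) for x in range(1, 4) for y in range(1, 3)], z=2.0, color="blue")

for row in frame:
    print(row)
```

Where the two rectangles overlap, the blue one wins because its depth (2.0) is smaller; draw order doesn't matter, which is the point of the technique.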