
Digital Illumination
Graphics technique allows movie-scene lighting to be changed after filming

Results: Researchers led by Paul Debevec at the University of Southern California’s Institute for Creative Technologies have developed computer graphics tools that let filmmakers simulate the live-action lighting conditions of settings that their actors were never in, or add new lighting effects to film they’ve already shot. The researchers previously showed that they could change lighting effects in still images.

Why it Matters: Movie directors use computers to adjust and create visual effects, but for the most part, they can’t tinker with lighting. That means they have to get the lighting just right during filming – a time-consuming and expensive process. The ability to change or re-create lighting after a performance can give filmmakers more flexibility in making the movies they want, while potentially saving time and money on the set.

Methods: The researchers placed an actor inside a spherical structure two meters in diameter that was lined with 156 bright LED light sources. As the actor performed, different lights flashed on and off thousands of times per second, either singly or in groups. A camera filmed the actor at a frame rate equal to the rate at which the lighting changed, so that each frame was lit in a different way, for a maximum of 180 different illumination conditions. The researchers filmed the actor’s head and shoulders, recording up to eight seconds of action; downloaded the information to computers; and used algorithms to select and superimpose different frames to create desired illumination effects.
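The key property the technique exploits is that light is additive: once the actor has been captured under each light individually, any new lighting condition can be produced as a weighted sum of those basis frames. The sketch below (a minimal illustration in Python with NumPy, not the researchers' code; the array shapes and example weights are assumptions for demonstration) shows that recombination step for a single instant of the performance.

```python
import numpy as np

def relight(basis_frames, light_weights):
    """Recombine individually lit frames into one relit image.

    basis_frames: array of shape (n_lights, H, W, 3), one frame per
                  lighting condition (each lit by one LED or group).
    light_weights: array of shape (n_lights, 3), the desired RGB
                   intensity assigned to each light in the new
                   virtual lighting setup.
    """
    basis = basis_frames.astype(np.float64)
    # Weighted sum over the lighting basis: because light transport is
    # linear, a new lighting environment is a linear combination of
    # the captured single-light images.
    relit = np.einsum('nhwc,nc->hwc', basis, light_weights)
    return np.clip(relit, 0.0, 255.0).astype(np.uint8)

# Illustrative usage: dim all lights and emphasize a warm "key light"
# from one source. (Hypothetical values; in practice the weights would
# come from a measured lighting environment of the target scene.)
n_lights, H, W = 180, 480, 640
frames = np.zeros((n_lights, H, W, 3), dtype=np.uint8)  # placeholder capture
weights = np.full((n_lights, 3), 0.02)
weights[42] = [1.0, 0.9, 0.7]
new_image = relight(frames, weights)
```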

But there was a problem. Although the actor was filmed at a high frame rate, and the lights flashed just as quickly, the actor still moved appreciably while each of the 180 lighting conditions was being captured. This meant that the position of the actor differed slightly in each frame, so superimposing the frames resulted in smeared images. To solve this problem, the researchers used computer vision algorithms to track and analyze the actor’s facial movements. Based on estimates of how the actor was moving in a given set of frames, they digitally warped the image data to make it look as if each of the 180 frames had been taken at the same instant. They repeated this process to produce a set of frames showing the 180 individual lighting conditions for each 24th of a second of the actor’s performance, which they then assembled to produce the final film clip with the computer-generated lighting.
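One way to picture this motion-compensation step is as dense optical flow plus image warping: estimate how each pixel has moved relative to a reference instant, then resample every basis-lit frame onto that reference's pixel grid before the frames are combined. The sketch below uses OpenCV's Farnebäck flow and remapping as generic stand-ins for the researchers' facial tracking and warping; it is an illustrative assumption about how such alignment can be done, not their actual pipeline.

```python
import cv2
import numpy as np

def warp_to_reference(frame, reference_gray):
    """Warp `frame` so it aligns with the instant of `reference_gray`.

    Dense optical flow is estimated between the frame and a reference
    image, then the flow is used to resample the frame onto the
    reference's pixel grid, removing the small motion that occurred
    while the lighting basis was being captured.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Flow from the reference toward this frame: flow[y, x] tells where
    # each reference pixel ends up in `frame`.
    flow = cv2.calcOpticalFlowFarneback(
        reference_gray, gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    h, w = reference_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    # Pull pixels from `frame` back to the reference's coordinates, so
    # all basis frames look as if they were captured at one instant.
    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)
```

Applying this warp to each of the 180 basis frames before the weighted recombination shown earlier yields a sharp relit image rather than a smeared one.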

Next Step: The researchers would like to build a larger spherical structure with a greater number of brighter lights that could capture images of an actor’s whole body or of more than one actor at a time. They are also working on finding the best pattern in which to flash the lights on and off so as to obtain the optimum image quality while minimizing the appearance of flickering. – By Corie Lok

Source: Wenger, A., et al. 2005. Performance relighting and reflectance transformation with time-multiplexed illumination. ACM Transactions on Graphics 24: 756-764.


Tagged: Computing
