More people than ever are setting up impressive home theaters with high-definition plasma displays, Blu-ray players, and surround-sound speakers. Journey to the Center of the Earth 3-D, opening today, exemplifies Hollywood’s best hope for luring people back to the theater: lots of action, big stars–and the option of full 3-D. But as the first feature-length, live-action digital 3-D film, Journey posed an unprecedented technical challenge.
Today’s 3-D movies are a far cry from those of the 1950s, commonly considered the golden era of 3-D. Directors of modern 3-D films don’t rely as heavily on “punch in the nose” gimmicks; their dual-camera setups depend on real-time error-correcting software; and editors use state-of-the-art image-processing algorithms to remove artifacts of the stereoscopic filming process, says John Lowry, founder of Lowry Digital, in Burbank, CA, the company that digitally enhanced Journey before it hit theaters.
In addition, Lowry says, the modern projectors, screens, and glasses used in theaters have improved the 3-D experience by reducing jitter–the headache-causing motion difference between the pictures for the right and left eyes. “It’s just much easier to watch 3-D today,” he notes.
Most modern 3-D feature-length films have been animated, so they’ve allowed for a lot of computer modification of the shots. But for a live-action film, says Vince Pace, founder of PACE, the Burbank company that supplied the cameras for the film, there’s the added challenge of making the filming process as invisible as possible to the actors. In other words, he says, the technology has to work: cinematographers can’t be fiddling around with the camera when the action has started.
The first consideration is, of course, the camera. Stereoscopic technology has been around for more than a century, and a lot of the basic tricks are well known, Pace says. “Seventy percent of the [3-D camera] equation has been known for some time: you use two cameras, duct-tape them together, and there you go,” he says. “But 30 percent is subtleties.”
Pace explains that for a stereoscopic camera setup, the two lenses should be about 2.5 inches apart, roughly the distance between a person’s eyes. The left camera collects information for the left eye, and the right camera for the right eye. But the lenses of two separate cameras can’t be put much closer together than six inches due to their physical enclosures. The workaround that PACE and many other 3-D filming companies use is to shoot directly through one lens but use a mirror about 2.5 inches away to direct an offset image onto a second lens. The reflected image needs to be inverted and flipped before the film is edited.
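Because the second camera sees the scene via a mirror, its frames arrive mirror-imaged and must be flipped back before the two eyes' footage can be matched. A minimal NumPy sketch of that correction step (the tiny arrays here stand in for real frames; this is an illustration, not PACE's actual pipeline, and depending on the mirror's orientation a vertical flip may be needed as well):

```python
import numpy as np

def unmirror(frame: np.ndarray) -> np.ndarray:
    """Flip a mirror-reflected frame horizontally to restore its orientation."""
    return np.fliplr(frame)

direct = np.arange(16).reshape(4, 4)   # what the direct camera records
reflected = np.fliplr(direct)          # the same scene seen via the mirror
restored = unmirror(reflected)         # now aligned with the direct frame

assert np.array_equal(restored, direct)
```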
The next trick, Pace says, is to make sure that the cameras communicate with each other so that the images they capture aren’t radically different in terms of zoom or focus, for instance. The cameras used for Journey were essentially networked, he says, with specialized software that monitored the input of both and dynamically adjusted them so that they matched each other. The software controls nine parameters: the zoom, focus, and aperture (the amount of light admitted) of each camera; the framing of the zoom function, so that it’s the same for both cameras; and the relative angle of the cameras.
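The master/follower matching Pace describes can be sketched roughly as follows. The `Camera` class and parameter names are illustrative assumptions, not PACE's actual software: one camera is treated as the reference, and the networked controller copies its settings to the other so the two images never drift apart in zoom, focus, or aperture.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    zoom: float
    focus: float
    aperture: float  # f-stop: the amount of light admitted

def sync(master: Camera, follower: Camera) -> None:
    """Force the follower camera's settings to match the master's."""
    for attr in ("zoom", "focus", "aperture"):
        setattr(follower, attr, getattr(master, attr))

left = Camera(zoom=2.0, focus=1.5, aperture=2.8)
right = Camera(zoom=2.1, focus=1.4, aperture=4.0)  # drifted out of match
sync(left, right)
assert (right.zoom, right.focus, right.aperture) == (2.0, 1.5, 2.8)
```

In a real rig the adjustment would run continuously during a take, nudging the follower each tick rather than snapping it once.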
Once Journey was shot, it was essentially two separate movies: one for the right eye, and one for the left, says Lowry, who cleans up some of the artifacts that inevitably occur when shooting in 3-D. Because one camera is shooting directly, and the other is shooting from a reflection off a mirror, the images aren’t equivalent. The reflected image has lost a little light and is thus of slightly lower resolution than the direct image, Lowry says. Using the horsepower of about 720 computers specialized for processing imagery, Lowry Digital adds resolution to the movie file that contains the reflected images. Lowry explains that this is done by extracting information from multiple frames. “You can find detail that’s hidden in the grain or noise of a camera,” he says.
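The principle behind "finding detail hidden in the grain" can be illustrated with simple temporal averaging: noise varies frame to frame while the underlying scene detail does not, so combining many frames recovers structure no single frame shows cleanly. This is a toy sketch of the idea, not Lowry Digital's algorithm (which would also have to compensate for motion between frames):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical underlying detail: a smooth horizontal gradient.
truth = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))

# 32 noisy "captures" of the same static detail.
frames = [truth + rng.normal(0.0, 0.2, truth.shape) for _ in range(32)]

recovered = np.mean(frames, axis=0)  # noise averages out; detail remains

# The average is much closer to the real detail than any single frame.
err_single = np.abs(frames[0] - truth).mean()
err_avg = np.abs(recovered - truth).mean()
assert err_avg < err_single
```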
Electrical noise is another important consideration–especially under low-light conditions, when the signal from a camera’s digital sensors is weak. Lowry says that the footage from Journey had a lot of noise because in a number of scenes, the main source of light was the headlamps on the characters. If part of a scene has the graininess caused by noise, while another, well-lit part is crisp, the 3-D effect can be lost or, worse, annoying. To solve this problem, the image-processing software again adds resolution to frames whose darker areas might be too grainy.
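One way to treat only the grainy dark regions while leaving well-lit areas untouched is to blend each pixel with a smoothed (e.g., temporally averaged) version, weighted by how dark it is. The threshold and weighting scheme below are assumptions for illustration, not the actual software's method:

```python
import numpy as np

def adaptive_denoise(frame: np.ndarray,
                     smoothed: np.ndarray,
                     dark_threshold: float = 0.3) -> np.ndarray:
    """Blend toward the smoothed frame only where the image is dark.

    weight is 1.0 at black, falling to 0.0 at dark_threshold and above,
    so bright, crisp regions pass through unchanged.
    """
    weight = np.clip((dark_threshold - frame) / dark_threshold, 0.0, 1.0)
    return weight * smoothed + (1.0 - weight) * frame

frame = np.array([[0.9, 0.1]])     # one bright pixel, one dark noisy pixel
smoothed = np.array([[0.8, 0.15]])  # hypothetical temporal average
out = adaptive_denoise(frame, smoothed)

assert out[0, 0] == 0.9                      # bright pixel untouched
assert frame[0, 1] < out[0, 1] < smoothed[0, 1]  # dark pixel pulled toward average
```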
For all the apparent trouble of making a 3-D movie, the industry is investing in the technology. Lowry notes that this year, there have been six 3-D movies, and next year, about 17 3-D releases are planned. “It’s in a state of growth that’s quite remarkable,” he says.