
Researchers at Intel have developed an algorithm that, by leveraging the power of multiple microprocessors, can boost the resolution of a video as it plays in real time. The technology, called super resolution, can run on machines with as few as two cores and as many as hundreds, potentially letting people enhance video captured with a cheap webcam, improve old home movies, or turn a DVD-quality video into a high-resolution flick.
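To see why extra cores help, note that each frame can be split into independent regions and enhanced in parallel. The Python sketch below is an illustration only, not Intel's code: the strip-splitting scheme, the worker count, and the placeholder upscaling kernel are all assumptions made so the example runs.

import numpy as np
from multiprocessing import Pool

def upscale_strip(strip):
    # Placeholder for the real super-resolution kernel: here we just
    # repeat pixels (nearest-neighbour 2x upscaling) so the sketch runs.
    return strip.repeat(2, axis=0).repeat(2, axis=1)

def upscale_frame_parallel(frame, n_workers=4):
    # Cut the frame into horizontal strips, one per worker process,
    # and enhance the strips concurrently on separate cores.
    strips = np.array_split(frame, n_workers, axis=0)
    with Pool(n_workers) as pool:
        upscaled = pool.map(upscale_strip, strips)
    return np.vstack(upscaled)

if __name__ == "__main__":
    low_res = np.random.randint(0, 256, (240, 320), dtype=np.uint8)
    print(upscale_frame_parallel(low_res).shape)  # (480, 640)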

Intel’s super-resolution research is part of the company’s push to find the best applications to run on its multicore machines, says Jerry Bautista, codirector of Intel’s tera-scale computing research program. While multicore computers (machines with more than one processing core) are currently available to consumers in dual- and quad-core varieties, Intel has a research-grade microprocessor with 80 cores. (See “The Promise of Personal Supercomputing.”) And as researchers get closer to their goal of achieving tera-scale computing on desktop computers, in which trillions of calculations per second are enabled by massively multicore systems, the company is ramping up its software research; improving video quality using multicore machines is one of the top priorities on Intel’s to-do list, says Bautista.

To be sure, the chip maker isn’t the first to explore the idea of adding resolution to video. Super-resolution theory dates back to the 1980s, says Peyman Milanfar, a professor of electrical engineering at the University of California, Santa Cruz. But in the early days, the algorithms just didn’t work well, and the computing power wasn’t there to process the videos quickly. In 2003, Milanfar and his group developed computationally efficient algorithms that were able to improve the resolution for most video, although not in real time. Indeed, Milanfar’s approach has been the basis of other research by academics and companies.

Super-resolution algorithms upgrade video in two main steps, explains Oscar Nestares, senior research scientist at Intel. First, the algorithm examines pixels in the video frames to see how fast each pixel is moving and in which direction. For instance, if a car is moving down a street, the pixels that compose it will all be moving in a predictable way.
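The article does not say which motion estimator Intel uses; the sketch below illustrates this first step with OpenCV's dense Farneback optical flow, a standard stand-in that produces a per-pixel displacement between two consecutive frames.

import cv2
import numpy as np

def per_pixel_motion(prev_frame, next_frame):
    # Convert to grayscale and compute dense optical flow: an (H, W, 2)
    # array holding each pixel's horizontal and vertical displacement.
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
    # Arguments: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    speed = np.linalg.norm(flow, axis=2)                 # how fast each pixel moves
    direction = np.arctan2(flow[..., 1], flow[..., 0])   # and in which direction
    return flow, speed, direction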

The data collected in the first step is then used to estimate the movement of new pixels that are added to increase the video resolution. The result is a cleaner video that appears to be captured at the same time as the original. “We’re trying to get information that’s not there between frames,” says Intel’s Bautista. “The only way we can do this is if we collect lots of data and make better educated guesses at what those intermediate pixels should be.”
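A generic version of this second step, again an illustration rather than Intel's algorithm (the single-neighbour fusion and the bicubic upscaling are simplifying assumptions), warps a neighbouring frame onto the reference frame using the motion from step one, averages the aligned samples, and then interpolates up to the target resolution.

import cv2
import numpy as np

def fuse_and_upscale(ref, neighbor, flow, factor=2):
    # flow[y, x] = (dx, dy) motion from the reference frame to the neighbour.
    h, w = ref.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    # Pull each pixel's motion-compensated sample out of the neighbouring frame.
    aligned = cv2.remap(neighbor, map_x, map_y, cv2.INTER_LINEAR)
    # Average the extra samples with the reference, then upscale the result.
    fused = (ref.astype(np.float32) + aligned.astype(np.float32)) / 2.0
    upscaled = cv2.resize(fused, (w * factor, h * factor),
                          interpolation=cv2.INTER_CUBIC)
    return np.clip(upscaled, 0, 255).astype(np.uint8)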

Bautista explains that one thing that differentiates Intel’s super-resolution algorithm from others is its ability, in real time, to generate what’s known as a robust result. This means that the algorithm is able to toss out any erroneous pixels that could be a result of electrical noise in the sensor or dust on the lens, for instance. These erroneous pixels tend to lead to inaccurate guesses and a video that isn’t true to reality.
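The general idea of a robust result can be shown with a per-pixel median over the motion-aligned frames: a single corrupted sample (dust, sensor noise, a bad motion estimate) is simply discarded, whereas a plain average would smear it into the output. This is a textbook robust estimator used for illustration, not necessarily the one Intel uses.

import numpy as np

def robust_fuse(aligned_frames):
    # aligned_frames: (N, H, W) stack of motion-compensated frames.
    # A per-pixel median rejects outlying samples instead of averaging them in.
    return np.median(aligned_frames, axis=0).astype(aligned_frames.dtype)

# Five aligned samples of one pixel, one corrupted by dust on the lens:
samples = np.array([120, 118, 121, 119, 10], dtype=np.float64)
print(samples.mean())      # 97.6  -- the outlier drags the average down
print(np.median(samples))  # 119.0 -- the robust estimate ignores it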


Credit: Technology Review

Tagged: Computing, Intel, video, microprocessor, multicore, HD video
