
High-definition displays are increasingly popular. More and more people are experiencing high-definition movies and television in breathtaking color and detail. But another technology, called high dynamic range (HDR), is on the heels of high definition, and some experts think that it could be a quick successor. Whereas high-definition displays pump out more pixels, HDR displays provide more contrast. In other words, on an HDR display, the brightest whites are hundreds of thousands of times brighter than the darkest blacks; that contrast is key to making images on such a display appear more realistic. “A regular image just looks like a depiction of a scene,” says Roland Fleming, a research scientist at the Max Planck Institute for Biological Cybernetics, in Tübingen, Germany. “But high-dynamic range looks like looking through a window.”

Fleming, whose recent research on high-dynamic-range displays is being presented at SIGGRAPH, a graphics conference held this week in San Diego, suspects that this realism will draw people to the technology. And recently, manufacturers have started to pay attention to HDR. Major companies such as Philips and Samsung have demonstrated prototypes at trade shows. Jason Ledder, a representative for Samsung, says that the company is “doing a variety of research and trying to figure out when and where to incorporate [HDR] into products.”

Earlier this year, Dolby bought BrightSide Technologies, a startup based in British Columbia that developed a novel HDR display capable of four hundred times more contrast than a conventional monitor, closer to what the human eye can perceive. While a traditional liquid-crystal display is illuminated by a single white backlight, a BrightSide display is illuminated by an array of tiny white light-emitting diodes (LEDs). This means that individual LEDs can be turned off or on, darkening or brightening different parts of the liquid-crystal display independently. Neither Dolby nor the other companies are providing specific timelines for a product, but Fleming has heard reports that displays could be available, for a few thousand dollars, within a year.
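
The division of labor between the LED backlight and the LCD panel can be illustrated with a short sketch. The function below is only a rough, assumed model of local dimming (the article does not describe BrightSide's or Dolby's actual drive algorithms): each backlight zone is set to the peak luminance of its image region, and the LCD layer then attenuates that light pixel by pixel.

```python
import numpy as np

def local_dimming(luma, led_rows, led_cols):
    """Rough sketch of LED local dimming: each backlight zone is driven to the
    peak luminance of its image region, and the LCD layer compensates."""
    h, w = luma.shape
    backlight = np.zeros((led_rows, led_cols))
    for i in range(led_rows):
        for j in range(led_cols):
            region = luma[i * h // led_rows:(i + 1) * h // led_rows,
                          j * w // led_cols:(j + 1) * w // led_cols]
            backlight[i, j] = region.max()  # light the zone only as much as its brightest pixel needs
    # Upsample the coarse backlight map to full resolution (nearest-neighbour for simplicity).
    full = np.kron(backlight, np.ones((h // led_rows, w // led_cols)))
    # LCD transmittance: desired luminance divided by the local backlight level.
    lcd = np.divide(luma, full, out=np.zeros_like(luma), where=full > 0)
    return backlight, lcd

# Example: a dark frame with one small bright highlight.
frame = np.zeros((64, 64))
frame[8:16, 8:16] = 1.0
zones, panel = local_dimming(frame, 8, 8)
print(zones.max(), zones.min())  # only the zone under the highlight is lit; the rest stay dark
```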

One of the problems with introducing a new type of display, however, is overcoming the perception that there won’t be any content to take advantage of its potential, Fleming says. This is something that has plagued the market for high-definition displays: many people are waiting to buy a high-definition TV until there is more content, and providers are slow to churn out high-definition content until more people have the displays. Many experts believe that the same issue could be a challenge for HDR products.

However, the research by Fleming and his colleagues at the University of Bristol, in the UK, and at the University of Central Florida suggests otherwise. “The key questions that everyone’s been raising,” he says, “are how [HDR] is going to make the transition and how it is going to show a regular image.” Usually, he says, it is assumed that regular images must be processed with difficult-to-engineer software that adds contrast. His team’s original plan was to measure people’s perception of contrast on HDR displays, to see how much extra information needs to be added to a regular image to make it look like a true HDR image on such a display. To the researchers’ surprise, says Fleming, they found that they didn’t need complicated software at all. They surveyed people viewing low-contrast and high-contrast images, both shown on an HDR display. When the low-contrast images were processed with simple software that amplified the pixel values, viewers perceived them as high contrast. In fact, Fleming says, the average person couldn’t tell the difference between the low- and high-contrast images, and all the images looked significantly better than they would have on a regular display.
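
The article does not spell out what the “simple software that amplified the pixel values” did, but a minimal scheme along those lines might simply undo the display gamma and stretch the result linearly onto the HDR panel’s brightness range. The peak_nits and gamma values below are illustrative assumptions, not figures from the study.

```python
import numpy as np

def naive_expand(ldr, peak_nits=4000.0, gamma=2.2):
    """Illustrative 'amplify the pixel values' expansion of an 8-bit image.

    Assumed scheme (not the study's exact method): undo the display gamma,
    then scale linearly so the brightest pixels reach the HDR panel's peak.
    """
    linear = (ldr.astype(np.float32) / 255.0) ** gamma  # approximate linear light
    return linear * peak_nits                            # map onto the HDR luminance range

# Example: expand a synthetic 8-bit frame.
ldr_frame = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
hdr_frame = naive_expand(ldr_frame)
print(hdr_frame.min(), hdr_frame.max())  # values now span roughly 0 to the assumed 4000-nit peak
```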

Credit: Erik Reinhard
