In July, Koomey released a report that showed, among other findings, that the electricity used in data centers worldwide increased by about 56 percent from 2005 to 2010—a much lower rate than the doubling that was observed from 2000 to 2005.

While better energy efficiency played a part in this change, total electricity use in data centers also came in below the 2010 forecast because fewer new servers were installed than expected, thanks in part to technologies such as virtualization, which lets existing systems run more programs simultaneously. Koomey notes that data center computers rarely run at peak power. Most computers are, in fact, “terribly underutilized,” he says.
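As a rough illustration of the slowdown those figures imply, the sketch below converts each five-year change into a compound annual growth rate. It uses only the percentages quoted above; the helper function is just for illustration.

```python
# Back-of-the-envelope comparison of the growth rates reported above:
# a roughly 56 percent increase over 2005-2010 versus a doubling over 2000-2005.

def annualized_rate(total_growth_factor: float, years: int) -> float:
    """Convert a total growth factor over `years` into a compound annual rate."""
    return total_growth_factor ** (1 / years) - 1

doubling_2000_2005 = annualized_rate(2.00, 5)  # about 14.9 percent per year
slower_2005_2010 = annualized_rate(1.56, 5)    # about 9.3 percent per year

print(f"2000-2005: {doubling_2000_2005:.1%} per year")
print(f"2005-2010: {slower_2005_2010:.1%} per year")
```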

The information technology world has gradually shifted its focus from raw computing capability to energy efficiency, especially as people grow more accustomed to smartphones, laptops, tablets, and other battery-powered devices.

Since the Intel Core microarchitecture was introduced in 2006, the company has experienced “a sea change in terms of focus on power consumption,” says Lorie Wigle, general manager of the eco-technology program at Intel. “Historically, we have focused on performance and battery life, and increasingly, we’re seeing those two things come together,” she says.

“Everyone’s familiar with Moore’s law and the remarkable improvements in the power of computers, and that’s obviously important,” says Erik Brynjolfsson, a professor at the MIT Sloan School of Management. But people are paying more attention to the battery life of their electronics as well as how fast they can run. “I think that’s more and more the dimension that matters to consumers,” Brynjolfsson says. “And in a sense, ‘Koomey’s law,’ this trend of power consumption, is beginning to eclipse Moore’s law for what matters to consumers in a lot of applications.”

To Koomey, the most interesting aspect of the trend is what it suggests about the possibilities for computing: the theoretical limits are still far away, he says. In 1985, the physicist Richard Feynman analyzed the electricity needs of computers and estimated that their efficiency could theoretically improve by a factor of 100 billion before hitting a limit, excluding new technologies such as quantum computing. Since then, efficiency has improved by a factor of only about 40,000. “There’s so far to go,” says Koomey. “It’s only limited by our cleverness, not the physics.”
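For a sense of how much headroom those two figures leave, here is a rough back-of-the-envelope sketch. The 100-billion-fold ceiling and the roughly 40,000-fold improvement achieved so far come from the passage above; the 1.5-year doubling period is an assumption drawn from the historical trend Koomey's work describes, not a figure given here.

```python
import math

# Headroom estimate from the figures above: Feynman's theoretical ceiling
# (~100 billion-fold) versus the ~40,000-fold improvement achieved so far.
theoretical_ceiling = 100e9
achieved_so_far = 4e4

remaining_factor = theoretical_ceiling / achieved_so_far  # ~2.5 million-fold still available
doublings_left = math.log2(remaining_factor)              # ~21 doublings

# Assumption (not stated in the passage above): efficiency has historically
# doubled roughly every 1.5 years, the trend associated with Koomey's work.
doubling_period_years = 1.5
years_left = doublings_left * doubling_period_years       # ~32 years at that pace

print(f"Remaining improvement factor: {remaining_factor:,.0f}x")
print(f"Doublings left: {doublings_left:.1f}")
print(f"Years at one doubling every {doubling_period_years} years: {years_left:.0f}")
```

At the historical pace, in other words, the physical ceiling would not be reached for roughly another three decades, which is the sense in which the limit is set by cleverness rather than physics.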
