
A New and Improved Moore’s Law

Under “Koomey’s law,” it’s efficiency, not power, that doubles every year and a half.
September 12, 2011

Researchers have, for the first time, shown that the energy efficiency of computers doubles roughly every 18 months.

Power hungry: The first general purpose computer, ENIAC 1, could perform a few hundred calculations per second.

The conclusion, backed up by six decades of data, mirrors Moore’s law, the observation from Intel co-founder Gordon Moore that computer processing power doubles about every 18 months. But the energy-efficiency trend might have even greater relevance than Moore’s law as battery-powered devices such as phones, tablets, and sensors proliferate.

“The idea is that at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half,” says Jonathan Koomey, consulting professor of civil and environmental engineering at Stanford University and lead author of the study. More mobile computing and sensing applications become possible, Koomey says, as energy efficiency continues its steady improvement.
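
As a rough sketch of what that doubling implies (an illustration, not code from the study), treat efficiency as computations per unit of energy, doubling every 18 months; the energy or battery needed for a fixed workload then halves on the same schedule:

    # Illustrative sketch of the doubling trend described above; the 1.5-year
    # doubling period is the article's figure, everything else is a placeholder.
    DOUBLING_PERIOD_YEARS = 1.5

    def relative_efficiency(years_elapsed: float) -> float:
        # Efficiency relative to the starting point after the given number of years.
        return 2 ** (years_elapsed / DOUBLING_PERIOD_YEARS)

    def relative_battery_for_fixed_load(years_elapsed: float) -> float:
        # Battery capacity needed for the same workload, relative to the starting point.
        return 1 / relative_efficiency(years_elapsed)

    for years in (1.5, 3.0, 6.0, 15.0):
        print(f"after {years:>4} years: efficiency x{relative_efficiency(years):,.0f}, "
              f"battery needed x{relative_battery_for_fixed_load(years):.4f}")

Running the sketch shows, for example, that fifteen years at this pace corresponds to ten doublings, or roughly a thousandfold improvement in efficiency.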

The research, conducted in collaboration with Intel and Microsoft, examined peak power consumption of electronic computing devices since the construction of the Electronic Numerical Integrator and Computer (ENIAC) in 1946. The first general purpose computer, the ENIAC, was used to calculate artillery firing tables for the U.S. Army, and it could perform a few hundred calculations per second. It used vacuum tubes rather than transistors, took up 1,800 square feet, and consumed 150 kilowatts of power.

Even before the advent of discrete transistors, Koomey says, energy efficiency doubled every 18 months. “This is a fundamental characteristic of information technology that uses electrons for switching,” he says. “It’s not just a function of the components on a chip.”

The engineering considerations that go into improving computer performance, such as reducing component size, capacitance, and the communication time between components, also improve energy efficiency, Koomey says. The new research, coauthored by Stephen Berard of Microsoft, Marla Sanchez of Carnegie Mellon University, and Henry Wong of Intel, was published in the July-September issue of IEEE Annals of the History of Computing.

In July, Koomey released a report that showed, among other findings, that the electricity used in data centers worldwide increased by about 56 percent from 2005 to 2010—a much lower rate than the doubling that was observed from 2000 to 2005.

Better energy efficiency played a part in this change, but total data center electricity use also came in below the 2010 forecast because fewer new servers were installed than expected; technologies such as virtualization allowed existing systems to run more programs simultaneously. Koomey notes that data center computers rarely run at peak power. Most computers are, in fact, “terribly underutilized,” he says.

The information technology world has gradually been shifting its focus from computing capabilities to better energy efficiency, especially as people become more accustomed to using smart phones, laptops, tablets, and other battery-powered devices.

Since the Intel Core microarchitecture was introduced in 2006, the company has experienced “a sea change in terms of focus on power consumption,” says Lorie Wigle, general manager of the eco-technology program at Intel. “Historically, we have focused on performance and battery life, and increasingly, we’re seeing those two things come together,” she says.

“Everyone’s familiar with Moore’s law and the remarkable improvements in the power of computers, and that’s obviously important,” says Erik Brynjolfsson, a professor at MIT’s Sloan School of Management. But people are paying more attention to the battery life of their electronics as well as to how fast they run. “I think that’s more and more the dimension that matters to consumers,” Brynjolfsson says. “And in a sense, ‘Koomey’s law,’ this trend of power consumption, is beginning to eclipse Moore’s law for what matters to consumers in a lot of applications.”

To Koomey, the most interesting aspect of the trend is what it implies for the future of computing. The theoretical limits are still far away, he says. In 1985, the physicist Richard Feynman analyzed the electricity needs of computers and estimated that efficiency could theoretically improve by a factor of 100 billion before it hit a limit, excluding new technologies such as quantum computing. Since then, efficiency has improved by a factor of about 40,000. “There’s so far to go,” says Koomey. “It’s only limited by our cleverness, not the physics.”
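
As a rough illustration of that headroom (back-of-the-envelope arithmetic combining the figures above, not a calculation from the paper):

    import math

    # Figures quoted in the article
    FEYNMAN_LIMIT_FACTOR = 100e9     # Feynman's theoretical improvement ceiling, from 1985
    ACHIEVED_SINCE_1985 = 40_000     # improvement realized since then
    DOUBLING_PERIOD_YEARS = 1.5      # Koomey's law doubling period

    remaining_factor = FEYNMAN_LIMIT_FACTOR / ACHIEVED_SINCE_1985
    remaining_doublings = math.log2(remaining_factor)
    years_at_current_pace = remaining_doublings * DOUBLING_PERIOD_YEARS

    print(f"remaining improvement factor: {remaining_factor:,.0f}x")            # ~2,500,000x
    print(f"doublings still available: {remaining_doublings:.1f}")               # ~21
    print(f"years at an 18-month doubling pace: ~{years_at_current_pace:.0f}")   # ~32

On those figures, the remaining factor of roughly 2.5 million works out to about 21 more doublings, or on the order of three decades of continued improvement at the historical pace.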
