Researchers have, for the first time, shown that the energy efficiency of computers doubles roughly every 18 months.

The conclusion, backed by six decades of data, mirrors Moore’s law, the observation from Intel cofounder Gordon Moore that computer processing power doubles about every 18 months. But the power-consumption trend may prove even more relevant than Moore’s law as battery-powered devices such as phones, tablets, and sensors proliferate.

“The idea is that at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half,” says Jonathan Koomey, consulting professor of civil and environmental engineering at Stanford University and lead author of the study. More mobile computing and sensing applications become possible, Koomey says, as energy efficiency continues its steady improvement.
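
Koomey’s figure implies a simple exponential: at a fixed workload, efficiency grows by a factor of 2^(t/1.5) after t years. A minimal sketch of that arithmetic in Python, with the function name and example values purely illustrative:

    # Relative energy efficiency after `years`, assuming a doubling
    # every 1.5 years (the rate Koomey describes above).
    def relative_efficiency(years, doubling_period=1.5):
        return 2 ** (years / doubling_period)

    # Over a decade, a fixed workload would need roughly 1/100th the battery:
    print(relative_efficiency(10))  # ~101.6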

The research, conducted in collaboration with Intel and Microsoft, examined the peak power consumption of electronic computing devices since the construction of the Electronic Numerical Integrator and Computer (ENIAC) in 1946. The first general-purpose electronic computer, ENIAC was used to calculate artillery firing tables for the U.S. Army and could perform a few hundred calculations per second. It used vacuum tubes rather than transistors, took up 1,800 square feet, and consumed 150 kilowatts of power.
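
For a rough sense of the starting point, the figures quoted for ENIAC imply an energy cost of hundreds of joules per calculation. A back-of-the-envelope sketch in Python, assuming 300 calculations per second as a stand-in for “a few hundred”:

    # Energy per calculation for ENIAC, from the figures above
    # (150 kW, assumed ~300 calculations per second).
    power_watts = 150_000
    calcs_per_second = 300
    print(power_watts / calcs_per_second)  # 500.0 joules per calculation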

Even before the advent of discrete transistors, Koomey says, energy efficiency doubled every 18 months. “This is a fundamental characteristic of information technology that uses electrons for switching,” he says. “It’s not just a function of the components on a chip.”

The engineering considerations that go into improving computer performance, such as reducing component size, capacitance, and the communication time between components, also improve energy efficiency, Koomey says. The new research, coauthored by Stephen Berard of Microsoft, Marla Sanchez of Carnegie Mellon University, and Henry Wong of Intel, was published in the July-September issue of IEEE Annals of the History of Computing.

Credit: U.S. Government / Public Domain

Tagged: Computing, computers, Moore's Law, power efficiency
