MIT Technology Review


A report issued today estimates that the overall electricity used by servers (the computers that make up the networks of organizations, from small businesses to giant financial institutions) doubled between 2000 and 2005. The growth was driven by an increase in the number of servers installed in data centers and by the power demands of auxiliary equipment such as cooling fans and facility lighting, says Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory and the author of the report. “I was surprised by the doubling,” says Koomey. “I expected some growth, but not quite as large.”

In 2005, servers and their auxiliary equipment accounted for an estimated 1.2 percent of all power consumption in the United States and 0.8 percent worldwide, the report states. Koomey says that U.S. server power consumption in 2005 equaled that of the entire state of Mississippi; that year, 20 other states used less power. The report, titled “Estimating Total Power Consumption by Servers in the U.S. and the World,” was funded by AMD and peer-reviewed by experts at the major companies that sell servers, including Intel, IBM, Hewlett-Packard, Sun, and Dell.

Prior to the report, says Koomey, the server and data-center industry lacked an up-to-date, well-researched estimate of server power consumption. The report relies on detailed data from the research firm IDC on the installed base of servers and on server shipments, as well as on measured data and estimates of the power used per server for the most common models in each class of server.
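The bottom-up arithmetic this describes can be sketched as follows. The figures below are purely illustrative placeholders, not the report's actual IDC counts or measured wattages, and the factor-of-two auxiliary overhead is an assumption for the sketch:

```python
# Hedged sketch of a bottom-up server-power estimate (illustrative
# numbers only; the report's actual IDC data and wattages differ).
HOURS_PER_YEAR = 8760
AUX_OVERHEAD = 2.0  # assumption: cooling fans, lighting, etc. roughly double direct draw

# hypothetical installed base by server class: (units, average watts per unit)
fleet = {
    "volume":    (9_000_000, 200),
    "mid-range": (600_000, 500),
    "high-end":  (60_000, 5000),
}

def annual_twh(fleet, overhead=AUX_OVERHEAD):
    """Total annual electricity use in terawatt-hours."""
    direct_watts = sum(count * watts for count, watts in fleet.values())
    return direct_watts * overhead * HOURS_PER_YEAR / 1e12  # watt-hours -> TWh

print(f"{annual_twh(fleet):.1f} TWh/year")  # -> 42.0 TWh/year
```

Summing per-class draw and then applying an overhead multiplier mirrors the structure of the estimate described above: server counts and per-server power come from measurement and market data, while auxiliary equipment is folded in as a scaling factor.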

The report is important to the industry, says Koomey, because once power consumption can be quantified, companies can make better decisions about how to reduce it and save money. “The industry sees that one of the first things you need to do to address the problem is to figure out how big it is,” he says.

Even without this report, companies in the industry have assumed that servers and data centers consume more power than they should, which has prompted the development of technology and practices to curb power consumption. Intel and AMD, for instance, have been touting more-energy-efficient microprocessors for the past year. And for years HP has worked with internal estimates that closely match the report’s findings; those estimates have informed the company’s data-center research, says Chandrakant Patel, an HP fellow. “It corroborates our thinking and gives us quantification that might have been lacking before. The good news is, reports such as this … serve as a good reference.”

Credit: Intel