Computing’s Power Problem

Tech companies are looking to curb their enormous appetite for electricity.
February 6, 2006

Tech companies are notoriously power hungry. And as data storage has become increasingly important, energy consumption in the massive computer rooms that serve companies from Google to Abercrombie & Fitch has kept rising.

So attendees at Sun Microsystems’ summit in San Francisco last week addressed ways to save energy in data centers and large computer server rooms. The gathering included industry leaders such as Hewlett-Packard, Intel, and Advanced Micro Devices (AMD), as well as representatives from Pacific Gas and Electric Company and the U.S. Environmental Protection Agency.

Their goal: to find effective ways to gauge the amount of energy being used and wasted by data centers, and to share technological advances that could help to decrease electricity consumption.

This situation isn’t just a concern for individual companies either. Such a massive use of energy can put a strain on the power grid. Although data centers and server “farms” are relatively small consumers of energy nationwide, there are hotspots – such as Silicon Valley and New York City – where collections of massive servers can drain resources from an already overworked electrical infrastructure.

Because of this, and because the energy costs of operating these facilities are rising, companies have started to investigate ways to use less power. Rick Hetherington, a distinguished engineer at Sun, explains that the processors and memory inside a facility's servers eat up roughly half of its power, while the rest goes toward cooling.
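To see how quickly that adds up, here is a back-of-envelope sketch. The rough 50/50 split between computing and cooling comes from Hetherington's description above; every other number (server count, wattage, electricity price) is an assumption chosen purely for illustration.

```python
# Back-of-envelope estimate of a server room's power draw and annual cost.
# Assumed figures (illustrative only): 200 servers at ~400 W each of IT load,
# with an equal amount of power spent on cooling, per the rough 50/50 split
# described above. The $0.10/kWh electricity price is likewise assumed.

servers = 200
watts_per_server = 400            # processor, memory, and other components
it_load_kw = servers * watts_per_server / 1000

cooling_kw = it_load_kw           # roughly half of total power goes to cooling
total_kw = it_load_kw + cooling_kw

hours_per_year = 24 * 365
price_per_kwh = 0.10              # assumed electricity price, in dollars

annual_kwh = total_kw * hours_per_year
annual_cost = annual_kwh * price_per_kwh

print(f"IT load:       {it_load_kw:.0f} kW")
print(f"Total draw:    {total_kw:.0f} kW")
print(f"Annual energy: {annual_kwh:,.0f} kWh")
print(f"Annual cost:   ${annual_cost:,.0f}")
```

Under these assumed numbers the room draws 160 kilowatts around the clock, roughly $140,000 a year in electricity, and trimming either side of the split, the chips or the cooling, cuts the bill in proportion.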

Sun and AMD are looking at ways to build more efficient processing units that complete specific applications quickly while drawing as little power and producing as little heat as possible.

Sun's UltraSPARC architecture is designed specifically for web-based applications. One of its energy-saving tricks is that a single processor can run up to 32 applications at once, consolidating the workload of multiple servers, according to Hetherington. Additionally, he says, the architecture runs at an energy-saving clock rate, so it requires less power to complete a single task.
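The arithmetic behind consolidation is simple, as the sketch below shows. The wattages are assumed values for the sake of the example, not published UltraSPARC figures, but the point holds whenever many lightly loaded machines can be replaced by one busier one.

```python
# Illustrative comparison: several lightly loaded servers vs. one server whose
# processor can carry all of their workloads concurrently. All wattages are
# assumed values for this example, not measurements of any particular chip.

light_servers = 8                # lightly loaded machines to be consolidated
watts_per_light_server = 300     # assumed draw of each, even when mostly idle

consolidated_server_watts = 450  # assumed draw of one multithreaded server
                                 # carrying all eight workloads

before = light_servers * watts_per_light_server
after = consolidated_server_watts

savings = before - after
print(f"Before consolidation: {before} W")
print(f"After consolidation:  {after} W")
print(f"Savings:              {savings} W ({100 * savings / before:.0f}%)")
```

With these assumptions, eight half-idle boxes drawing 2,400 watts collapse into a single 450-watt machine, and the cooling load shrinks along with it.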

Chip-maker AMD is looking to address the energy issue by designing a processing unit that, for one thing, eliminates data "bottlenecks," according to Brent Kirby, the company's director of marketing. The physical arrangement of processors, memory, and input and output devices in a server is critical, he says. A traditional approach forces bits of data, at times, into a single pipeline, much like highway traffic merging into one lane. The AMD architecture, built around the company's Opteron processors, instead uses a grid-like layout that allows data to flow more freely to all parts of the unit. And when bits of data don't stall in bottlenecks, less power is needed to push them through.
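A toy model makes the merging-traffic picture concrete. The sketch below is not a simulation of Opteron hardware; it simply contrasts a design where every transfer shares one pipeline with one where transfers proceed over separate point-to-point links, using made-up bandwidth and transfer-size numbers.

```python
# Toy throughput comparison: one shared pipeline vs. point-to-point links.
# Bandwidths and transfer sizes are made-up numbers for illustration only.

transfers_gb = [2.0, 2.0, 2.0, 2.0]    # four concurrent data transfers, in GB

# Traditional layout: all transfers merge into a single shared pipeline.
shared_pipeline_gb_per_s = 4.0
time_shared = sum(transfers_gb) / shared_pipeline_gb_per_s

# Grid-like layout: each transfer travels over its own point-to-point link.
link_gb_per_s = 4.0
time_grid = max(size / link_gb_per_s for size in transfers_gb)

print(f"Shared pipeline: {time_shared:.1f} s (transfers queue behind each other)")
print(f"Point-to-point:  {time_grid:.1f} s (transfers proceed in parallel)")
```

In this simplified model the same work finishes four times sooner when nothing has to queue, which means components spend less time powered up and waiting for data.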

Even with more efficient processors, though, a room with racks full of servers can become excessively hot, and heat can hinder processor speed, as well as damage equipment. Such rooms need to be kept cool – and sometimes the solution is surprisingly simple.

“We do physical modeling of the air flow within the server, and we calibrate the system to maximum efficiency,” says Alex Yost, director of product management at IBM. Using these models, IBM engineers strategically place fans, which are less power hungry than standard air-conditioning units, to direct air so that critical components, such as the processors and memory, get the freshest air, Yost says. Of course air conditioners still need to be used, but with the cleverly placed fans, they do not have to run at full tilt.

Sun’s Hetherington points out that Silicon Valley technology companies, including his Santa Clara-based firm, endure brownouts in the summer; last year it happened at Sun about a half-dozen times. “In our offices in the midafternoon, our lights are dimmed” as a way to conserve electricity, he says. “We’re sitting in the dark – and we’re wondering whether energy-efficient data centers make sense? It couldn’t be clearer to us.”
