
Rethinking Energy Use in Data Centers

Some radical rethinking will be needed to cut the excessive use of power in data centers, say computer scientists.

A couple of years ago the US Environmental Protection Agency reported that the energy consumption associated with data centers had doubled between 2000 and 2006, reaching some 60 billion kWh in 2006, roughly 1.5 per cent of total US electricity use. The EPA expects this figure to double again by 2010.

The report triggered a flurry of interest in ways to reduce consumption. However, Stavros Harizopoulos from HP Labs in Palo Alto and buddies say that almost all the attention has focused on hardware fixes. At the chip level, this means things like dynamic voltage and frequency scaling (DVFS), clock routing optimizations, low-power logic and asymmetric multi-cores. At the platform level they’ve suggested things like dynamically turning off DRAM, disk speed control and disk spin down.
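To get a feel for the chip-level end of that list, here is a minimal sketch of how DVFS is already exposed to software on a Linux box, assuming the machine provides the standard cpufreq sysfs interface (paths and attribute names vary by kernel and driver; the figures printed are whatever your hardware reports, nothing here comes from the paper):

```python
# Peek at the DVFS settings the kernel exposes for one core, assuming the
# standard Linux cpufreq sysfs interface is present.
from pathlib import Path

CPUFREQ = Path("/sys/devices/system/cpu/cpu0/cpufreq")

def read(name: str) -> str:
    """Read one cpufreq attribute for cpu0, e.g. the active frequency governor."""
    return (CPUFREQ / name).read_text().strip()

if CPUFREQ.exists():
    print("governor:      ", read("scaling_governor"))         # e.g. 'powersave' or 'performance'
    print("current freq:  ", read("scaling_cur_freq"), "kHz")   # frequency the core is running at now
    print("available govs:", read("scaling_available_governors"))
else:
    print("No cpufreq interface found; DVFS may be managed entirely by firmware.")
```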

But what of software fixes? Harizopoulos and co say far less work has been done in this area, partly because there are limited ways in which programmers can control the power-hungry processes that go on in silico.

But the team says there are still many ways that database managers can optimise their energy use, and it gives several examples, such as designing algorithms for energy efficiency as well as speed. That might mean carrying out scans on uncompressed data rather than compressed data, which Harizopoulos and co have calculated can be more energy efficient.

In fact, the whole issue of data compression will need re-examining, they say. Data compression trades CPU cycles for lower bandwidth, which has always seemed a bargain. But if you add energy use into the mix, the reasoning changes, since the extra CPU cycles can cost more energy than the disk bandwidth they save.
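To see why, here is a back-of-envelope sketch of that trade-off. The function and every number in it are illustrative assumptions, not figures from the paper: it simply charges the CPU and the disk for the seconds each is busy during a table scan.

```python
# Back-of-envelope energy model for a table scan, stored compressed or not.
# All power ratings and throughput figures below are made up for illustration.

def scan_energy_joules(data_gb, compression_ratio, decompress_cpu_s_per_gb,
                       scan_cpu_s_per_gb, disk_gb_per_s,
                       cpu_watts, disk_watts):
    """Estimate the energy to scan a table of data_gb (logical) gigabytes."""
    stored_gb = data_gb / compression_ratio            # bytes actually read from disk
    io_seconds = stored_gb / disk_gb_per_s             # time the disk is busy
    cpu_seconds = data_gb * scan_cpu_s_per_gb          # predicate evaluation on raw data
    if compression_ratio > 1:
        cpu_seconds += stored_gb * decompress_cpu_s_per_gb   # extra cycles to decompress
    return cpu_seconds * cpu_watts + io_seconds * disk_watts

# Uncompressed scan: more bytes off disk, no decompression work.
uncompressed = scan_energy_joules(100, 1.0, 0.0, 2.0, 0.1, 80, 10)
# Compressed scan: half the I/O, but extra CPU cycles to decompress.
compressed = scan_energy_joules(100, 2.0, 6.0, 2.0, 0.1, 80, 10)

print(f"uncompressed scan: {uncompressed:.0f} J")
print(f"compressed scan:   {compressed:.0f} J")
```

With these made-up numbers the compressed scan moves half as many bytes yet costs more energy, because the decompression cycles run on a power-hungry CPU while the disk idles along at a fraction of that draw. That reversal is exactly the kind of re-examination the authors have in mind.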

It’s this kind of green thinking that Harizopoulos and co want to promote with their paper, which has lots of other ideas.

That could make for some fairly intensive work for managers of data centers, but it could lead to substantial savings. Better get working.

Ref: arxiv.org/abs/0909.1784: Energy Efficiency: The New Holy Grail of Data Management Systems Research
