
Energy-Aware Internet Routing

Software that tracks electricity prices could slash energy costs for big online businesses.
August 17, 2009

An Internet-routing algorithm that tracks electricity price fluctuations could save data-hungry companies such as Google, Microsoft, and Amazon millions of dollars each year in electricity costs. A study from researchers at MIT, Carnegie Mellon University, and the networking company Akamai suggests that such Internet businesses could reduce their energy use by as much as 40 percent by rerouting data to locations where electricity prices are lowest on a particular day.

Data beast: Google maintains a huge datacenter in The Dalles, OR.

Modern datacenters gobble up huge amounts of electricity, and their usage is increasing at a rapid pace. Energy consumption has accelerated as applications move from desktop computers to the Internet and as information is transferred from ordinary computers to distributed “cloud” computing services. For the world’s biggest information-technology firms, this means spending upwards of $30 million on electricity every year, by conservative estimates.

Asfandyar Qureshi, a PhD student at MIT, first outlined the idea of a smart routing algorithm that would track electricity prices to reduce costs in a paper presented in October 2008. This year, Qureshi and colleagues approached researchers at Akamai to obtain the real-world routing data needed to test the idea. Akamai’s distributed servers cache information on behalf of many large Web sites across the US and abroad, and process some 275 billion requests per day; while the company does not require many large datacenters itself, its traffic data provides a way to model the demand placed on large Internet companies.

The researchers first analyzed 39 months of electricity price data collected for 29 major US cities. Energy prices fluctuate for a variety of reasons, including seasonal changes in supply, fuel price hikes, and changes in consumer demand, and the researchers saw a surprising amount of volatility, even among geographically close locations.

“The thing that surprised me most was that there was no one place that was always cheapest,” says Bruce Maggs, vice president of research at Akamai, who contributed to the project while working as a professor at Carnegie Mellon and is currently a professor at Duke University. “There are large fluctuations on a short timescale.”
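Maggs’s observation can be illustrated with a toy calculation. The sketch below tallies, hour by hour, which city has the lowest spot price; the city names and the price table are invented for the example, not drawn from the study’s data.

```python
# Illustration of the finding that no single location is always cheapest:
# tally, hour by hour, which city currently has the lowest spot price.
# Cities and prices below are invented for this example.

hourly_prices = {                 # $/MWh, one entry per hour
    "chicago":   [42, 55, 61, 38],
    "richmond":  [47, 44, 58, 52],
    "palo_alto": [50, 49, 40, 45],
}

cheapest_counts = {}
num_hours = len(next(iter(hourly_prices.values())))
for hour in range(num_hours):
    # Find the city with the minimum price in this hour.
    cheapest = min(hourly_prices, key=lambda city: hourly_prices[city][hour])
    cheapest_counts[cheapest] = cheapest_counts.get(cheapest, 0) + 1

print(cheapest_counts)  # the cheapest city changes from hour to hour
```

Because the winner shifts from hour to hour, a static choice of datacenter location cannot capture the savings; only dynamic rerouting can.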

The team then devised a routing scheme designed to take advantage of daily and hourly fluctuations in electricity costs across the country. The resulting algorithm weighs the physical distance needed to route information (moving data further is more expensive) against the likely cost savings from reduced energy use. Data collected from nine Akamai servers, covering 24 days of activity, provided a way to test the routing scheme against real-world traffic. The team found that, in the best scenario, one in which energy use is proportional to computing load, a company could slash its energy consumption by 40 percent. “The results were pretty surprising,” Maggs says.
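The tradeoff the scheme balances can be sketched as a simple per-request placement rule. This is a hypothetical illustration in the spirit of the approach, not the researchers’ actual algorithm; the datacenter names, prices, latencies, and the transit-penalty coefficient are all invented.

```python
# Hypothetical sketch of price-aware routing: pick the datacenter that
# minimizes electricity cost per request plus a penalty for moving data
# further, subject to a latency budget. All numbers are invented.

TRANSIT_COST_PER_MS = 1e-6   # assumed $/request per ms of extra distance

def route_request(datacenters, client, max_latency_ms=100):
    """Return the name of the cheapest datacenter within the latency budget."""
    best_name, best_cost = None, float("inf")
    for dc in datacenters:
        latency = dc["latency_ms"][client]
        if latency > max_latency_ms:
            continue  # too far away to meet the latency budget
        # Electricity cost of serving the request at today's spot price,
        # plus a small penalty that grows with network distance.
        cost = (dc["kwh_per_request"] * dc["price_usd_per_kwh"]
                + TRANSIT_COST_PER_MS * latency)
        if cost < best_cost:
            best_name, best_cost = dc["name"], cost
    return best_name

datacenters = [
    {"name": "virginia", "price_usd_per_kwh": 0.09,
     "kwh_per_request": 0.002, "latency_ms": {"nyc": 20}},
    {"name": "oregon", "price_usd_per_kwh": 0.04,
     "kwh_per_request": 0.002, "latency_ms": {"nyc": 80}},
]
print(route_request(datacenters, "nyc"))  # cheap power wins while latency allows
```

With these numbers the distant, cheaper site wins; tighten the latency budget (say, `max_latency_ms=50`) and the nearby, pricier site is chosen instead, which is exactly the distance-versus-price tension the algorithm manages.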

The ability to throttle back energy consumption could have another benefit for massive Internet companies, the researchers say. If an energy company were struggling to meet demand, it could negotiate for computation to be moved elsewhere; the researchers say that the market mechanisms needed to make this possible are already in place.

Follow the money: This map shows the locations of Google’s US datacenters.

Spiraling energy consumption has become a major concern for the world’s largest Web companies; a report published by McKinsey & Company and the Uptime Institute in July 2008 estimates that datacenter energy usage will quadruple during the next decade in the absence of efforts to improve efficiency.

The pressure to reduce costs and curb emissions is forcing datacenter managers to radically rethink design and management. Google recently built a datacenter in Belgium that relies entirely on ambient cooling; on days when the weather gets too warm, the center’s servers are simply shut down. Maggs says that an energy-aware Internet-routing scheme is an extension of this idea. “Resources are getting more fungible, and this is the ultimate extension of that,” he says.

“In principle this could work,” says Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory and a consulting professor at Stanford University, who studies the energy use and environmental impact of information technology. “The trick is to be able to control these systems well enough, and to create controls that are cheap enough, to be able to take advantage of the arbitrage opportunity available from differential electricity prices, without affecting reliability or latency.”

Maggs cautions that the idea is not guaranteed to reduce energy usage or pollution, only energy costs. “The paper is not about saving energy but about saving cost, although there are some ways to do both,” he says. “You have to hope that those are aligned.”

Furthermore, he warns that the scheme relies on companies’ hardware having some sort of “energy elasticity.” In other words, their servers need to use substantially less power when idle than when running full tilt. This has not always been the case, but Google says that its custom servers consume 65 percent of the normal power when idle.
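The elasticity argument can be put in back-of-envelope terms: if a server draws a large fraction of its peak power even when idle, only the remainder scales with load, which caps how much moving work away from a site can save. The sketch below is illustrative; the 0.65 idle fraction is the figure Google cites for its custom servers, while the peak power and utilization numbers are invented.

```python
# Back-of-envelope sketch of "energy elasticity": how much of a site's
# power draw can actually be saved by moving its workload elsewhere.
# The 0.65 idle fraction is Google's cited figure; other numbers are
# invented for illustration.

def shiftable_fraction(idle_power_fraction):
    """Fraction of peak power that disappears when the server goes idle."""
    return 1.0 - idle_power_fraction

def savings_at_site(peak_kw, idle_power_fraction, utilization):
    """kW saved if the site's load (utilization, 0..1) is moved away
    and its servers are left idling rather than shut down."""
    # Simple power model: a fixed idle floor plus a load-proportional part.
    busy_kw = peak_kw * (idle_power_fraction
                         + (1 - idle_power_fraction) * utilization)
    idle_kw = peak_kw * idle_power_fraction
    return busy_kw - idle_kw

print(shiftable_fraction(0.65))                 # only 0.35 of peak power is load-dependent
print(round(savings_at_site(1000, 0.65, 0.5), 1))  # kW freed by moving the load away
```

At 65 percent idle power and 50 percent utilization, a 1,000 kW site frees only 175 kW by shipping its load elsewhere, which is why Manos, below, doubts the full projected savings are reachable with today’s hardware.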

Michael Manos, senior vice president of Digital Realty Trust, a company that designs, builds, and manages large datacenters, believes that the lack of energy elasticity in current hardware makes the suggested savings impossible to achieve.

“It is great research but there are some base fundamental problems with the initial assumptions, which would prevent the type of savings they present,” Manos says. Because most servers aren’t used to capacity, he says, “you just can’t get there.”

However, Manos does see plenty of room for improvement in datacenter designs. “I believe the datacenter industry is just beginning to enter into a Renaissance of sorts,” he says. “Technology, economic factors, and a new breed of datacenter managers are forcing change into the industry. It’s a great time to be involved.”

Koomey suggests that spiraling energy costs could encourage some companies to consider radical steps such as rerouting data: “Electricity use is a big enough component of data-center costs that this just might work.”
