MIT Technology Review

Unlike water and other commodities, the price we pay for electricity can vary with the time of day. Starting from this basic premise, researchers at Simon Fraser University in Burnaby, Canada, hit on a novel idea: why not trim the millions of dollars data centers spend on electricity each year by sending requests – web searches and the like – to whichever data center is currently paying the least for power?

Typically, requests sent from a web browser – for a website, a piece of media, search results – go to the nearest data center, so as to minimize latency, or the time between when a request is made and the result loads on a user’s device. But Ananth Narayan S. and colleagues created a model of data center performance in which the destination of requests is informed by an entirely different metric: the price of power in the local electricity market inhabited by a data center.
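The paper's actual model is more involved, but the core idea can be sketched in a few lines. The sketch below is hypothetical (the data center names, latencies, prices, and latency budget are all illustrative, not from the study): pick the cheapest data center whose latency stays within an acceptable budget, falling back to the nearest one otherwise.

```python
# Hypothetical data centers: name -> (round-trip latency in ms, spot price in $/kWh)
DATA_CENTERS = {
    "us-east":  (20, 0.12),
    "eu-west":  (90, 0.07),
    "ap-south": (180, 0.05),
}

def route_request(centers, budget_ms):
    """Return the cheapest data center within the latency budget,
    falling back to the nearest one if none qualifies."""
    eligible = {name: price for name, (lat, price) in centers.items()
                if lat <= budget_ms}
    if eligible:
        # Price-aware routing: cheapest power wins among acceptable centers.
        return min(eligible, key=eligible.get)
    # Fallback: classic nearest-center (lowest-latency) routing.
    return min(centers, key=lambda name: centers[name][0])

# With a 100 ms budget, the cheaper eu-west center beats the nearer us-east.
print(route_request(DATA_CENTERS, 100))
```

The interesting behavior is in the budget: widen it and requests chase cheaper power farther away; tighten it and the scheme degenerates back into nearest-center routing.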

Routing requests in this way had a non-trivial impact on the cost of electricity across all the data centers owned by a hypothetical service provider: even in the most conservative case, it cut electricity use by 21% compared with simply sending every request to the nearest data center. For a Google or an Amazon, that could amount to many millions of dollars over the course of a year, not to mention the tons of greenhouse gases associated with producing that power in the first place.

This simple tweak to the algorithm of request direction works because power prices can vary tremendously according to time of day. For the same reason that owners of electric cars are advised to charge their vehicles overnight, there is ample reason to send a request halfway around the world in search of cheaper electricity prices.

Of course, the cost of electricity must be balanced against latency, or requests will take unacceptably long to answer and page-load times will suffer. The 21% power-savings figure mentioned above already takes this trade-off into consideration, which means that, in theory at least, a great deal of money can be saved simply by knowing the local price of electricity.

While the researchers don’t address this dimension of their work directly, it’s possible to look at this system as a sort of auction, in which data centers bid to take requests from users: the winning bid is the optimal combination of both price and speed. It’s a principle native to every other part of our market-based economy, but not yet a part of how we distribute the workload of the global hive-mind across the ever-expanding array of data-processing nodes on which it relies.
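One way to make the auction analogy concrete is to treat each data center's "bid" as a weighted blend of its normalized power price and latency, with the lowest composite bid winning the request. This is an illustrative sketch, not the researchers' formulation; the weights, normalization ceilings, and figures are all assumptions.

```python
def composite_bid(latency_ms, price_kwh, w_latency=0.5, w_price=0.5,
                  max_latency=200.0, max_price=0.20):
    """Lower is better: normalize each dimension to [0, 1] and blend.
    max_latency and max_price are assumed normalization ceilings."""
    return (w_latency * (latency_ms / max_latency)
            + w_price * (price_kwh / max_price))

# Hypothetical data centers: name -> (round-trip latency in ms, $/kWh)
centers = {
    "us-east":  (20, 0.12),
    "eu-west":  (90, 0.07),
    "ap-south": (180, 0.05),
}

# The winning "bid" is the best combination of price and speed.
winner = min(centers, key=lambda name: composite_bid(*centers[name]))
print(winner)
```

Shifting the weights moves the system along the spectrum the article describes: all weight on latency reproduces today's nearest-center routing, while all weight on price sends every request wherever power is cheapest.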

Follow Mims on Twitter or contact him via email.
