A View from Christopher Mims
Search to Be Processed in the Cheapest Time Zone
Normally, requests are sent from Web browsers to the nearest data center. But what if they were sent to the one currently paying the least for electricity?
Unlike water and most other utilities, the price we pay for electricity can vary with the time of day. Starting from this basic premise, researchers at Simon Fraser University in Burnaby, Canada, hit on a novel idea: why not cut the millions of dollars spent on electricity for data centers each year by sending requests – web searches and the like – to whichever data center is currently paying the least for power?
Typically, requests sent from a web browser – for a website, a piece of media, search results – go to the nearest data center, so as to minimize latency: the delay between when a request is made and when the result loads on a user’s device. But Ananth Narayan S. and colleagues created a model of data center performance in which the destination of a request is determined by an entirely different metric: the price of power in the local electricity market where each data center operates.
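At its simplest, the idea is just a change of routing key: pick the data center with the lowest current spot price rather than the lowest latency. Here is a minimal sketch of that selection rule – the function name and the prices are illustrative, not taken from the researchers’ model:

```python
def route_by_price(data_centers):
    """Pick the data center currently paying the least for electricity.

    `data_centers` maps a data center name to its current spot
    price for power (e.g. dollars per megawatt-hour).
    """
    # min() over the dict's keys, ranked by each center's current price
    return min(data_centers, key=data_centers.get)

# Illustrative spot prices, not real market data.
prices = {"virginia": 42.0, "oregon": 31.5, "dublin": 55.2}
print(route_by_price(prices))  # -> oregon
```

Traditional geo-routing would use the same one-liner with latency, rather than price, as the key.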
Routing requests in this way had a non-trivial impact on the electricity bill for all the data centers owned by a hypothetical service provider: even in the conservative case, it cut costs by 21% compared with simply sending every request to the nearest data center. For a Google or an Amazon, that could amount to many millions of dollars over the course of a year, not to mention all the tons of greenhouse gases associated with producing that power in the first place.
This simple tweak to the request-routing algorithm works because power prices can vary tremendously by time of day. For the same reason that owners of electric cars are advised to charge their vehicles overnight, when demand and prices are low, there is ample reason to send a request halfway around the world in search of cheaper electricity.
Of course, the cost of electricity must be balanced against latency, or you’ll end up with requests that take forever to answer and unacceptably long page-load times. The 21% savings figure mentioned above already takes this trade-off into account, which means that, in theory at least, it’s possible to save a great deal of money simply by knowing the local price of electricity.
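One straightforward way to model that balance – purely a sketch, with made-up numbers and no claim to match the researchers’ actual model – is to route to the cheapest data center whose latency stays under a budget, falling back to the nearest one if none qualifies:

```python
def route(data_centers, max_latency_ms=150):
    """Pick a data center balancing price against latency.

    `data_centers` maps a name to a (price_per_mwh, latency_ms) pair.
    Chooses the cheapest center within the latency budget; if none
    qualifies, falls back to the lowest-latency (nearest) center.
    """
    affordable = {
        name: stats
        for name, stats in data_centers.items()
        if stats[1] <= max_latency_ms
    }
    if affordable:
        # Cheapest power among centers that are fast enough.
        return min(affordable, key=lambda n: affordable[n][0])
    # No center meets the budget: fall back to plain nearest-first routing.
    return min(data_centers, key=lambda n: data_centers[n][1])

# Illustrative (price $/MWh, round-trip latency ms) figures.
centers = {
    "virginia": (42.0, 20),   # nearby but pricier
    "oregon":   (31.5, 90),   # farther, cheaper, within budget
    "tokyo":    (25.0, 210),  # cheapest, but too slow
}
print(route(centers))  # -> oregon
```

Loosen the latency budget and the cheapest center (here, "tokyo") wins; tighten it and the routing degrades gracefully to nearest-first.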
While the researchers don’t address this dimension of their work directly, it’s possible to look at this system as a sort of auction, in which data centers bid to take requests from users: the winning bid is the optimal combination of both price and speed. It’s a principle native to every other part of our market-based economy, but not yet a part of how we distribute the workload of the global hive-mind across the ever-expanding array of data-processing nodes on which it relies.