Search to Be Processed in the Cheapest Time Zone
Unlike water and most commodities, the price we pay for electricity can vary with the time of day. Starting with this basic premise, researchers at Simon Fraser University in Burnaby, Canada hit on a novel idea: why not cut the millions of dollars spent on electricity for data centers each year by sending requests – web searches and the like – to whichever data center is currently paying the least for electricity?
Typically, requests sent from a web browser – for a website, a piece of media, search results – go to the nearest data center, so as to minimize latency, or the time between when a request is made and the result loads on a user’s device. But Ananth Narayan S. and colleagues created a model of data center performance in which the destination of requests is informed by an entirely different metric: the price of power in the local electricity market inhabited by a data center.
Routing requests in this way had a non-trivial impact on the cost of electricity for all the data centers owned by a hypothetical service provider: in the low case, it cut electricity costs by 21% compared with simply sending requests to the nearest data center. For a Google or an Amazon, that could amount to many millions of dollars over the course of a year, not to mention the tons of greenhouse gases associated with producing that power in the first place.
This simple tweak to the request-routing algorithm works because power prices can vary tremendously with the time of day. For the same reason that owners of electric cars are advised to charge their vehicles overnight, there is ample reason to send a request halfway around the world in search of cheaper electricity.
Of course, the cost of electricity must be balanced against latency, or requests will take too long to answer and page-load times will become unacceptable. The 21% savings figure mentioned above already takes this trade-off into account, which means that, in theory at least, it's possible to save a great deal of money simply by knowing the local price of electricity.
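The price-versus-latency trade-off can be sketched as a simple selection rule. This is a hypothetical illustration, not the researchers' actual model: the data-center names, latency cap, and prices below are all made up for the example.

```python
# Illustrative sketch of price-aware request routing.
# All names and numbers are hypothetical, not from the study.
from dataclasses import dataclass

@dataclass
class DataCenter:
    name: str
    latency_ms: float      # round-trip latency from the user
    price_per_kwh: float   # current local electricity price (USD)

def route_request(centers, max_latency_ms=250.0):
    """Pick the cheapest data center whose latency is acceptable.

    Falls back to the nearest center if none meets the latency cap,
    so responsiveness is never sacrificed entirely for price.
    """
    eligible = [c for c in centers if c.latency_ms <= max_latency_ms]
    if eligible:
        return min(eligible, key=lambda c: c.price_per_kwh)
    return min(centers, key=lambda c: c.latency_ms)

centers = [
    DataCenter("us-east", latency_ms=40, price_per_kwh=0.12),
    DataCenter("eu-west", latency_ms=120, price_per_kwh=0.07),   # off-peak
    DataCenter("ap-south", latency_ms=310, price_per_kwh=0.05),  # too far
]
print(route_request(centers).name)  # eu-west: cheapest within the cap
```

In this toy version the latency cap is a hard constraint; a real scheduler would more likely optimize a weighted combination of price and latency, and update prices continuously as electricity markets move.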
While the researchers don’t address this dimension of their work directly, it’s possible to look at this system as a sort of auction, in which data centers bid to take requests from users: the winning bid is the optimal combination of both price and speed. It’s a principle native to every other part of our market-based economy, but not yet a part of how we distribute the workload of the global hive-mind across the ever-expanding array of data-processing nodes on which it relies.