Researchers at Cambridge University want to put data centers in places so remote they aren’t on any power grid. Their models indicate that moving data-hungry computation to places such as scorching deserts, windswept peaks, and the middle of the Atlantic Ocean—all rich in sunlight and wind energy—could allow this otherwise unharvestable energy to do useful work.
In a paper to be delivered at the 13th annual HotOS conference in May, the authors offer an extreme model of how cloud services could incorporate remote data centers powered only by renewable energy. Their scenario sites one solar- and wind-powered data center in the desert of southwest Australia and a second in Egypt, on the other side of the planet. This placement is no accident: putting them in different hemispheres, on opposite sides of the earth, maximizes the solar and wind energy they can harvest.
One catalyst for such a radical rethinking of how data centers can be sited and powered is the increasing availability of advanced fiber-optic networks. Connecting a remote renewable-energy plant to a power grid remains prohibitively expensive, reasoned the researchers working on this project—Sherif Akoush, Ripduman Sohan, Andrew Rice, Andrew W. Moore, and Andy Hopper—but running fiber-optic cable to such a plant would be relatively easy and cheap.
“We envisage data centers being put in places where renewable energy is being produced and you could never economically bring it back to heat a house,” says Andy Hopper, senior author on the paper and head of Cambridge University’s computer science department. “But you could lay a fiber and use energy that is otherwise lost, in that it’s not economically transportable.” One way to think of the underlying principle, he notes, is that it’s easier to move bits (made up of photons) than electrons.
Jonathan Koomey, a researcher and consulting professor at Stanford, cautions that a number of real-world factors could render the Cambridge team’s hypotheticals invalid. While data centers are costly, Koomey explains, the value they create is so far in excess of those costs that anything that reduces their effectiveness would reduce their net benefit to society.
“If the actions you take to save costs would also cut into the number of computations that you can then deliver, you’ll reduce economic benefits from data centers, and that’s presumably not what the authors had in mind,” says Koomey.
Hopper, however, points out that the larger effort of which this paper is a part—the Computing for the Future of the Planet project—takes it as a given that more computing is always good, because the virtualization of goods and services displaces more energy-intensive activities in the physical world. He says that a system like the one he proposes would be implemented only at either “no cost to overall performance [of a cloud computing system] or at an attractive cost to performance.”