Intelligent Machines

Computer Clusters That Heat Houses

A novel water-cooling system makes it more efficient for computers to heat buildings.

Supercomputers have been used to model climate change, forecast economic trends, and simulate the intricate complexities of protein folding. Now IBM has something new in store for high-performance computers: heating buildings.

Keeping cool: A novel on-chip water-cooling system developed by IBM could make it efficient for data centers to provide waste heat for buildings.

Thanks to a novel on-chip water-cooling system developed by the company, the thermal energy from a cluster of computer processors can be efficiently recycled to provide hot water for an office, says Bruno Michel, manager of advanced thermal packaging at IBM’s Zurich Research Laboratory, in Switzerland. The goal, he says, is to improve the energy efficiency of large computing clusters and reduce their environmental impact.

A pilot scheme involving a computer system fitted with the technology is expected to save up to 30 tons of carbon dioxide emissions per year, the equivalent of an 85 percent reduction in carbon footprint. Inside the computer cluster, a heat sink containing a network of microfluidic capillaries is attached to the surface of each chip, allowing water to be piped to within microns of the semiconductor material itself. Despite this close proximity to the circuitry, there is no danger of leakage, says Michel, because the capillaries are hermetically sealed. Because the water flows so close to each chip, heat is removed more efficiently. Water heated to 60 °C is then passed through a heat exchanger, and the recovered heat is delivered elsewhere.
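The amount of heat such a water loop can deliver follows from a standard thermodynamic relation, Q = ṁ·c_p·ΔT. The sketch below illustrates the arithmetic; the flow rate and temperatures are illustrative assumptions, not IBM's published specifications.

```python
# Back-of-envelope estimate of the thermal power a water loop carries
# from the chips to a heat exchanger: Q = m_dot * c_p * dT.
# Flow rate and temperatures below are illustrative assumptions.
C_P_WATER = 4186.0  # J/(kg*K), specific heat of liquid water

def recoverable_heat_kw(flow_kg_per_s, inlet_c, outlet_c):
    """Thermal power (kW) carried by water heated from inlet to outlet."""
    return flow_kg_per_s * C_P_WATER * (outlet_c - inlet_c) / 1000.0

# Example: a 0.5 kg/s loop heated from 50 C to 60 C carries about 21 kW,
# which a heat exchanger can then transfer into a building's hot-water system.
print(recoverable_heat_kw(0.5, 50.0, 60.0))
```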

IBM has spent several years developing the microfluidic cooling technology, and it plans to test it in partnership with the Swiss Federal Institute of Technology, in Zurich. A 10-teraflop computer cluster consisting of two IBM BladeCenter servers in a single rack will be used by the university’s Computational Science and Engineering Lab to model fluid dynamics for nanotechnology research. The heated water will then be plumbed into the university’s heating system, where it will help heat 60 buildings. “This is the first large-scale system,” says Michel. “It’s about one-twentieth of the size of an average data center.” Ultimately, he says, the technology could help address the energy problems posed by large data centers.

Up to 50 percent of the energy consumed by a modern data center goes toward air cooling. Most of that heat is then wasted, simply dumped into the atmosphere. There have been a few efforts to recycle the heat generated by conventional data centers. For example, a nine-story, 18,500-square-meter data center being built in London by the hosting company Telehouse Europe will provide heating for nearby offices. Other companies, including IBM, have used excess thermal energy to heat greenhouses or swimming pools. But reusing waste heat is expensive because usually only relatively low temperatures can be harvested, says Frank Brand, director of operations at the Dutch data-center engineering firm Imtech. “You can only get about 30 to 35 degrees Celsius,” he says.

In contrast, because water is many times more efficient at capturing heat than air, water cooling can deliver much higher temperatures, says Michel. Water was once commonly used to cool mainframe computers, but this merely consisted of piping cold water through server cabinets to cool the air near the racks.
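The "many times more efficient" claim can be made concrete with textbook material properties: per unit volume and per degree of temperature rise, water absorbs on the order of a few thousand times more heat than air. The figures below are standard room-temperature values, not measurements from IBM's system.

```python
# Why water cooling yields higher usable coolant temperatures: per unit
# volume, water absorbs vastly more heat than air for the same
# temperature rise. Property values are standard textbook figures
# (roughly 20 C at atmospheric pressure).
RHO_WATER, CP_WATER = 1000.0, 4186.0  # density kg/m^3, specific heat J/(kg*K)
RHO_AIR, CP_AIR = 1.2, 1005.0         # density kg/m^3, specific heat J/(kg*K)

vol_heat_water = RHO_WATER * CP_WATER  # volumetric heat capacity, J/(m^3*K)
vol_heat_air = RHO_AIR * CP_AIR

# Water's volumetric heat capacity is roughly 3,500x that of air, so a
# water loop can run hot (e.g. 60 C) at modest flow rates where air
# cooling must move huge volumes and stays at low, hard-to-reuse temperatures.
print(round(vol_heat_water / vol_heat_air))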

By some estimates, information technology infrastructure is responsible for as much as 2 percent of global carbon emissions, putting it on a par with aviation. And some experts say that this figure is set to double in the next five years.

“It’s more efficient to heat water and move it somewhere else than it is with air,” says Jonathan Koomey, a project scientist at Lawrence Berkeley National Laboratory and a consulting professor at Stanford University. In 2005, data centers were responsible for 1 percent of global electricity consumption, double their share in 2000, Koomey says. But he’s not convinced that the figure will continue to grow. “There are many ways to improve the efficiency of data centers,” he says. For example, better management of computer centers can improve efficiencies dramatically. “We have servers that on average are running at 5 to 15 percent of their maximum load,” Koomey says. “Even if the server is doing nothing, it’s still using 60 to 70 percent of its power.”
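Koomey's utilization figures can be sketched numerically. The linear power model below is a common simplification used to reason about "energy proportionality"; the idle fraction is taken from the 60-to-70-percent range he cites, and the curve is illustrative rather than measured.

```python
# A simple linear model of server power draw versus utilization,
# illustrating Koomey's point: a server that draws ~65% of peak power
# while idle wastes most of its energy at the 5-15% loads he describes.
# The linear model and the 0.65 idle fraction are assumptions for
# illustration, not measured data.
def power_fraction(utilization, idle_fraction=0.65):
    """Fraction of peak power drawn at a given utilization (0..1)."""
    return idle_fraction + (1.0 - idle_fraction) * utilization

# At 10% load the server still draws roughly 69% of its peak power, so
# useful work per watt is a small fraction of what it would be at full load.
print(power_fraction(0.10))
```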

Brand also notes that “air is a much cheaper way to do the cooling” and that modern data centers use far less energy for cooling than their older counterparts did.

The trend toward stacking processors on top of each other to increase their power density is another reason why IBM is pursuing this sort of microfluidic water cooling, says Michel. Such three-dimensional chips will pose serious problems for traditional air-based cooling systems, he says.
