
Computer Clusters That Heat Houses

A novel water-cooling system makes it more efficient for computers to heat buildings.

High-performance computers have been used to model climate change, forecast economic trends, and simulate the intricate complexities of protein folding. Now IBM has something new in store for them: heating buildings.

Keeping cool: A novel on-chip water-cooling system developed by IBM could make it efficient for data centers to provide waste heat for buildings.

Thanks to a novel on-chip water-cooling system developed by the company, the thermal energy from a cluster of computer processors can be efficiently recycled to provide hot water for an office, says Bruno Michel, manager of advanced thermal packaging at IBM’s Zurich Research Laboratory, in Switzerland. The goal, he says, is to improve the energy efficiency of large computing clusters and reduce their environmental impact.

A pilot scheme involving a computer system fitted with the technology is expected to save up to 30 tons of carbon dioxide emissions per year, the equivalent of an 85 percent reduction in its carbon footprint. A network of microfluidic capillaries inside a heat sink is attached to the surface of each chip in the computer cluster, allowing water to be piped to within microns of the semiconductor material itself. Despite the water's proximity to the circuitry, there is no danger of leakage, says Michel, because the capillaries are hermetically sealed. Because the water flows so close to each chip, heat can be removed more efficiently. Water heated to 60 °C is then passed through a heat exchanger, and the recovered heat is delivered elsewhere.
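The heat-exchanger step comes down to a simple energy balance. As a rough sketch (the article gives only the 60 °C outlet temperature; the 10-kilowatt cluster power and 50 °C inlet temperature used here are illustrative assumptions), the water flow needed scales with the heat to be carried away and inversely with the temperature rise:

```python
# Back-of-the-envelope energy balance for a water-cooling loop.
# Assumed figures: the article does not state the cluster's power
# draw or its coolant flow rate.
CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def required_flow_kg_per_s(heat_w: float, t_in_c: float, t_out_c: float) -> float:
    """Mass flow needed to carry `heat_w` watts with the given temperature rise."""
    return heat_w / (CP_WATER * (t_out_c - t_in_c))

# Hypothetical 10 kW cluster, water warmed from an assumed 50 C inlet
# to the 60 C outlet cited in the article.
flow = required_flow_kg_per_s(10_000.0, 50.0, 60.0)
print(f"{flow:.3f} kg/s")  # ~0.239 kg/s, about 14 liters per minute
```

The same relation explains why a hotter outlet is valuable: the more heat each kilogram of water carries, the less pumping is needed and the more useful the water is for downstream heating.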

IBM has spent several years developing the microfluidic cooling technology, and it plans to test it in partnership with the Swiss Federal Institute of Technology, in Zurich. A 10-teraflop computer cluster consisting of two IBM BladeCenter servers in a single rack will be used by the university’s Computational Science and Engineering Lab to model fluid dynamics for nanotechnology research. The heated water will then be plumbed into the university’s heating system, where it will help heat 60 buildings. “This is the first large-scale system,” says Michel. “It’s about one-twentieth of the size of an average data center.” Ultimately, he says, the technology could help address the energy problems posed by large data centers.

Up to 50 percent of the energy consumed by a modern data center goes toward air cooling, and most of that heat is wasted because it is simply dumped into the atmosphere. There have been a few efforts to recycle the heat generated by conventional data centers. For example, a nine-story, 18,500-square-meter data center being built in London by the hosting company Telehouse Europe will provide heating for nearby offices. Other companies, including IBM, have used excess thermal energy to heat greenhouses or swimming pools. But reusing waste heat is expensive because usually only relatively low temperatures can be harvested, says Frank Brand, director of operations at the Dutch data-center engineering firm Imtech. “You can only get about 30 to 35 degrees Celsius,” he says.

In contrast, because water is many times more efficient at capturing heat than air, water cooling can deliver much higher temperatures, says Michel. Water was once commonly used to cool mainframe computers, but this merely consisted of piping cold water through server cabinets to cool the air near the racks.
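The “many times more efficient” claim can be made concrete with volumetric heat capacity, the heat a coolant absorbs per unit volume per degree. The figures below are standard textbook values, not from the article:

```python
# Why water beats air as a coolant: volumetric heat capacity.
# Density and specific-heat values are standard physical constants
# (room-temperature water, sea-level dry air), not from the article.
def volumetric_heat_capacity(density_kg_m3: float, cp_j_kg_k: float) -> float:
    """Heat absorbed per cubic meter per kelvin, in J/(m^3*K)."""
    return density_kg_m3 * cp_j_kg_k

water = volumetric_heat_capacity(1000.0, 4186.0)  # liquid water
air = volumetric_heat_capacity(1.2, 1005.0)       # dry air
print(f"water/air ratio: {water / air:.0f}")  # roughly 3,500x per unit volume
```

Per unit volume, water soaks up heat thousands of times better than air, which is why a thin water loop can pull heat off a chip at temperatures high enough to be worth reusing.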

By some estimates, information technology infrastructure is responsible for as much as 2 percent of global carbon emissions, putting it on a par with aviation. And some experts say that this figure is set to double in the next five years.

“It’s more efficient to heat water and move it somewhere else than it is with air,” says Jonathan Koomey, a project scientist at Lawrence Berkeley National Laboratory and a consulting professor at Stanford University. In 2005, data centers were responsible for 1 percent of global electricity consumption, double the 2000 level, Koomey says. But he’s not convinced that the figure will continue to grow. “There are many ways to improve the efficiency of data centers,” he says. For example, better management of computer centers can improve efficiencies dramatically. “We have servers that on average are running at 5 to 15 percent of their maximum load,” Koomey says. “Even if the server is doing nothing, it’s still using 60 to 70 percent of its power.”
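Koomey’s utilization figures imply a large energy-proportionality gap. A minimal sketch, assuming a server’s power draw scales linearly from the 60-to-70-percent idle figure he cites up to full load (the linear model is an illustrative assumption, not Koomey’s):

```python
# Simple linear power model for a server: idle draw plus a load-
# proportional term. The ~65% idle fraction reflects the 60-70%
# figure quoted in the article; the linearity is an assumption.
def power_fraction(utilization: float, idle_fraction: float = 0.65) -> float:
    """Fraction of peak power drawn at a given utilization (0.0 to 1.0)."""
    return idle_fraction + utilization * (1.0 - idle_fraction)

for util in (0.05, 0.10, 0.15):
    print(f"{util:.0%} load -> {power_fraction(util):.1%} of peak power")
```

Under this model a server at 10 percent load still draws roughly two-thirds of its peak power, which is why consolidating work onto fewer, busier machines saves so much energy.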

Brand also notes that “air is a much cheaper way to do the cooling” and that modern data centers consume far less energy than do their older counterparts for cooling.

The trend toward stacking processors on top of each other to increase their power density is another reason why IBM is pursuing this sort of microfluidic water cooling, says Michel. Such three-dimensional chips will pose serious problems for traditional air-based cooling systems, he says.
