
More Data, Less Power

Big computing providers are developing energy-saving strategies for new server farms.
January 13, 2011

When it came time for Hewlett-Packard to decide on a location for its new data center, the company could have considered variables like network connectivity, local talent, or proximity to corporate headquarters. Instead, a 100-year weather report convinced HP to build its new 360,000-square-foot facility in breezy Billingham, England.

Server farm: Yahoo’s data center in Lockport, New York, was inspired by a chicken coop and lets air naturally vent through the top.

“You get a lot of cool and moist winds coming over the northeast coast of Britain,” says Ian Brooks, HP’s European head of sustainable computing. By harnessing these winds with massive fans, Brooks says, HP has created a system that uses 40 percent less energy than conventional methods of keeping data centers cool.

HP isn’t the only company taking its cues from nature when it comes to the design and construction of data centers, clusters of server computers that run Internet services and store and crunch data. These facilities have become the smokestacks of the digital era because they use so much electricity: not only does it take a lot of power to run the machines themselves, but data centers are heavily air-conditioned because servers generate a lot of heat and don’t run well in environments much warmer than 25 °C. As demand for online services skyrockets, the EPA predicts, U.S. data centers could nearly double their 2006 levels of energy consumption by 2011, reaching 100 billion kilowatt-hours per year—enough to power 10 million homes. By 2020, data centers will account for 18 percent of the carbon emissions of the information and communications technology sector, according to the Smart 2020 report released by the Climate Group, a nonprofit organization.
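The "10 million homes" comparison is easy to sanity-check. A minimal sketch (the household-usage figure in the comment is a widely cited U.S. average, not from the article):

```python
# Sanity-check the EPA projection cited above: 100 billion kWh per year
# spread across 10 million homes.
projected_kwh_per_year = 100e9   # projected U.S. data-center total for 2011
homes_powered = 10e6             # homes the article says that could power

kwh_per_home = projected_kwh_per_year / homes_powered
print(f"Implied usage: {kwh_per_home:,.0f} kWh per home per year")
# A typical U.S. household uses roughly 10,000-11,000 kWh per year,
# so the comparison is internally consistent.
```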

To reduce the environmental—and financial—burdens, more and more companies are trying innovative designs for data centers. For instance, at the HP center in Britain, known as Wynyard, fans more than two meters in diameter pull the North Sea winds into a mixing chamber, where they cool the warm air given off by the center’s servers. That air is funneled into a large cavity beneath the servers, directed through vents in the floor, and then circulated throughout a series of aisles to chill the computers. The resulting warm exhaust is extracted, mixed with the incoming fresh air, and recirculated.

By eliminating the need for energy-intensive cooling equipment, the Wynyard facility emits 12,500 fewer metric tons of carbon dioxide than an industry-standard data center. That is the equivalent of taking nearly 3,000 midsize vehicles off the road.
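The vehicle comparison also holds up arithmetically. A small sketch, assuming the common rule of thumb that a passenger car emits roughly 4 to 5 metric tons of CO2 per year (that benchmark is an assumption, not from the article):

```python
# Check the quoted CO2 savings against the "3,000 midsize vehicles" figure.
co2_saved_tons = 12_500       # metric tons of CO2 avoided at Wynyard
vehicles_equivalent = 3_000   # equivalence quoted in the article

tons_per_vehicle = co2_saved_tons / vehicles_equivalent
print(f"Implied emissions: {tons_per_vehicle:.2f} t CO2 per vehicle per year")
# About 4.2 t/year per vehicle, in line with typical passenger-car emissions.
```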

Another innovative data center is one that Yahoo opened in September 2010 in Lockport, New York. In this case, the inspiration came from chicken coops rather than coastal winds. “Chickens throw off a fair bit of heat; servers throw off a fair bit of heat,” says Christina Page, Yahoo’s director of climate and energy strategy. “So we built a long, tall, narrow building with a coop along the top to vent the air.”

Drawn in: At Hewlett-Packard’s data center in Billingham, England, large fans pull in fresh air.

The 155,000-square-foot facility mimics the narrow design of a chicken coop and features louvers along the sides of the building so that prevailing winds can flow freely throughout the halls. On particularly hot days, the center can activate an evaporative cooling system, which uses less energy than traditional chillers. That means the facility uses at least 95 percent less water than a conventional data center, and 40 percent less energy—enough to power more than 9,000 households annually. What’s more, with its preconstructed metal components, the chicken-coop structure can be assembled in less than six months.

“There’s a good case to be made for the return on investment on a lot of green practices,” says Page. “This data center was cheaper and faster to build, in addition to being more efficient on the operating-expenditure side.”

The information-management company Iron Mountain, meanwhile, is taking advantage of natural geothermal conditions to slash energy consumption by locating a data center in a former limestone mine, 22 stories below ground. Iron Mountain’s storage facility in Butler County, Pennsylvania, houses Room 48, whose racks of servers rely on the natural cooling properties of the limestone walls to remain at 13 °C. Iron Mountain also developed a high-static air pressure differential cooling system that relies on high-velocity ducts, located in the cold aisles separating rows of servers, and linear return ducts in its hot aisles. The system creates winds that naturally cause cold air to sink and hot air to rise and exit the room through perforated ceiling tiles. The absence of air conditioners not only freed up about 30 percent more space in Room 48 but cut energy consumption for cooling by 10 to 15 percent compared with traditional data centers.

These are the kinds of unheralded changes that can really make a difference, says Mark Lafferty, director of strategic solutions at technology services provider CDW. “The really basic, non-glamorous, non-sexy stuff companies do can have a dramatic effect on the amount of resource consumption in a data center,” he says.
