More Data, Less Power
When it came time for Hewlett-Packard to decide on a location for its new data center, the company could have considered variables like network connectivity, local talent, or proximity to corporate headquarters. Instead, a 100-year weather report convinced HP to build its new 360,000-square-foot facility in breezy Billingham, England.
“You get a lot of cool and moist winds coming over the northeast coast of Britain,” says Ian Brooks, HP’s European head of sustainable computing. By harnessing these winds with massive fans, Brooks says, HP has created a system that uses 40 percent less energy than conventional methods of keeping data centers cool.
HP isn’t the only company taking its cues from nature when it comes to the design and construction of data centers, the clusters of server computers that run Internet services and store and crunch data. These facilities have been the smokestacks of the digital era because they use so much electricity: not only does it take a lot of power to run the machines themselves, but data centers are heavily air conditioned because servers generate a lot of heat and don’t run well in environments much warmer than 25 ºC. As demand for online services skyrockets, the EPA predicts, U.S. data centers could nearly double their 2006 levels of energy consumption by 2011, reaching 100 billion kilowatt-hours per year—enough to power 10 million homes. By 2020, data centers will account for 18 percent of the information and communication technology sector’s carbon footprint, according to the Smart 2020 report released by the Climate Group, a nonprofit organization.
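The EPA projection above implies a simple per-household figure, and it is easy to check that the two numbers in the article are consistent with each other (the typical-household comparison at the end is my own observation, not from the article):

```python
# Sanity check of the EPA projection cited in the article:
# 100 billion kWh per year spread across 10 million homes.
projected_kwh = 100e9   # projected annual U.S. data-center consumption, kWh
homes_powered = 10e6    # number of homes the article says that could power

kwh_per_home = projected_kwh / homes_powered
print(kwh_per_home)     # 10000.0 kWh per household per year
```

That works out to 10,000 kWh per household per year, roughly in line with typical U.S. residential electricity use, so the two figures hang together.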
To reduce the environmental—and financial—burdens, more and more companies are trying innovative designs for data centers. For instance, at the HP center in Britain, known as Wynyard, fans more than two meters in diameter pull the North Sea winds into a mixing chamber, where they cool the warm air given off by the center’s servers. That air is funneled into a large cavity beneath the servers, directed through vents in the floor, and then circulated throughout a series of aisles to chill the computers. The resulting warm exhaust is extracted, mixed with the incoming fresh air, and recirculated.
By eliminating the need for energy-intensive cooling equipment, the Wynyard facility cuts annual carbon dioxide emissions by 12,500 metric tons compared with an industry-standard data center. That is the equivalent of taking nearly 3,000 midsize vehicles off the road.
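The car equivalence can be back-calculated from the article's own figures; the implied per-vehicle emissions rate (my inference, not stated in the article) lands close to standard estimates for an average passenger car:

```python
# Back-calculate the per-vehicle CO2 rate implied by the article:
# 12,500 metric tons saved is said to equal ~3,000 midsize cars.
co2_saved_tons = 12_500   # annual CO2 savings at Wynyard, metric tons
cars_equivalent = 3_000   # midsize vehicles cited in the article

tons_per_car = co2_saved_tons / cars_equivalent
print(round(tons_per_car, 2))   # 4.17 metric tons of CO2 per car per year
```

Roughly 4.2 metric tons of CO2 per car per year, which is in the ballpark of common estimates for an average passenger vehicle.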
Another innovative data center is one that Yahoo opened in September 2010 in Lockport, New York. In this case, the inspiration came from chicken coops rather than coastal winds. “Chickens throw off a fair bit of heat; servers throw off a fair bit of heat,” says Christina Page, Yahoo’s director of climate and energy strategy. “So we built a long, tall, narrow building with a coop along the top to vent the air.”
The 155,000-square-foot facility mimics the narrow design of a chicken coop and features louvers along the sides of the building so that prevailing winds can flow freely throughout the halls. On particularly hot days, the center can activate an evaporative cooling system, which uses less energy than traditional chillers. Overall, the facility uses at least 95 percent less water than a conventional data center, and 40 percent less energy—enough to power more than 9,000 households annually. What’s more, with its preconstructed metal components, the chicken-coop structure can be assembled in less than six months.
“There’s a good case to be made for the return on investment on a lot of green practices,” says Page. “This data center was cheaper and faster to build, in addition to being more efficient on the operating-expenditure side.”
The information-management company Iron Mountain, meanwhile, is taking advantage of natural geothermal conditions to slash energy consumption by locating a data center in a former limestone mine, 22 stories below ground. Iron Mountain’s storage facility in Butler County, Pennsylvania, houses Room 48, whose racks of servers rely on the natural cooling properties of the limestone walls to remain at 13 ºC. Iron Mountain also developed a high-static-pressure differential cooling system that pairs high-velocity supply ducts in the cold aisles separating rows of servers with linear return ducts in its hot aisles. The pressure differential drives air currents that let cold air sink across the racks while hot air rises and exits the room through perforated ceiling tiles. The absence of air conditioners not only freed up about 30 percent more space in Room 48 but cut energy consumption for cooling by 10 to 15 percent compared with traditional data centers.
These are the kinds of unheralded changes that can really make a difference, says Mark Lafferty, director of strategic solutions at technology services provider CDW. “The really basic, non-glamorous, non-sexy stuff companies do can have a dramatic effect on the amount of resource consumption in a data center,” he says.