
4 August 2022

Data Centers Are Facing a Climate Crisis


WHEN RECORD TEMPERATURES racked the UK in late July, Google Cloud’s data centers in London went offline for a day due to cooling failures. The impact wasn’t limited to those near the center: That particular location services customers in the US and Pacific region, with outages limiting their access to key Google services for hours. Oracle’s cloud-based data center in the capital was also struck down by the heat, causing outages for US customers. Oracle blamed “unseasonal temperatures” for the blackout.

The Met Office, the UK’s national weather service, suggests that the record heat was a harbinger of things to come, which means data centers need to prepare for a new normal.

The World Meteorological Organization (WMO) says there’s a 93 percent chance that one year between now and 2026 will be the hottest on record. Nor will that be a one-off. “For as long as we continue to emit greenhouse gases, temperatures will continue to rise,” says Petteri Taalas, WMO secretary general. “And alongside that, our oceans will continue to become warmer and more acidic, sea ice and glaciers will continue to melt, sea level will continue to rise, and our weather will become more extreme.”

That weather shift will have an impact on all human-made infrastructure—including the data centers that keep our planet’s collective knowledge online.

The question is whether they are prepared. “From my point of view, there is an issue with existing data center stock that’s been built in the UK and Europe,” says Simon Harris, head of critical infrastructure at data center consultancy Business Critical Solutions. But it doesn’t stop there. Forty-five percent of US data centers have experienced an extreme weather event that threatened their ability to operate, according to a survey by the Uptime Institute, a digital services standards agency.

Data center cooling systems are designed through a complicated, multi-stage process, says Sophia Flucker, director at UK data center consulting firm Operational Intelligence. This may include analyzing temperature data from a weather station close to the site where the data center will be built.
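As a rough illustration of how such an analysis can work (a sketch, not any particular firm’s actual method), a design temperature is often picked as a high percentile of hourly station readings; ASHRAE’s 0.4 percent annual design figure, for instance, is the temperature exceeded for only about 35 of the year’s 8,760 hours:

```python
# A minimal sketch of the percentile approach to choosing a design
# temperature from historical weather-station data. The 0.4 percent
# exceedance convention is ASHRAE's; the data loader named below is
# hypothetical.

def design_dry_bulb(hourly_temps_c: list[float], exceedance: float = 0.004) -> float:
    """Return the temperature exceeded for `exceedance` of all hours."""
    ranked = sorted(hourly_temps_c)
    index = int(len(ranked) * (1 - exceedance))
    return ranked[min(index, len(ranked) - 1)]

# e.g. a decade of hourly readings from a station near the proposed site:
# temps = load_station_hours("nearby_station.csv")   # hypothetical helper
# print(design_dry_bulb(temps))                      # 99.6th-percentile temp
```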

The problem? That data is historical and represents a time when temperatures in the UK didn’t hit 40 degrees Celsius. “We’re on the fringes of a changing climate,” says Harris.


“It wasn’t that long ago that we were designing cooling systems for a peak outdoor temperature of 32 degrees,” says Jon Healy, of the UK data center consultancy Keysource. “They’re over 8 degrees higher than they were ever designed for.” Design conditions are being raised accordingly—but data center companies, and the clients they’re working for, operate as profit-driven enterprises. Data from consultancy Turner & Townsend suggests that the cost of building data centers has risen in almost every market in recent years, and construction teams are under pressure to keep costs down.

“If we went from 32 degrees to 42 degrees, blimey,” says Healy. “You’re having to make everything significantly larger to support that very small percentage of the year” when temperatures rise. “It’s got to be done with caution.”

Data center design companies are starting to consider the historical weather information outmoded and beginning to use projected future temperatures, says Flucker. “Rather than thinking my extreme is 35 degrees, they’re doing projections saying maybe it’s more like 37 or 38 degrees,” she says. “But of course, that’s only as good as how well we can predict the future.”
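One simple way to fold such projections in (illustrative only; the 2-degree uplift below is an assumed figure, not one taken from any published forecast) is to treat expected warming over the facility’s design life as an offset on the historical extreme:

```python
# Illustrative only: shift a historically derived design extreme by an
# assumed warming delta over the facility's design life.

def projected_design_temp(historical_extreme_c: float,
                          warming_delta_c: float = 2.0) -> float:
    """Historical extreme plus an assumed projected-warming offset."""
    return historical_extreme_c + warming_delta_c

print(projected_design_temp(35.0))  # a 35-degree extreme becomes a 37-degree design point
```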

Flucker points out that data centers rarely operate at full capacity—although Cushman & Wakefield research shows that eight of the 55 data center markets it investigated worldwide run at 95 percent capacity or higher—and at present, they’re only strained at the highest temperatures for a small number of days a year. A data center that isn’t running at 100 percent capacity can cope better with high external temperatures, because an equipment failure is less likely to have an all-or-nothing impact on performance. But that will almost certainly change as the climate emergency permanently alters ambient temperatures and the margin for error narrows.

The American Society of Heating, Refrigerating and Air Conditioning Engineers (ASHRAE) has developed operating temperature guidelines for data processing equipment, such as the servers integral to data centers. The limits suggest air pumped through data centers be supplied at no more than 27 degrees Celsius—though there are allowable ranges beyond that. “The world doesn’t end,” says Flucker. “All this equipment is warrantied up to 32 degrees Celsius.” But with temperatures continuing to rise, data centers need to make changes.
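In code, the check against those two thresholds is trivial. This sketch uses the 27-degree recommended ceiling and the 32-degree warranty limit cited above, which correspond to ASHRAE’s recommended envelope and its A1-class allowable maximum:

```python
# Classify a supply-air reading against the two thresholds the article
# cites: ASHRAE's 27 C recommended ceiling and the 32 C limit Flucker
# says the equipment is warrantied to (ASHRAE's A1 allowable maximum).

RECOMMENDED_MAX_C = 27.0
ALLOWABLE_MAX_C = 32.0

def classify_supply_air(temp_c: float) -> str:
    if temp_c <= RECOMMENDED_MAX_C:
        return "within the recommended envelope"
    if temp_c <= ALLOWABLE_MAX_C:
        return "allowable, but outside the recommended envelope"
    return "outside the warrantied range"

for reading in (24.0, 29.5, 33.0):
    print(f"{reading} C: {classify_supply_air(reading)}")
```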

“There are a deceptively large number of legacy data center sites built by banks and financial services companies needing to be refreshed and refitted,” says Harris. As part of that rethink, Harris advises companies to look at design criteria that can cope with climate change, rather than solely minimizing its effects. “It’ll be bigger chiller machines, machines with bigger condensers, and looking more at machines that use evaporative cooling to achieve the performance criteria needed to ensure that for those days things are still in a good place,” he says.

Companies are testing some unusual ways to tackle these challenges: Between 2018 and 2020 Microsoft ran Project Natick, which sank a data center 117 feet below the sea off the coast of Scotland to insulate it from temperature fluctuations, among other things. Harris says that building data centers in ever more northern climates could be one way to avoid the heat—by trying to outrun it—but this comes with its own problems. “Developers will be fighting over an ever-dwindling pool of potential sites,” he says, a challenge when edge computing puts data centers ever closer to the point at which data is consumed, often in hotter, urban areas.

Liquid cooling technology offers a more practical solution. Data centers are currently in an era of air-based cooling, but liquid cooling—where a liquid is passed close to the equipment, absorbing its heat and syphoning it away—could be a better way to keep temperatures down. However, it isn’t widely used because it requires combined knowledge of cooling and IT equipment. “At the moment, these are two very separate worlds,” says Flucker. “There’s definitely some apprehension about making such a big change in how we do things.”
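The physics behind the appeal is straightforward: water carries vastly more heat per unit volume than air. A back-of-the-envelope comparison, using textbook fluid properties and an assumed 100-kilowatt IT load, shows the gap:

```python
# Back-of-the-envelope comparison of air and water as coolants, using
# textbook properties; the 100 kW load and 10 K temperature rise are
# assumed for illustration. Heat removed: Q = density * flow * c_p * dT.

def flow_needed_m3_per_s(load_w: float, density_kg_m3: float,
                         specific_heat_j_kg_k: float, delta_t_k: float) -> float:
    """Volumetric coolant flow needed to carry away `load_w` watts."""
    return load_w / (density_kg_m3 * specific_heat_j_kg_k * delta_t_k)

LOAD_W, DELTA_T = 100_000, 10.0  # 100 kW of IT load, 10 K coolant temperature rise
air = flow_needed_m3_per_s(LOAD_W, 1.2, 1005.0, DELTA_T)      # ~8.3 m3/s
water = flow_needed_m3_per_s(LOAD_W, 997.0, 4180.0, DELTA_T)  # ~0.0024 m3/s

print(f"air:   {air:.2f} m3/s")
print(f"water: {water:.4f} m3/s  ({air / water:.0f}x less volume)")
```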

But it may well be necessary—not least because it sets up a virtuous circle. Outside of the IT equipment itself, the next-biggest consumer of energy in data centers is the equipment used to keep it cool. “If we can move away from the traditional way of doing things,” says Flucker, “it’s preventing climate change in the first place.”
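That overhead is what the industry’s standard metric, power usage effectiveness (PUE), captures: total facility energy divided by the energy that actually reaches the IT equipment, with 1.0 as the unattainable ideal and cooling usually the largest contributor above it. A minimal sketch, with made-up figures:

```python
# Power usage effectiveness (PUE): total facility energy divided by the
# energy delivered to IT equipment. 1.0 is the ideal; the kWh figures
# below are invented for illustration.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# hypothetical month: 1,500 MWh drawn in total, 1,000 MWh reaching servers
print(f"PUE = {pue(1_500_000, 1_000_000):.2f}")  # -> PUE = 1.50
```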
