Global data centers are unprepared for the climate crisis

Data centers run by major technology companies in the U.S. and Europe went offline due to overheating, cutting off users’ access to key services.

Meteorologists believe it’s time to prepare for a new normal, one that means not only increasing cooling capacity but also rethinking the very approach to the design of data processing centers.

In late July, when record temperatures were registered in Great Britain, Google Cloud’s data centers in London went offline for a day due to cooling failures. The impact was not limited to the area around the data center: because the site also serves customers in the U.S. and the Pacific region, the outage cut off their access to key Google services for several hours as well.

Oracle’s London data center was hit by the heat too, interrupting service for its U.S. clients. The company cited “unseasonable temperature” as the reason for the outages.

The UK Meteorological Office thinks data processing centers should prepare for a new normal.

According to estimates by the World Meteorological Organization, there is a 93% chance that a new annual temperature record will be set by 2026.

“For as long as we continue to emit greenhouse gases, temperatures will continue to rise,” says Petteri Taalas, WMO Secretary-General.

He then adds, “And alongside that, our oceans will continue to become warmer and more acidic, sea ice and glaciers will continue to melt, sea level will continue to rise, and our weather will become more extreme.”

The weather shift will affect all the infrastructure built by people, including data centers, which keep the planet’s collective knowledge online. The question is whether they are prepared for the new conditions.

According to a survey by the Uptime Institute, a digital services standards agency, 45% of U.S.-based data centers have already dealt with extreme weather events that jeopardized their performance.

Sophia Flucker, director of Operational Intelligence, a UK consulting firm, explains that designing a data center’s cooling system may involve analyzing temperature data, usually supplied by weather stations located close to the future site.

However, this approach has a serious flaw: more often than not, the data are historical and cover a period that predates the recent record temperatures.

“It wasn’t that long ago that we were designing cooling systems for a peak outdoor temperature of 32 °C,” says Jon Healy, of the UK data center consultancy Keysource. “They’re over 8 °C higher than they were ever designed for.”
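
To get a feel for the gap Mr. Healy describes, here is a minimal illustrative sketch in Python. The weather-station readings are invented; only the 32 °C design point and the roughly 40 °C peak recorded in the UK in July 2022 correspond to figures mentioned in this article.

```python
# Illustrative only: why a design temperature derived from historical
# weather-station data can fall short of newly observed extremes.
# The readings below are invented; the ~32 C design point and the
# 40.3 C July 2022 UK record reflect figures cited in the article.
import statistics

# Hypothetical peak summer dry-bulb temperatures (C) from a nearby
# station, all recorded *before* the recent record-breaking heat.
historical_temps = [24.1, 26.5, 28.0, 29.3, 30.1,
                    30.8, 31.2, 31.6, 31.9, 32.0]

# A common sizing approach: design cooling for a high percentile
# of the historical record (here the 99th).
design_temp = statistics.quantiles(historical_temps, n=100,
                                   method="inclusive")[98]

observed_peak = 40.3  # C, UK record set in July 2022

print(f"design temperature : {design_temp:.1f} C")
print(f"observed peak      : {observed_peak:.1f} C")
print(f"shortfall          : {observed_peak - design_temp:.1f} C above design")
```

With the percentile landing near 32 °C, the sketch reproduces the roughly 8 °C shortfall quoted above; using forward-looking forecasts instead of a purely historical record would raise the design point.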

Requirements for equipment are becoming increasingly stringent, but developers and their clients alike are still focused on making a profit. Data collected by the Turner & Townsend consultancy show that data center construction costs have risen in almost every market over the past few years, and construction companies are being advised to cut costs.

Mr. Healy further explains that raising the bar from 32 °C to 42 °C would require equipment that meets significantly more stringent requirements. Companies engaged in data center design are now starting to use forecasts instead of historical data.

Ms. Flucker points out that data centers rarely operate at full capacity. Research conducted by Cushman & Wakefield shows that of the 55 data center markets profiled, eight operate at 95% or higher capacity and are currently strained by higher temperatures only several days a year.

Data processing centers that do not operate at 100% capacity handle high external temperatures better, because equipment failures are less likely to affect their performance. That will almost certainly change, however: as the climate emergency pushes ambient temperatures steadily higher, the margin for error narrows.
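
A back-of-the-envelope model makes the headroom argument concrete. All numbers below are invented for illustration: a hypothetical site with four chillers that derate in extreme heat can ride out the loss of one unit at moderate utilization, but not when running near capacity.

```python
# Toy headroom model (all figures invented): can a site keep its IT
# load cool after losing one chiller on a very hot day, when the
# remaining units are derated by high ambient temperature?

def survives_chiller_failure(it_load_kw: float, chillers: int,
                             chiller_kw: float, derate: float) -> bool:
    """True if N-1 chillers, derated for heat, still cover the IT load."""
    remaining_capacity_kw = (chillers - 1) * chiller_kw * derate
    return remaining_capacity_kw >= it_load_kw

TOTAL_IT_CAPACITY_KW = 4 * 500.0  # 4 chillers x 500 kW nameplate (hypothetical)

for utilization in (0.65, 0.95):
    load_kw = TOTAL_IT_CAPACITY_KW * utilization
    ok = survives_chiller_failure(load_kw, chillers=4,
                                  chiller_kw=500.0, derate=0.9)
    verdict = "rides out" if ok else "cannot ride out"
    print(f"{utilization:.0%} utilization: {verdict} a single chiller failure")
```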

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) has developed thermal guidelines for data processing equipment such as servers. For instance, it recommends that supply air inlet temperatures in data centers not exceed 27 °C.
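
In operation, that guideline translates into monitoring server inlet temperatures against the recommended envelope. Below is a minimal sketch with hypothetical rack names and readings; only the 27 °C limit comes from the ASHRAE guideline above.

```python
# Minimal sketch: flag racks whose supply air inlet temperature exceeds
# the ASHRAE recommended limit cited above. Rack names and readings
# are hypothetical.
ASHRAE_RECOMMENDED_MAX_C = 27.0

inlet_readings_c = {
    "rack-a1": 24.5,
    "rack-a2": 26.9,
    "rack-b1": 28.3,  # outside the recommended envelope
}

for rack, temp_c in sorted(inlet_readings_c.items()):
    if temp_c > ASHRAE_RECOMMENDED_MAX_C:
        print(f"ALERT {rack}: inlet {temp_c:.1f} C exceeds "
              f"{ASHRAE_RECOMMENDED_MAX_C:.0f} C recommended limit")
    else:
        print(f"ok    {rack}: inlet {temp_c:.1f} C")
```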

However, as temperatures continue to rise, data centers will need to adjust.

“There are a deceptively large number of legacy data center sites built by banks and financial services companies needing to be refreshed and refitted,” says Simon Harris, Head of Critical Infrastructure at Business Critical Solutions, a consulting firm.

He recommends that companies look into design criteria that would allow them to cope with climate change rather than merely minimize its impact, for instance by using larger chillers and machines with larger condensers.

Companies are already testing some unusual solutions to these problems: from 2018 to 2020, Microsoft ran Project Natick, placing a data center 35.7 meters underwater off the coast of Scotland in order to, among other things, remove the effects of temperature fluctuations.

According to Mr. Harris, building data centers in the northern regions could be one way to avoid the heat, but this comes with its own set of issues.

“Developers will be fighting over an ever-dwindling pool of potential sites,” he says. The problem is that data processing centers have recently been moving closer to the areas where data are consumed, and these are often hotter urban areas.

Most data centers currently rely on air cooling. Liquid cooling could be far more effective, but it is not widely used because of its complexity. It is worth noting that cooling systems are the second-biggest energy consumers inside data centers, after the IT equipment itself.
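
To put that energy share in perspective, here is a toy power usage effectiveness (PUE) calculation. The figures are invented, and the assumption that liquid cooling halves the cooling draw is illustrative, not a claim from this article.

```python
# Toy PUE arithmetic (all numbers invented): because cooling is the
# largest non-IT consumer, trimming its draw directly lowers PUE,
# the ratio of total facility power to IT power.
it_power_kw = 1000.0   # servers, storage, network
cooling_kw = 400.0     # air-cooled baseline (hypothetical)
other_kw = 100.0       # lighting, UPS losses, etc. (hypothetical)

pue_air = (it_power_kw + cooling_kw + other_kw) / it_power_kw

# Assume, for illustration, that liquid cooling halves the cooling draw.
pue_liquid = (it_power_kw + cooling_kw * 0.5 + other_kw) / it_power_kw

print(f"PUE, air-cooled   : {pue_air:.2f}")     # 1.50
print(f"PUE, liquid-cooled: {pue_liquid:.2f}")  # 1.30
```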

“If we can move away from the traditional way of doing things, it’s preventing climate change in the first place,” Ms. Flucker says.


Source: https://rb.ru/story/data-centers-climate-change/

