ORLANDO, Fla. - In most cases, data center operators don’t have 2020’s American Innovation & Manufacturing (AIM) Act at the forefront of their operational concerns. In fact, very little of the AIM Act concerns their business processes directly. But one piece of it gave the EPA the authority to address the environmental impact of hydrofluorocarbons (HFCs), and that may have implications for your data center.
Since HFCs are among the most potent greenhouse gases in common use, a series of regulations is now in process that will limit the ability of users and manufacturers to continue using HFC-based refrigerants.
Refrigerants using HFCs are used in some data center equipment, including chillers, computer room air conditioning (CRAC) units and some in-row cooling systems. Many large data center operators have worked to reduce their reliance on chillers and switched to free cooling strategies that use fresh air and air-handling units. But refrigerants are still commonly used in enterprise data centers, as well as to support hyperscale and service provider data centers in warmer climates where free cooling is not practical.
One solution may be data center cooling systems using CO2, which has been widely used in industrial cooling systems but is less common in data centers. A session at this week's 7x24 Exchange Spring Conference in Orlando featured a discussion of the timelines for HFC products and the potential for CO2 in data centers.
The Path Forward for HFC Refrigerants
According to a presentation at the 7x24 Exchange Spring Conference by Michael May, President & CTO of Effecterra, the phasedown in the availability of HFC refrigerants starts at the end of 2023, based on the global warming potential (GWP) of the HFC. By January 1, 2025, most GWP limits will go into effect, with certain common refrigerants, such as R410A and R134a, explicitly banned for use in industrial process cooling. This will have a direct impact on the repair and replacement of existing refrigerant systems and will limit or halt the production of new equipment that uses HFCs as the refrigerant.
To further complicate the issue, a regulatory decision has not yet been made about whether to classify data center cooling as comfort cooling or industrial process cooling, two categories which will see different regulations as well as different implementation dates.
The first thought for most is to replace HFCs with HFOs, or hydrofluoroolefins, which are made of hydrogen, fluorine, and carbon and, unlike HFCs, have zero ozone depletion potential. But HFOs are potentially toxic and are on the verge of being banned in Europe and in some US states.
So what’s a data center operator to do? Regardless of the hype surrounding the availability of liquid cooling solutions, many data centers will continue to work with HFC-based equipment for the foreseeable future. Investing in refrigeration for new projects may put the operators of these data centers in a bind as these regulations come into effect.
CO2 Cooling Offers Efficiency, Sustainability
In his presentation at the 2023 7x24 Exchange conference, Jacob Wolfe of M&M Carnot advocated the use of a natural refrigerant, CO2, as the solution to this issue. While ammonia is also a contender among natural refrigerants, CO2 is the most environmentally benign choice. It’s also in very common use: as Wolfe pointed out, if you’ve been in a grocery store, you are intimately familiar with CO2 cooling, whether you are aware of it or not.
Wolfe, the North American Data Center representative for M&M Carnot, points out that the primary advantage of CO2 relative to other refrigerants is its more efficient operation. Its volumetric cooling capacity is four to five times greater than that of most other refrigerants, which means smaller compressors and components, less refrigerant, and a smaller equipment footprint.
He does admit, however, that CO2 is most effective when ambient temperatures are below 88F. Above that point the system enters what is called the transcritical range, where the condenser becomes a gas cooler and loses effectiveness, so in environments where temperatures are consistently at or above 88F, CO2 refrigerant systems must be paired with adiabatic gas cooling. With adiabatic gas cooling, efficient operation is possible up to ambient temperatures of 95F. In the transcritical range the system runs in full compression mode to keep the CO2 within its operating pressure range of 800 to 1,040 psi.
This is the least efficient mode of cooling. The most efficient mode, what Carnot calls free cooling, works up to an ambient temperature of 54F and makes use of the thermosiphon effect to circulate the coolant. No compression is required, and the chiller reaches peak efficiency.
When the ambient temperature is above 54F but below 65F, the system splits its operation: one circuit runs in free cooling mode while the other runs in compression mode. As Wolfe said, north of the Mason-Dixon line, the use of CO2 refrigerant in your chillers can be very efficient and cost effective. In areas with higher average temperatures, CO2 refrigerant solutions will likely require pairing with adiabatic coolers for peak efficiency.
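The temperature-dependent operating modes Wolfe described can be summarized as a simple mode-selection rule. The sketch below is illustrative only; the thresholds come from the presentation, while the function name and mode labels are hypothetical, not part of any Carnot control system:

```python
def co2_cooling_mode(ambient_f: float) -> str:
    """Illustrative mode selection for a transcritical CO2 chiller,
    using the ambient-temperature thresholds cited in the talk."""
    if ambient_f <= 54:
        # Thermosiphon circulation only; no compression required.
        return "free cooling"
    elif ambient_f < 65:
        # One circuit in free cooling, the other in compression.
        return "split: free cooling + compression"
    elif ambient_f < 88:
        # Subcritical operation: the condenser works normally.
        return "compression"
    else:
        # Transcritical range: the condenser acts as a gas cooler;
        # adiabatic gas cooling extends efficiency to about 95F.
        return "transcritical: adiabatic gas cooling recommended"

print(co2_cooling_mode(45))  # free cooling
print(co2_cooling_mode(92))  # transcritical: adiabatic gas cooling recommended
```

The takeaway matches Wolfe's geographic rule of thumb: in cooler climates the system spends most of its hours in the two most efficient modes.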
Users of M&M Carnot's CO2 cooling system include Telus, which incorporated a CO2 CRAC system into an update of a data center in Quebec.
While CO2-based cooling isn’t a panacea for data center cooling concerns, it does an excellent job of addressing environmental concerns and has been successfully applied in Europe, which, as May pointed out, is roughly 10 years ahead of the US in regulatory restrictions on the use of HFC and HFO refrigerants. Getting ahead of the curve on pending regulatory issues is not a bad thing when evaluating technologies for your next-generation data centers, and taking a look at the potential of CO2 cooling is worth the time and effort.