Cooling for High[er] Density Spaces Should Be a Design Feature, Not an Afterthought

June 2, 2021
Higher density demands a specialized cooling strategy, yet many data center operators cool the entire room rather than the equipment inside.

In this edition of Voices of the Industry, Doug Ausdemore, Senior Product Development Manager, Data Aire explains how higher density data centers benefit from a specialized cooling strategy.

Few data centers live in a world of truly ‘high’ density, a number that is itself a moving target, but many are moving to high[er] density environments. Owners of higher density data centers often aren’t aware of how many variables factor into cooling their equipment. As a result, they spend too much on shotgun solutions that waste capacity when a rifle-shot approach would serve them better: understanding the heat dispersion characteristics of each piece of equipment and optimizing floor plans and the placement of cooling solutions for maximum efficiency.

So, how do you invest for today and plan for tomorrow? By engaging early in the data center design process with a cooling provider that has a broad line of cooling solutions, owners can maximize server space, minimize low-pressure areas, reduce costs, save on floor space and boost overall efficiency. And by choosing a provider that can scale with their data center, they can ensure that their needs will be met long into the future.

Density Is Growing: Low to Medium to High[er] and Highest

Data centers are growing increasingly dense, creating unprecedented cooling challenges, and that trend will undoubtedly continue. The Uptime Institute’s 2020 Data Center survey found that average server density per rack has more than tripled over the last nine years, from 2.4 kW to 8.4 kW. While still within the safe zone of most conventional cooling equipment, the trend is clearly toward equipment running hotter, accelerated by the growing use of GPUs and multi-core processors. Some higher-density racks now draw as much as 16 kW, and the highest-performance computing installations typically demand 40-50 kW per rack.
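Those survey figures imply a steep compound growth rate. A quick back-of-the-envelope calculation makes the trend concrete (a sketch based only on the two data points above, not part of the Uptime methodology):

```python
# Implied compound annual growth in average rack density, from the
# Uptime Institute figures cited above: 2.4 kW -> 8.4 kW over nine years.
start_kw, end_kw, years = 2.4, 8.4, 9
cagr = (end_kw / start_kw) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # ~14.9% per year
```

In other words, average rack density has been rising roughly 15% per year, which is why a rack that sits comfortably in the safe zone today may not stay there for long.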

High[er] Density Requires Dedicated Cooling Strategies

For the sake of discussion, let’s focus on the data centers that are, or may soon be, in the 8.4-16 kW range. This higher density demands a specialized cooling strategy, yet many data center operators waste money by provisioning equipment to cool the entire room rather than the equipment inside. In fact, “Overprovisioning of power/cooling is probably a more common issue than underprovisioning due to rising rack densities,” the Uptime survey asserted.

No two data centers are alike and there is no one-size-fits-all cooling solution. Thermal controls should be customized to the server configuration and installed in concert with the rest of the facility, or at least six months before the go-live date.

Equipment in the higher density range of 8-16 kW can present unique challenges to precision cooling configurations. The performance of the servers themselves can vary from rack to rack, within a rack, and even with the time of day or year, causing hotspots to emerge.
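To put those loads in airflow terms, the standard sensible-heat relationship for air (BTU/hr ≈ 1.08 × CFM × ΔT°F) gives a rough sizing estimate. The sketch below uses an assumed 20°F supply-to-return temperature difference, an illustrative figure rather than a vendor specification:

```python
# Rough airflow required to remove a rack's heat load, using the standard
# sensible-heat relationship for air: BTU/hr = 1.08 * CFM * delta_T (deg F).
# The 20 deg F supply-to-return delta is an illustrative assumption.
def required_cfm(rack_kw: float, delta_t_f: float = 20.0) -> float:
    btu_per_hr = rack_kw * 1000 * 3.412  # convert kW to BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)

for kw in (8, 16):
    print(f"{kw} kW rack: ~{required_cfm(kw):,.0f} CFM")
# 8 kW rack:  ~1,264 CFM
# 16 kW rack: ~2,527 CFM
```

Doubling the rack load doubles the airflow the cooling system must deliver to that rack, which is why room-level averages can mask rack-level shortfalls.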

Higher-density equipment creates variable hot and cool spots that need to be managed differently. A rack that is outfitted with multiple graphics processing units for machine learning tasks generates considerably more heat than one that processes database transactions. Excessive cabling can restrict the flow of exhaust air. Unsealed floor openings can cause leakages that prevent conditioned air from reaching the top of the rack. Unused vertical space can cause hot exhaust to feed back into the equipment’s intake ducts, causing heat to build up and threatening equipment integrity.

For all these reasons, higher-density equipment is not well-served by a standard computer room air conditioning (CRAC) unit. Variable-speed direct expansion CRAC equipment, by contrast, scales up and down gracefully to meet demand. This not only saves money but minimizes power surges that can cause downtime. Continuous monitoring should be put in place, using sensors to detect heat buildup in one spot that may threaten nearby equipment. Alarms should be set to flag critical events without triggering unnecessary firefighting, and cooling should be integrated into the building-wide environmental monitoring systems.
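A minimal sketch of that alarm logic is a temperature threshold with a deadband, so that a transient spike raises one alarm rather than a flood of them. The threshold values and sensor readings below are hypothetical illustrations, not recommended setpoints:

```python
# Minimal sketch of deadband (hysteresis) alarming on rack inlet temperature.
# Thresholds are hypothetical; real deployments follow ASHRAE guidance and
# site-specific setpoints.
class InletTempAlarm:
    def __init__(self, high_f: float = 90.0, clear_f: float = 85.0):
        self.high_f = high_f    # raise the alarm at or above this reading
        self.clear_f = clear_f  # clear it only once readings fall below this
        self.active = False

    def update(self, reading_f: float) -> bool:
        """Return True while the alarm is active for this reading."""
        if not self.active and reading_f >= self.high_f:
            self.active = True
        elif self.active and reading_f < self.clear_f:
            self.active = False
        return self.active

alarm = InletTempAlarm()
for temp in (82, 88, 91, 89, 87, 84):  # simulated sensor readings (deg F)
    print(temp, "ALARM" if alarm.update(temp) else "ok")
```

The gap between the raise and clear thresholds keeps a sensor hovering near the limit from toggling the alarm on every reading, exactly the unnecessary firefighting this strategy is meant to avoid.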

A better approach to developing or upgrading a data center is to build cooling plans into the design of the data center from the beginning with a holistic approach that minimizes hot spots. Alternating “hot” and “cold” aisles should be created, with vented floor tiles in the cold aisles and servers arranged to exhaust all hot air into an unvented hot aisle. The choice of front-discharge, upflow or downflow ventilation can prevent heat from being inadvertently circulated back into the rack. Power distribution also needs to be planned carefully and backup power provisioned to avoid loss of cooling.

Thinking through cooling needs early in the data center design stage for higher density data centers avoids costly and disruptive retrofits down the road. The trajectory of power density is clear, so cooling design should consider not only today’s needs but those five and 10 years from now. Modular, variable-capacity systems can scale and grow as needed.
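As a rough planning sketch, extrapolating the survey’s historical growth rate forward suggests where average densities could land at those horizons. This simply assumes the ~15% annual trend above continues, an assumption for illustration, not a forecast:

```python
# Extrapolate average rack density 5 and 10 years out, assuming the
# ~14.9%/year historical growth rate continues (an assumption, not a forecast).
current_kw, annual_growth = 8.4, 0.149
for horizon_years in (5, 10):
    projected = current_kw * (1 + annual_growth) ** horizon_years
    print(f"In {horizon_years} years: ~{projected:.0f} kW per rack")
# In 5 years:  ~17 kW per rack
# In 10 years: ~34 kW per rack
```

Even if the real curve flattens, a design sized only for today’s average leaves little headroom for that range.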

The earlier data center owners involve their cooling providers in design decisions, the more they’ll save from engineered-to-order solutions and the less risk they’ll face of unpleasant surprises down the road.

Doug Ausdemore is Senior Product Development Manager at Data Aire. Contact them to learn more about designing a cooling system for your higher density environment.

About the Author

Voices of the Industry

Our Voices of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
