Roundtable: New Generation of AI Hardware Raises the Bar on Cooling

Sept. 22, 2020
As AI brings more powerful chips into the data center, our DCF Executive Roundtable weighs in on the present and future of rack density, and how it may influence data center equipment and design.

Today we continue our Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry, and where it is headed. In today’s discussion, our panel of experienced data center executives weighs in on the present and future of rack density in the age of artificial intelligence. Our panelists include Phillip Marangella of EdgeConneX, CoreSite’s Juan Font, Angie McMillin from Vertiv, Kevin Facinelli from Nortek, Digital Realty’s Tony Bishop and Jaime Leverton from eStruxture Data Centers and Infrastructure Masons.

The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier.

Data Center Frontier: Artificial intelligence is bringing more powerful chips into the data center. What’s your take on the present and future of rack density, and how it may influence data center equipment and design?

KEVIN FACINELLI, Nortek Air Solutions

Kevin Facinelli: Industry statistics indicate average rack density increased from 7.2 kW in 2018 to 8.2 kW in 2019. Averages can be misleading, though, because AI and computationally rigorous applications can range from 15 to 30 kW per rack.

Data center operators need to recognize which technologies offer future relevance. Rack densities have climbed exponentially since we entered the liquid cooling market 25 years ago, and our liquid cooling equipment is now in 10 percent of the top 100 supercomputers listed in the International Supercomputer Conference’s Top500 and Green500. More companies are entering the AI and computational market, but they’re also using open hardware or equipment with higher computational abilities in addition to supercomputers.

Therefore, colocation providers are now at a crossroads between two growth strategies. Some social media companies continue with 5 kW rack densities, but they’re distributing the increasing loads across a large amount of infrastructure as they grow. The other strategy is driven from the processor level, where high-density chip clusters provide the computational density required for training AI models. This strategy generates significantly more heat that must be rejected.

Air (hot aisle/cold aisle) and liquid can both provide adequate cooling for a floor of low- and medium-density racks, especially since the trend is toward higher ambient cooling temperatures without affecting uptime. However, a problem arises when high-density 15- to 30-kW server racks are mixed in with mostly low- and medium-density racks in a data hall. Should the data hall’s ambient temperature be lowered to accommodate the higher heat generation, at a significant operational cost penalty? Or would direct liquid cooling, such as coolant distribution units (CDUs) or rear-door heat exchangers, efficiently accommodate the high-density racks while leaving the ambient temperature higher for the low- and medium-density equipment?
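To see why mixed densities strain an air-cooled hall, consider the standard sensible-heat relationship, CFM ≈ watts × 3.412 / (1.08 × ΔT°F). The sketch below is purely illustrative; the 20°F supply-to-return delta and the rack sizes are assumptions for the example, not figures supplied by the panelists.

```python
# Illustrative airflow estimate for air-cooled racks (assumed values, not vendor data).
# Sensible heat: BTU/hr = 1.08 * CFM * delta_T_F, and 1 W = 3.412 BTU/hr,
# so CFM = watts * 3.412 / (1.08 * delta_T_F).

def required_cfm(rack_kw: float, delta_t_f: float = 20.0) -> float:
    """Approximate airflow (CFM) needed to remove a rack's heat at a given air delta-T."""
    watts = rack_kw * 1000
    return watts * 3.412 / (1.08 * delta_t_f)

for rack_kw in (5, 15, 30):  # low-, high-, and very-high-density examples
    print(f"{rack_kw:>2} kW rack -> ~{required_cfm(rack_kw):,.0f} CFM")

# Roughly 790 CFM for a 5 kW rack vs. ~4,700 CFM for a 30 kW rack at the same 20°F
# delta-T, which is why direct liquid cooling (CDUs, rear-door heat exchangers) is
# often preferred for the densest racks rather than chilling the entire hall.
```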

While there are plenty of suppliers of equipment such as CDUs, data center operators should instead look for turnkey solution providers that offer the CDU; the plant that supplies the CDU and other distribution options; the additional materials for complete integration into the data hall; and the contracting, servicing, warranties and other services that lead to single-source responsibility.

ANGIE McMILLIN, Vertiv

Angie McMillin: We see the potential for the convergence of two trends to drive growth in rack density in certain applications. One is the introduction of new technologies such as the chips mentioned in the question. The other is the migration of data center capacity closer to users to support private and hybrid cloud applications. These edge deployments will be located in areas where real estate is scarce and more expensive, so there will be an incentive to increase density.

We’re still in the early phases of these trends and what we are seeing today is clusters of high-density racks being integrated into existing data centers. These clusters will serve as proving grounds for the technologies that will enable more widespread deployment of higher density racks. Liquid cooling and hot-scalable power systems will be among the technologies that benefit from these trends.

JUAN FONT, CoreSite

Juan Font: Just as customer deployments are growing, they are also becoming more power-dense. Artificial intelligence, machine learning and GPU-hungry gaming applications, to name a few, are use cases requiring high-performance compute, which we increasingly see placed at the edge. Such is the case with autonomous vehicle platforms or 5G applications requiring ultra-low latency.

Our strategy continues to be to support the widest array of use cases and power densities for applications that would benefit from a rich ecosystem of natively deployed fiber networks and cloud on-ramps in close proximity to large population centers. That said, in a multi-tenant data center environment – which is CoreSite’s predominant model – the key is to build enough flexibility in your design to be able to support high-performance computing or even liquid-cooled environments – but not at the expense of overbuilding your facility.

Data center design is impacted in three principal ways: you have to augment cooling; increase the load-bearing capacity of the floor space, as the racks are also heavier; and add more power distribution, as most of the new high-density equipment also requires larger breaker sizes and 3-phase power.
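To make the power-distribution point concrete, here is a minimal sketch of the standard 3-phase current and breaker-sizing arithmetic, I = P / (√3 × V × PF), with a 125% continuous-load factor as is common practice. The 20 kW rack, 415 V feed and 0.95 power factor are assumed values for illustration, not CoreSite specifications.

```python
import math

# Illustrative 3-phase feed sizing for a high-density rack (assumed values).
def three_phase_current(kw: float, volts: float = 415.0, power_factor: float = 0.95) -> float:
    """Line current (A) drawn by a 3-phase load: I = P / (sqrt(3) * V * PF)."""
    return kw * 1000 / (math.sqrt(3) * volts * power_factor)

rack_kw = 20.0                      # assumed high-density rack
amps = three_phase_current(rack_kw)
continuous = amps * 1.25            # common 125% continuous-load sizing factor

print(f"{rack_kw} kW rack: ~{amps:.1f} A per phase, "
      f"size breaker for >= {continuous:.1f} A (next standard size up)")
# ~29 A per phase, implying roughly a 40 A 3-phase breaker -- well beyond the
# single-phase circuits that served traditional 5 kW racks.
```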

TONY BISHOP, Digital Realty

Tony Bishop: The use of AI in the enterprise has accelerated dramatically in the past few years, and with it comes a significant need to meet new power demands and energy-efficiency expectations. Rack density, in particular, will become even more important in meeting power management needs. Colocation providers that can deliver on continuous, reliable energy needs will prove successful as AI applications and use continue to evolve.

But perhaps more important is how we overhaul IT infrastructure to bring infrastructure closer to the data generated by intensive applications like AI. AI workloads have moved to colocation facilities, yet the accrual of data is causing challenges for today’s IT infrastructure. As data accrues, it tends to attract additional services and applications, causing data gravity to emerge, which has the same effect on an enterprise that gravity has on objects on our planet. We often see data gravity as one of the main culprits preventing AI innovation. For example, if enterprises don’t account for data gravity, it can emerge as a big challenge that slows response times, creates information silos and prevents companies from providing excellent customer and employee experiences. To overcome these challenges, enterprises must bring computing resources closer to where data sets lie, which shrinks the time and distance needed to analyze the data that supports AI innovation.

At Digital Realty, we’ve expanded our capabilities in this area. One way we’re doing so is through our partnership with leading chip manufacturer NVIDIA to introduce a new AI-ready infrastructure solution that enables customers to rapidly deploy AI models in close proximity to their data sets. As a result, enterprises can gain access to AI-ready infrastructure to solve the global coverage, capacity and connectivity needs associated with deploying AI capabilities.

JAIME LEVERTON, iMasons and eStruxture Data Centers

Jaime Leverton: The average data center rack density is certainly rising, and we expect this trend to continue. More computing power is being packed into smaller form-factor equipment. While this gives us the ability to concentrate more hardware into a smaller footprint, there’s also a greater need for high-density power and cooling. Designing a high-density data center means bringing cooling closer to the source of heat through high-efficiency cooling units, as well as employing high-density power distribution solutions. Hot-aisle containment is also a must for high-density racks, as it is both highly efficient and cost-effective.

We are also incorporating smart IoT systems into our designs to better measure temperature and humidity and to automatically adjust cooling in real time. A positive effect of high-density deployments is that we clearly see a decrease in PUE numbers.
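The PUE observation follows from the definition PUE = total facility energy ÷ IT equipment energy: when more IT load is served against a largely fixed facility overhead, the ratio falls. The sketch below is a simplified illustration with assumed overhead figures, not eStruxture measurements.

```python
# Simplified PUE illustration (assumed numbers, not measured data).
# PUE = total facility power / IT equipment power.

def pue(it_kw: float, fixed_overhead_kw: float, cooling_per_it_kw: float) -> float:
    """PUE with a fixed facility overhead plus cooling that scales with IT load."""
    total = it_kw + fixed_overhead_kw + cooling_per_it_kw * it_kw
    return total / it_kw

# Same hall, same fixed overhead (lighting, UPS losses, controls), rising IT density:
for it_kw in (500, 1000, 2000):
    print(f"IT load {it_kw:>4} kW -> PUE ~{pue(it_kw, fixed_overhead_kw=150, cooling_per_it_kw=0.25):.2f}")

# ~1.55 at 500 kW vs. ~1.33 at 2,000 kW: spreading fixed overhead across more IT
# load is one reason denser deployments tend to show lower PUE.
```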

PHILLIP MARANGELLA, EdgeConneX

Phillip Marangella: Deployment densities are definitely on the rise as a result of technologies like AI and use cases like cloud gaming. Requests for rack densities in excess of 20 kW per rack are certainly increasing. Being able to adapt and utilize technologies like liquid or immersion cooling will be key to supporting those densities.

At the same time, doing so while factoring in sustainability will be a key challenge as customers and providers look to meet their carbon-neutrality goals.

NEXT: How the pandemic is impacting data center networks and interconnection.


About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.

