Executive Roundtable: Adapting Data Center Infrastructure for the Age of AI
At Data Center Frontier, we turn to the experts to help make sense of rapid change in the industry, and few forces are driving more upheaval today than artificial intelligence. As AI accelerates a fundamental shift in infrastructure priorities, how are industry leaders adapting?
To begin our Executive Roundtable for the Second Quarter of 2025, we’ve gathered insights from our distinguished executive panel on how AI is redefining the core of digital infrastructure. From power and cooling to site design, companies are reshaping their approach to stay ahead.
And so, to open our series of conversations for Q2, we start today with a timely and pressing question: How is your organization responding to the infrastructure demands of AI? And what innovations or strategic moves has your company made to meet the rising demands of next-generation AI workloads?
In addition to today's discussion, in articles throughout the rest of this week and into the next, our panel of executive thought leaders will offer their observations on other topical data center industry considerations for the Second Quarter, including:
- Building Fast, Building Smart: Strategies for accelerating delivery without compromising resiliency or sustainability. Our panelists share how they're navigating compressed timelines through modular design, supply chain agility, and trusted partnerships.
- Standard vs. Custom: We ask how sector leaders are balancing repeatable designs with AI-driven, client-specific demands. Discover where standardization brings speed and efficiency, and where customization delivers critical performance advantages.
- Real Innovation, Not Hype: We ask what truly transformative breakthroughs look like in today’s data center landscape. From power architecture to AI-optimized layouts, we explore which innovations are moving the needle - and which are just buzzwords.
The seasoned data center industry leaders of our Executive Roundtable for the Second Quarter of 2025 include:
- Jason Waxman, CEO, CoolIT Systems
- Phillip Marangella, Chief Marketing and Product Officer, EdgeConneX
- Nicole Dierksheide, Global Category Director for Large Power, Rehlko
- Carsten Baumann, Director, Strategic Initiatives and Solution Architect, Schneider Electric
From power density and advanced cooling to site layout and deployment speed, AI is rewriting the playbook for data center infrastructure. Read on to learn how our panelists' firms are adapting in real time in our first Executive Roundtable question for Q2 of 2025.
Data Center Frontier: How is your organization adapting to the infrastructure demands of artificial intelligence? As AI continues to redefine the design priorities for power, cooling, and site architecture, what specific innovations or strategic adaptations has your company implemented over the past year to meet the performance and deployment requirements of next-generation AI workloads?
Jason Waxman, CoolIT: In addition to the sheer demand for AI infrastructure, which appears insatiable, we are seeing power requirements and densities that were unthinkable just a few years ago.
AI GPUs now exceed 1,000 W thermal design power (TDP), and AI rack densities are being planned that would exceed 1 MW. In contrast, just a few years ago, 200-300 W chips and 12 kW racks were considered high density.
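To put that jump in perspective, here is a minimal back-of-the-envelope sketch in Python; the per-rack GPU count, per-GPU power, and overhead factor below are illustrative assumptions rather than CoolIT figures:

```python
# Back-of-the-envelope rack power estimate.
# All inputs are illustrative assumptions, not vendor or CoolIT figures.
gpus_per_rack = 72        # hypothetical dense AI rack
gpu_tdp_w = 1_000         # ~1 kW-class AI GPU, per the figure cited above
overhead_factor = 1.5     # assumed CPUs, memory, NICs, cooling, conversion losses

rack_kw = gpus_per_rack * gpu_tdp_w * overhead_factor / 1_000
legacy_rack_kw = 12       # "high density" of a few years ago

print(f"Hypothetical AI rack:     ~{rack_kw:.0f} kW")
print(f"Legacy high-density rack:  {legacy_rack_kw} kW")
print(f"Ratio:                    ~{rack_kw / legacy_rack_kw:.0f}x")
```

Even with these conservative assumptions, the result lands roughly an order of magnitude above the 12 kW racks that were recently considered high density, and still well short of the 1 MW rack plans Waxman mentions.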
CoolIT has responded to the extraordinary performance and market demands stemming from AI in two ways:
First, we’ve worked with customers and partners to design liquid cooling solutions to meet their needs. Because the pace of change and innovation for customers has accelerated, we have invested significantly in engineering, rapid prototyping and design tools.
Second, we’ve dramatically ramped our production capacity to meet the massive demand for liquid cooling. Given the dynamic nature of the supply chain, our ability to scale manufacturing and support is as important as our continued product innovation.
We are growing our production capacity in North America and Asia. We opened a new manufacturing site in Calgary, Canada, which increased our North American manufacturing capacity by 25 times. We’ve also strengthened our global supply chain, including pursuing dual-sourcing of critical components and supplier geographic diversification strategies.
Phillip Marangella, EdgeConneX: AI has completely transformed the way we need to build and operate data centers. It’s essentially an entirely new product based on how we power and cool facilities.
As rack densities rapidly scale into the triple digits of kilowatts and we approach a megawatt per rack in the near future, there’s a need to think very differently about data center capacity.
That is why we have developed Ingenuity, our AI-enabled data center product offering, which is both flexible and scalable to meet today’s requirements as well as tomorrow’s AI infrastructure needs.
This ensures that the data center isn’t obsolete by the time it is built.
More importantly, as we transition from air to liquid cooling, we’ve also completely reconfigured our operational procedures, training, and readiness to ensure a safe, secure, and sustainable environment for AI/HPC deployments.
Nicole Dierksheide, Rehlko: One of the biggest shifts we're seeing with AI infrastructure is how it impacts power consumption and load behavior, and in particular how AI servers draw power from the grid or backup systems.
Traditionally, in most backup power scenarios, like a facility supported by diesel generators, loads are relatively steady. If power is lost, generators kick in and carry the load in phases: 25%, 50%, then 100%. The load behavior is typically predictable and step-based.
With AI, however, we're seeing what's known as microloading. AI servers constantly fluctuate their power demands, often in milliseconds. These are small but rapid changes that don't add up to a full load step, like going from 0 to 3 megawatts, but they still cause the generator to respond. As these microloads ramp up or down, the generator's alternator must adjust quickly to maintain proper speed and voltage, similar to how a car engine reacts to acceleration and deceleration.
These rapid fluctuations can challenge the generator’s ability to maintain stable output and power quality. That's where our focus has shifted at Rehlko. We ensure our generators have advanced voltage regulation and fast-reacting control systems that can handle these unpredictable, high-frequency changes in load.
Our goal is to deliver consistent, reliable power to the data center even as AI workloads continue to evolve and put new demands on infrastructure.
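To illustrate the distinction Dierksheide draws between stepped loads and microloading, here is a minimal, purely illustrative Python sketch of a droop-governed generator model responding to both load profiles. Every parameter below (the 3 MW rating, inertia constant, droop, governor time constant, and noise level) is a hypothetical assumption for demonstration and does not represent Rehlko equipment or controls:

```python
import numpy as np

DT = 0.001                 # 1 ms simulation step
T = np.arange(0.0, 10.0, DT)
RATED_KW = 3000.0          # hypothetical 3 MW generator

# Traditional stepped backup loading: 25% -> 50% -> 100% of rating.
step_load = np.select([T < 3.0, T < 6.0], [0.25, 0.50], default=1.0) * RATED_KW

# Microloading: millisecond-scale swings around a 60% mean load.
rng = np.random.default_rng(42)
micro_load = RATED_KW * (0.60 + 0.15 * rng.standard_normal(T.size))

def speed_deviation(load_kw, inertia_h=4.0, droop=0.05, tau_gov=0.3):
    """Very crude swing-equation + droop-governor model (per-unit).

    Returns per-unit speed deviation over time; negative = the engine sags.
    """
    p_load = load_kw / RATED_KW      # electrical load, per unit
    p_set = p_load[0]                # governor base setting balances initial load
    p_mech = p_set                   # mechanical power delivered by the engine
    dw = 0.0                         # per-unit speed deviation
    out = np.empty(p_load.size)
    for i in range(p_load.size):
        # Droop governor: a speed sag commands more mechanical power, with a lag.
        p_target = p_set - dw / droop
        p_mech += DT / tau_gov * (p_target - p_mech)
        # Swing equation: power imbalance accelerates or decelerates the rotor.
        dw += DT / (2.0 * inertia_h) * (p_mech - p_load[i])
        out[i] = dw
    return out

print(f"max |speed deviation|, stepped loads: {np.max(np.abs(speed_deviation(step_load))):.4f} p.u.")
print(f"max |speed deviation|, microloading : {np.max(np.abs(speed_deviation(micro_load))):.4f} p.u.")
```

In a toy model like this, the stepped profile produces a few large but well-damped excursions, while the microload profile forces the governor to correct continuously, which is the kind of behavior the fast-reacting regulation and controls Dierksheide describes are meant to handle.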
Carsten Baumann, Schneider Electric: The current boom in AI is creating significant opportunities for the entire data center ecosystem. Demand growth projections indicate an increase of well above 100 GW in the U.S. by 2030, which implies more than doubling existing infrastructure.
Economically viable solutions to challenges such as power generation, load flexibility, transmission and distribution systems, and efficiencies in the design, deployment, and operations of data center assets are necessary. Better demand planning is required to align the supply chain and manufacturing capacities responsibly. Coordination among stakeholders to create system synergies rather than isolated improvements is essential. Innovation and a shift in thinking and operations are needed.
AI can be utilized to optimize the planning, design, and operation of new data center capacity. At Schneider Electric, we are using AI tools to model energy demand, generation, reliability, and availability; increasing manufacturing capacity; and making hardware components more intelligent with embedded sensors and software. We have also implemented AI in our Smart Manufacturing processes to enhance productivity, quality, and the optimization of energy use.
From a product perspective, digital smart products enable system optimization and flexibility beyond static peak load designs. Overprovisioning is becoming obsolete, and smart energy management and automation will ensure reliable energy access to support the world’s digital transformation.
We are at the forefront of the digital revolution, and our customers face unprecedented challenges. Building the necessary infrastructure in creative and new ways forces us to accelerate innovation. For example, as an industry member of the EPRI FLEX program, we are driving innovation and industry collaboration to collectively address these massive power and cooling demands.
According to the North American Electric Reliability Corporation (NERC), winter peak load is projected to rise from 694 GW in 2024 to 843 GW by 2034—a 21.5% increase that marks one of the most significant surges in modern grid history. The Federal Energy Regulatory Commission (FERC) offers an even more aggressive outlook, forecasting 128 GW of peak load growth by 2029, driven largely by the rapid expansion of commercial and industrial electricity demand.
Consequently, we are educating policymakers to enable an environment that supports innovation, accelerate on-site power generation in support of energy dominance, reduce red tape to improve interconnection queues, and create new incentives to embrace data center load flexibility. With the acquisition of Motivair, we can address our customers’ advanced liquid cooling and thermal management requirements.
NEXT: Speed-to-Market Is a Competitive Edge -- What’s Your Playbook?
About the Author
Matt Vincent
A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.