Powering AI Innovation: The Role of Data Centers in Supporting AI Workloads

March 28, 2025
As AI continues to transform industries and drive business innovation, the role of data centers in supporting this revolution cannot be overstated. Doug Adams, CEO of NTT Global Data Centers, explores a new approach to data center design and operation for the AI era.

As enterprises increasingly embrace AI technologies, the choice of a data center partner becomes a critical factor in their success. The landscape of requirements for AI deployments is evolving rapidly, demanding a new approach to data center design and operation.

At the heart of this shift lies the need for advanced and flexible cooling solutions. The high-density computing requirements of AI workloads generate unprecedented levels of heat, pushing traditional air-cooling methods to their limits. Innovative cooling technologies, particularly liquid cooling and immersion cooling, have emerged as essential components for managing the thermal loads of AI infrastructure. These solutions not only handle the intense heat produced by AI accelerators and high-performance processors but also enable higher computing density and improved energy efficiency.

However, cooling is just one piece of the puzzle. The unpredictable nature of AI workloads necessitates a data center infrastructure that can scale rapidly and flexibly. Enterprises need partners who can quickly deploy additional capacity and support high-density rack configurations, adapting to sudden spikes in computing power and storage needs. This scalability must be matched with ample power capacity, as AI technologies are notoriously power-hungry. Yet, in an era of increasing environmental consciousness, this power must be delivered efficiently. Data centers supporting AI workloads are thus exploring innovative approaches to energy management, including the implementation of energy-efficient cooling systems and the integration of renewable energy sources.

Key Considerations for AI Deployments

Connectivity and low latency form another crucial aspect of AI infrastructure. Many AI applications demand real-time processing, making robust network infrastructure with high-bandwidth, low-latency connections indispensable. In training clusters spread across a data hall, low latency is also essential for efficient synchronization between nodes, reducing idle times and enhancing overall training performance. Similarly, for AI inference applications where query responses are time-sensitive, such as in financial trading or autonomous vehicles, low latency ensures rapid data processing and decision-making. Whether for distributed training or real-time inference, a high-performance network with minimal latency is key to maximizing AI system capabilities and responsiveness. Data center partners must therefore offer connectivity solutions that ensure optimal performance for these time-sensitive AI workloads.
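To make the synchronization point concrete, the following is a rough back-of-envelope sketch, not a vendor benchmark: it models each training step as compute time plus several gradient-synchronization rounds, each of which pays the network latency once. All figures (100 ms of compute, 20 communication rounds, 1 ms of payload transfer per round, 5 µs vs. 500 µs latency) are illustrative assumptions chosen only to show how latency compounds across rounds.

```python
# Back-of-envelope sketch: how network latency inflates a distributed
# training step when nodes synchronize gradients every step.
# All numbers below are illustrative assumptions, not measured figures.

def step_time(compute_s: float, sync_rounds: int, latency_s: float,
              payload_s: float) -> float:
    """Total step time: compute plus synchronization.

    Each synchronization round pays the network latency once on top of
    the time to move the gradient payload itself.
    """
    return compute_s + sync_rounds * (latency_s + payload_s)

# 100 ms compute per step, 20 communication rounds, 1 ms payload per round.
fast = step_time(0.100, 20, 0.000005, 0.001)  # 5 us intra-hall latency
slow = step_time(0.100, 20, 0.000500, 0.001)  # 500 us cross-site latency

overhead_fast = fast / 0.100 - 1
overhead_slow = slow / 0.100 - 1
print(f"{overhead_fast:.1%} vs {overhead_slow:.1%} sync overhead")
# Under these assumptions, the higher-latency network spends noticeably
# more of each step idle in communication rather than computing.
```

Because the latency cost is paid on every round of every step, even sub-millisecond differences accumulate into a material fraction of total training time, which is why intra-hall network design matters for large clusters.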

Given the often sensitive and proprietary nature of AI data and models, combined with the sheer cost of AI-ready infrastructure, security cannot be an afterthought. Comprehensive physical and cybersecurity protocols, coupled with compliance with relevant industry standards and regulations, are non-negotiable features of any data center supporting AI deployments.

The rapidly evolving landscape of AI technologies requires data center providers to possess not just infrastructure but expertise. Partners with experience in hosting AI workloads can offer valuable insights, guiding enterprises in optimizing their deployments and navigating the unique challenges posed by AI infrastructure. This expertise extends to supporting a diverse array of AI hardware, from GPUs and TPUs to custom AI accelerators, each with its specific power, cooling, and connectivity requirements.

Innovation and Sustainability in AI Infrastructure

Perhaps most importantly, the pace of innovation in AI technologies demands a collaborative approach from data center providers. The most successful partnerships are those where providers work closely with their clients to develop tailored solutions for AI deployments. This collaboration has been particularly fruitful in the realm of cooling technologies. By engaging directly with clients and technology partners, data centers can accelerate the development and implementation of innovative cooling solutions, ensuring they remain at the forefront of supporting AI infrastructure.

This collaborative spirit extends to sustainability efforts as well. As AI workloads consume significant energy, data center partners are not just implementing energy-efficient technologies but are also exploring creative solutions to reduce water usage and carbon emissions. Many – particularly in EMEA, but expected in other regions as well – are investigating ways to repurpose the excess heat generated by AI hardware, contributing to broader sustainability initiatives.

For enterprises looking to deploy AI technologies, the choice of a data center partner is more than a matter of capacity and cost. It requires a holistic evaluation of the provider's ability to offer advanced cooling solutions, scalable and flexible infrastructure, robust power and connectivity, strong security measures, and deep expertise in AI workloads. The ideal partner brings not just infrastructure but a collaborative approach to innovation, working hand-in-hand with clients to address the unique challenges of AI deployments.

The role of data centers in supporting the AI revolution cannot be overstated. By carefully considering these factors and choosing a partner that aligns with their AI ambitions, enterprises can ensure they have the solid foundation needed to harness the full potential of AI technologies. The future of AI is being built today, and it's being built in data centers that are as innovative and adaptable as the technologies they support.

About the Author

Doug Adams

Doug Adams is President and CEO, Global Data Centers for NTT DATA.
