Podcast: Data Center AI, Power, Cooling and Digital Twins Talk with Cadence Distinguished Engineer Mark Seymour

Aug. 27, 2024
Join us for a discussion of data center power demand and where it's going in the context of rapid digitalization and exponential growth of HPC and AI computing needs. The conversation also highlights the importance of digital twins for managing data center efficiency and the advantages of liquid cooling technologies.

For this episode of the Data Center Frontier Show podcast, we welcome Mark Seymour, Distinguished Engineer with Cadence Design Systems, for a discussion of the big question on everyone’s mind in this industry right now: data center power demand, where it's headed amid rapid digitalization and the exponential growth of HPC and AI computing needs, and how that growth squares, or even conflicts, with increasing environmental concerns and regulations. 

The conversation also highlights the importance of digital twins for managing data center efficiency and the advantages of liquid cooling technology, particularly immersion cooling, as a sustainable alternative to traditional methods. In the course of our interview, Seymour emphasizes the data center industry's responsiveness to societal demands for sustainability, citing initiatives such as widespread tree planting by project developers, and the need to adapt to new technological challenges.

Here's a timeline of the podcast's key moments:

2:59 - Seymour explains that AI is essentially high-performance computing, which is now required in many data centers that previously did not need it.

12:05 - Seymour addresses the challenges and potential of immersion cooling technology, emphasizing its growing acceptance but also the need for confidence in its operation.

17:52 - Talk turns to the importance of digital twins in managing data center efficiency, with Seymour highlighting the need to understand the interrelated behaviors of IT infrastructure and cooling systems.

24:18 - Discussion circles back to immersion cooling as a sustainable option for data centers, with Seymour expounding on its advantages over traditional cooling methods.

27:44 - Seymour elaborates on the improvements in compute efficiency per watt in modern systems, arguing that the data center industry is responding and adapting to societal demands, rather than being inherently unsustainable.

30:42 - Seymour acknowledges the industry's focus on sustainability and environmental impact, citing examples such as Cadence's tree planting initiatives and the ongoing challenge of meeting new technological demands.

Did you like this episode? Be sure to subscribe to the Data Center Frontier Show on Podbean to receive future episodes on your app. 

 

Keep pace with the fast-moving world of data centers and cloud computing by connecting with Data Center Frontier on LinkedIn, following us on X/Twitter and Facebook, and signing up for our weekly newsletters using the form below.

About the Author

Matt Vincent

A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.
