
Navigating Liquid Cooling Architectures for Data Centers with AI Workloads

Feb. 9, 2024
An increasing number of servers require liquid cooling systems to support AI workloads.

Many AI servers equipped with accelerators (e.g., GPUs) for training large language models (LLMs) and running inference workloads generate enough heat to necessitate liquid cooling. These servers have coolant supply and return piping and require a supporting ecosystem of manifolds, coolant distribution units (CDUs), and outdoor heat rejection. There are six common heat rejection architectures for liquid cooling, and this paper provides guidance on selecting the best one for your AI servers or cluster.
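To give a sense of scale, the coolant flow a CDU must deliver follows directly from the heat-balance relation Q = ṁ · c_p · ΔT. The sketch below is illustrative only (the rack power and temperature rise are assumed values, not figures from this paper):

```python
def required_flow_kg_per_s(heat_load_w: float, delta_t_k: float,
                           cp_j_per_kg_k: float = 4186.0) -> float:
    """Mass flow of coolant needed to absorb a given heat load.

    heat_load_w    -- heat to remove, in watts (assumed example value below)
    delta_t_k      -- allowed coolant temperature rise, in kelvin
    cp_j_per_kg_k  -- specific heat capacity; default is water (~4186 J/kg-K)
    """
    return heat_load_w / (cp_j_per_kg_k * delta_t_k)

# Hypothetical 100 kW AI rack with a 10 K coolant temperature rise:
flow = required_flow_kg_per_s(100_000, 10)
print(f"{flow:.2f} kg/s")  # roughly 2.39 kg/s (about 2.4 L/s of water)
```

Even this simple estimate shows why air cooling falls short: moving the equivalent heat with air, whose volumetric heat capacity is orders of magnitude lower than water's, would require impractically large airflow through the rack.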

Download now to learn more


This content is sponsored by: