Deep Diving on DeepSeek: AI Disruption and the Future of Liquid Cooling
The data center industry is undergoing a period of rapid transformation, driven by the increasing demands of artificial intelligence (AI) workloads and evolving cooling technologies. The recent emergence of DeepSeek, a Chinese AI startup, alongside supply chain issues for NVIDIA’s next-generation GB200 AI chips, may be prompting data center operators to reconsider their cooling strategies.
Angela Taylor, Chief of Staff at LiquidStack, provided insights to Data Center Frontier on these developments, outlining potential shifts in the industry and the future of liquid cooling adoption.
DeepSeek’s Market Entry and Supply Chain Disruptions
Taylor told DCF, "DeepSeek’s entry into the market, combined with NVIDIA’s GB200 supply chain delays, is giving data center operators a lot to think about."
At issue here is how DeepSeek's R1 chatbot came out of the box positioned as an energy-efficient AI model, one that reportedly requires significantly less power than many of its competitors. This development raises questions about whether current data center cooling infrastructures are adequate, particularly as AI workloads become more specialized and diverse.
At the same time, NVIDIA’s highly anticipated GB200 NVL72 AI servers, designed to handle next-generation AI workloads, are reportedly facing supply chain bottlenecks. Advanced design requirements, particularly for high-bandwidth memory (HBM) and power-efficient cooling systems, have delayed shipments, with peak availability now expected between Q2 and Q3 of 2025.
This combination of a new AI player and delayed hardware supply has created uncertainty, compelling data center operators to reconsider their near-term cooling infrastructure investments.
A Temporary Slowdown in AI Data Center Retrofits?
Taylor also observed, "We may see a short-term slowdown in AI data center retrofits as operators assess whether air cooling can now meet their needs."
The efficiency of DeepSeek’s AI models suggests that some AI workloads may require less power and generate less heat, making air cooling a viable option in the short term. As a result, there's a chance some operators may hold off on retrofitting existing data centers with liquid cooling solutions until they better understand whether air cooling can meet their evolving AI infrastructure needs.
However, any hesitation of this sort probably does not signal a long-term shift away from liquid cooling. Instead, it may reflect a moment of reassessment, as operators weigh the performance and efficiency of AI models that differ significantly in their hardware and thermal demands.
Expanding the AI Ecosystem: More Players, More Inference Workloads
Taylor also noted, "DeepSeek is also being predicted to open the door for more AI players. If this happens, it’s likely workload demands will shift from training to inference."
DeepSeek’s market entry suggests that AI development is moving toward a more diversified ecosystem, with new players emerging to challenge the dominance of firms like OpenAI and Google DeepMind.
If DeepSeek’s presence fosters further competition, AI workloads will be even more likely to see a shift from model training—an extremely power-intensive process—to inference, which requires sustained but less extreme computational power.
Inference workloads typically run at the edge or in distributed data center environments rather than in hyperscale facilities, potentially altering cooling requirements. While training workloads necessitate dense, high-performance cooling, inference deployments may drive increased demand for energy-efficient cooling at smaller data center sites.
Purpose-Built AI Data Centers Will Continue to Advance
Taylor added, "While liquid cooling adoption in retrofits might take a temporary pause, purpose-built AI data centers designed for high-performance workloads will continue to move forward."
Despite the potential slowdown in retrofitting older facilities with liquid cooling, purpose-built AI data centers remain on track for expansion.
DeepSeek or no DeepSeek, companies investing in next-generation AI applications are still going to require high-density compute environments that push the limits of traditional cooling methods.
And the scale and complexity of future AI workloads will demand purpose-built facilities capable of integrating advanced liquid cooling solutions from the outset.
The Long-Term Case for Liquid Cooling
To that point, LiquidStack's Taylor also asserted, "In the long run, liquid cooling is here to stay—operators know they need to future-proof as AI models keep evolving in ways that can’t be fully predicted."
While operators are currently assessing their options, the industry's long-term trend points decisively toward more liquid cooling. However, the rapid evolution of AI models and workloads introduces an element of unpredictability that makes future-proofing essential.
Most data center operators recognize that traditional air cooling may not be sufficient to handle next-generation AI models, especially those requiring extreme densities and sustained compute performance. DeepSeek doesn't really change that outlook.
That's why we expect industry players such as LiquidStack, Submer, and Iceotope, among a host of others, to continue innovating in immersion and direct-to-chip liquid cooling solutions, reinforcing the expectation that these technologies will become standard in high-performance AI data centers.
Liquid Cooling at the Edge: A Growing Trend
Wrapping up our talk, Taylor also told DCF, "We’re also likely to see more interest in liquid cooling at the edge, where inference workloads will drive low-latency, high-density compute environments."
As AI inference workloads shift toward edge computing environments—where low latency and high-density processing are critical—liquid cooling adoption at the edge is undoubtedly poised for growth.
Unlike hyperscale AI training centers, edge deployments require compact, energy-efficient cooling solutions that can handle high-performance workloads in constrained spaces.
Emerging technologies, such as micro-modular liquid-cooled data centers, are being developed to support edge computing needs. These systems allow for efficient cooling in remote and distributed locations, ensuring that AI-powered applications can run smoothly while maintaining energy efficiency.
Conclusion: AI Disruption Drives a New Cooling Paradigm
The interplay between AI advancements, supply chain fluctuations, and evolving cooling needs is reshaping the data center industry. While some operators may delay liquid cooling retrofits in response to shifting workload requirements, purpose-built AI data centers and edge computing deployments will continue advancing liquid cooling adoption.
Meanwhile, DeepSeek’s emergence and NVIDIA’s supply chain challenges highlight the dynamic nature of AI infrastructure planning. Data center operators must remain agile, balancing short-term considerations with the need to invest in future-proof cooling solutions.
And liquid cooling remains a critical, if not the central, technology for handling high-density workloads, ensuring that AI’s evolution does not outpace the industry’s ability to support it. As AI applications continue to push the boundaries of computational intensity, liquid cooling will most likely remain a cornerstone of next-generation data center strategies.
At Data Center Frontier, we not only talk the industry talk, we walk the industry walk. In that spirit, DCF Staff members may occasionally employ AI tools to assist with editorial content. This article was created with help from OpenAI's GPT-4.
Matt Vincent
A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.