Equinix Metal Hopes to Adopt Liquid Cooling Through Open19

Oct. 7, 2021
Equinix has been testing the use of liquid cooling in its data centers, and hopes to use the technology in its Equinix Metal service to create a high-density, energy-efficient computing platform. The initiative will not impact current colocation customers.

Like most data center operators, Equinix uses air to cool the servers in its facilities, where it leases space to customers. Liquid cooling is used primarily in technical computing, where it enables the use of more powerful hardware that consumes more energy.

Those more powerful servers will soon be coming to more data centers and will be needed to support growing workloads. The company sees a potential breakthrough in a new liquid cooling design from open hardware consortium Open19 that could support several different types of liquid cooling.

Equinix has been operating liquid cooling labs for over a year and has “significant experience with various technologies,” said Zac Smith, the Managing Director for Equinix Metal, who outlined the case for new approaches in a blog post titled The Liquid Cooling Imperative. “We have an opportunity to start a network effect for liquid cooling, and it starts with us bringing liquid cooling into our facilities in a scalable way with Open19,” said Smith. “As Equinix Metal, we hope to be the ‘anchor tenant’ for liquid cooling inside of our data centers.”

Equinix colocation customers won’t be affected by the Metal service’s planned use of liquid cooling. Colocation customers control their own hardware, with Equinix providing space in cabinets and cages where clients can install equipment.

Metal is slightly different, as it is a cloud-style service where Equinix manages the hardware and leases capacity on bare metal servers, which can be provisioned through a web interface. It’s an approach that lets users rapidly deploy physical servers with Equinix and easily connect them with cloud platforms to create hybrid IT configurations. Equinix manages the physical environment, including the cooling.
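For readers curious what that provisioning step looks like in practice, here is a minimal sketch using Equinix Metal's REST API, which the service exposes alongside the web interface. The token, project ID, plan, metro and operating system values below are illustrative placeholders, and the exact endpoint and field names should be checked against Equinix's API documentation.

```python
# Minimal sketch: requesting a bare metal server from Equinix Metal via its
# REST API using the 'requests' library. All credential and slug values are
# placeholders, not real deployments.
import requests

API = "https://api.equinix.com/metal/v1"
TOKEN = "YOUR_API_TOKEN"          # personal or project API token (placeholder)
PROJECT_ID = "YOUR_PROJECT_ID"    # the Metal project to deploy into (placeholder)

resp = requests.post(
    f"{API}/projects/{PROJECT_ID}/devices",
    headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
    json={
        "hostname": "metal-demo-01",      # illustrative hostname
        "plan": "c3.small.x86",           # server type slug (illustrative)
        "metro": "da",                    # metro code, e.g. Dallas (illustrative)
        "operating_system": "ubuntu_20_04",
    },
    timeout=30,
)
resp.raise_for_status()
device = resp.json()
# The server provisions asynchronously; poll the device until it is active.
print(device["id"], device["state"])
```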

Powerful Processors Push the Envelope on Cooling

The Equinix announcement is notable because the company is the largest provider of colocation and interconnection services, with 10,000 customers and 229 data centers across the globe. Interest in liquid cooling was heightened by the recent news that Microsoft is using immersion-cooled servers in production and believes the technology can help boost the capacity and efficiency of its cloud data centers.

In exploring liquid cooling, both Equinix and Microsoft are responding to the growing adoption of artificial intelligence (AI) and other high-density applications, which pose challenges for data center design and management. Powerful new hardware for AI workloads is packing more computing power into each piece of equipment, boosting the power density – the amount of electricity used by servers and storage in a rack or cabinet – and the accompanying heat.

“All signals suggest that over the next several years, our industry will be defined by power hungry computers that have bigger chips packed with smaller nanometers, more cores, bigger dies (chiplets anybody?), faster memory, smarter NIC’s and tons of accelerators,” Smith writes.

“Today the most amount of power we can put into one of our Open19 server ‘bricks’ is 400 watts, and we need to drive that number much higher to power the silicon that is coming down the pike.”

This creates a sustainability challenge. Like most data center operators, Equinix is focused on climate change and reducing the impact of its operations on the environment. More hot processors require more cooling, and both consume more energy.

This is driving Equinix’s interest in liquid cooling, Smith said.

Open19 Approach Offers an Additional Way Forward

In addition to leading Equinix Metal, Smith is President of the Open19 Foundation, an open hardware initiative focused on edge and enterprise computing. It was formed in 2017 as an alternative to the Open Compute Project, founded by Facebook, which has created a hardware ecosystem focused on hyperscale users.

Open19 works to create servers and racks that are simple to maintain, using cable-free installation. Smith says the group is close to finalizing a design that can apply the same approach to liquid cooling.

“An Open19 working group has designed and proposed a new ‘plug and play’ blind-mate coupler for liquid cooling systems,” Smith said. “Executed thoughtfully, we believe this design can support all the major liquid cooling technologies, including immersive, single phase and dual phase — all while maintaining a common standard that would bring economic and adoption benefits.”

Smith said adoption of liquid cooling has been complicated by the array of competing approaches, which include bringing liquid to the server chassis and chip level as well as immersion strategies that place servers in tanks of coolant, an approach that is itself divided into camps for single-phase and dual-phase cooling.

“The complexity of deploying these two pieces of the puzzle together has limited liquid cooling adoption to at-scale users, but we think Open19’s new blind mate approach is one way of dramatically lowering this barrier to entry.”

Liquid Cooling Beyond Cloud and HPC?

A number of cooling specialists are participating in Open19, including Vertiv, Zutacore, Delta, Submer and Schneider Electric. Smith argues that standardization through Open19 and similar approaches – parallel initiatives are underway at the Open Compute Project and Intel – can help build a more efficient ecosystem for liquid cooling.

“If the colocation industry can support the mechanical side of liquid cooling (as well as the practical issues of training, certification, regulation, maintenance) then I think server manufacturers will have confidence to create liquid-capable solutions that can be deployed in a wide variety of data centers,” said Smith. “I call it solving the chicken and egg problem for liquid cooling, and by doing this work in the open with the Linux Foundation, I’m hopeful that we can spark an industry-wide movement.”

Although the Equinix liquid cooling initiative is focused on Metal, Smith hopes that it can “pave the way for other customers who build their own infrastructure and might have a preference as to how they want their equipment cooled.” “That’s why Open19’s goal of blind-mate, leak-free connectors and manifolds is so important,” he said. “From single-phase and two-phase direct-to-chip cooling to immersion and air assist, customers will be able to cool what and how they want.”

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
