GRC Pushes Density Limits with Support for 200 kW Immersion Racks

May 26, 2021
GRC (Green Revolution Cooling) has launched a new immersion cooling design for extreme-density computing that can support up to 200 kilowatts (kW) of server capacity.

At DCF we continue to closely track trends in IT rack density and its impact on data center cooling systems. That includes the high-performance computing (HPC) sector, which has been the primary venue for extreme density installations of 30 kW per rack and beyond.

GRC has launched a new cooling design for extreme density. GRC’s ICEraQ Series 10 immersion cooling module includes a coolant distribution unit (CDU) that can support up to 200 kilowatts (kW) of capacity using a warm water supply, and up to 368 kW with chilled water. The design also allows ICEraQ modules to be positioned end-to-end, fitting snugly against one another so they use less floor space.
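As a rough illustration of what those capacity figures imply for the facility water loop, the heat rejected must equal mass flow × specific heat × temperature rise. A minimal sketch, assuming an illustrative 10 °C water-side temperature rise (an example value, not a GRC specification):

```python
# Illustrative estimate of the facility water flow needed to carry away
# the ICEraQ Series 10's rated heat loads. The 10 C temperature rise is
# an assumed example value, not a GRC specification.

C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

def water_flow_kg_per_s(heat_watts: float, delta_t_kelvin: float) -> float:
    """Mass flow required so that flow * c_p * delta_T equals the heat load."""
    return heat_watts / (C_P_WATER * delta_t_kelvin)

for label, load_kw in [("warm water", 200), ("chilled water", 368)]:
    flow = water_flow_kg_per_s(load_kw * 1000, delta_t_kelvin=10.0)
    print(f"{label}: {load_kw} kW needs ~{flow:.1f} kg/s (~{flow * 60:.0f} L/min)")
```

Under these assumptions, the 200 kW warm-water rating works out to roughly 5 kg/s of water, and the 368 kW chilled-water rating to roughly 9 kg/s; a smaller temperature rise would require proportionally more flow.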

GRC was one of the early players in immersion cooling, unveiling its first commercial offering in 2010. The Austin-based company is drawing on a decade of experience to refine its immersion deployments, with the ICEraQ Series 10 design boosting density both inside and outside the module.

“As the next generation of data center immersion cooling solutions, the Series 10 builds on our successful deployments and customer input to improve usability and functionality, with an easy-to-use rack design and a clean aesthetic,” said Peter Poulin, CEO of GRC. “It’s exciting to bring a new form into the market and we look forward to offering this immersion cooling solution to customers struggling with data center cooling challenges.”

Next month, the Series 10 will be deployed at the Texas Advanced Computing Center (TACC), which has been working closely with Austin-based GRC since its launch. That includes cooling for the GPU-intensive subsystem of the Frontera supercomputer, the ninth-fastest supercomputer in the world.

The GRC ICEraQ Series 10 immersion cooling module for data centers. (Image: GRC)

AI, Denser Clouds Boost Immersion Cooling

GRC has been at the forefront of the effort to increase the use of liquid cooling in the data center industry. Rather than relying on cold air, GRC submerges servers in a tank of liquid coolant: servers are inserted vertically into slots in an enclosure filled with a dielectric fluid similar to mineral oil, which transfers heat almost as well as water but doesn’t conduct an electric charge.

This approach offers potential economic benefits by allowing data centers to operate servers without a raised floor, computer room air conditioning (CRAC) units or chillers. Last year GRC raised $7 million to accelerate the development of its immersion cooling technology.

The vast majority of data centers continue to cool IT equipment using air, while liquid cooling has been used primarily in HPC. With the emergence of cloud computing and “big data,” more companies are facing data-crunching challenges that resemble those seen by the HPC sector, which could make liquid cooling relevant for a larger pool of data center operators. Microsoft recently began using immersion-cooled servers in production as it seeks to manage rising power densities and heat in its Azure Cloud data centers.

“With companies such as Microsoft adopting liquid immersion cooling for high-density computing applications, our vision of re-imagined data center cooling is further validated,” said Poulin.

Microsoft is using two-phase immersion cooling, in which servers are immersed in a coolant fluid that boils off as the chips generate heat, removing the heat as it changes from liquid to vapor. The vapor then condenses into liquid for reuse, all without a pump. GRC is the leading player in single-phase immersion, in which the coolant remains liquid and circulates through a coolant distribution unit (CDU), transferring heat to a water-cooling loop.
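The difference between the two approaches comes down to simple heat arithmetic: a single-phase fluid absorbs sensible heat as its temperature rises, while a boiling two-phase fluid absorbs latent heat at a roughly constant temperature. A hedged sketch with representative fluid properties (the numbers are illustrative assumptions, not specifications of GRC’s or Microsoft’s actual coolants):

```python
# Compare heat absorbed per kilogram of coolant in single-phase vs
# two-phase immersion. All property values below are illustrative
# assumptions, not specifications of any vendor's fluid.

def sensible_heat_j_per_kg(c_p: float, delta_t: float) -> float:
    """Single-phase: heat absorbed as the liquid warms by delta_t."""
    return c_p * delta_t

def latent_heat_j_per_kg(h_vap: float) -> float:
    """Two-phase: heat absorbed as the liquid boils, at constant temperature."""
    return h_vap

# Assumed values: c_p ~ 1,100 J/(kg*K) with a 15 K rise for a
# mineral-oil-like fluid; h_vap ~ 100,000 J/kg for an engineered
# two-phase fluid.
single = sensible_heat_j_per_kg(c_p=1100.0, delta_t=15.0)
two_phase = latent_heat_j_per_kg(h_vap=100_000.0)
print(f"single-phase: {single / 1000:.1f} kJ/kg, two-phase: {two_phase / 1000:.1f} kJ/kg")
```

Under these assumed numbers, each kilogram of boiling fluid carries several times more heat than a kilogram of warming liquid, which helps explain how a two-phase system can move heat to the condenser without a pump, while a single-phase system relies on circulation through the CDU.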

The Series 10’s racks have 42U of space for servers and can accommodate up to four PDUs mounted at the rear of the rack. Networking and power connections are accessible by opening the top lid of the tank.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.

