The Immersion Supercomputer: Extreme Efficiency, Needs No Water

July 6, 2015
A supercomputer immersed in tanks of liquid coolant? This sci-fi scenario has created a real-world scientific computing powerhouse.

It sounds like science fiction: Take a supercomputer and immerse it in tanks of liquid coolant that can be kept cool without the use of water. This sci-fi scenario has created a real-world scientific computing powerhouse.

The Vienna Scientific Cluster uses immersion cooling, dunking Supermicro servers into a dielectric fluid similar to mineral oil. Servers are inserted vertically into slots in a tank filled with 250 gallons of ElectroSafe fluid, which transfers heat almost as well as water but doesn’t conduct an electric charge.

The system has emerged as one of the world’s most efficient supercomputers, as measured by Power Usage Effectiveness (PUE), the leading metric for the efficiency of data center facilities. The Vienna Scientific Cluster 3 system touts a mechanical PUE of just 1.02, meaning the cooling overhead amounts to just 2 percent of the energy delivered to the servers. A mechanical PUE doesn’t account for energy loss through the power distribution system, which means the actual PUE would be slightly higher.

The end result: 600 teraflops of computing power uses just 540 kilowatts of power and 1,000 square feet of data hall space.
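As a rough illustration (not a calculation from the article), here is how those headline numbers relate, assuming the 540 kilowatts refers to the IT load delivered to the servers:

```python
# Back-of-the-envelope sketch of the efficiency figures quoted above.
# Assumption: the 540 kW figure is the IT load delivered to the servers.
it_load_kw = 540.0
cooling_overhead_kw = it_load_kw * 0.02      # a 1.02 mechanical PUE implies ~2% cooling overhead

mechanical_pue = (it_load_kw + cooling_overhead_kw) / it_load_kw   # ~1.02
gflops_per_watt = (600.0 * 1000) / (it_load_kw * 1000)             # 600 TF / 540 kW ~ 1.1 GF/W
watts_per_sqft = (it_load_kw * 1000) / 1000.0                      # ~540 W per square foot of data hall

print(f"mechanical PUE ~ {mechanical_pue:.2f}, ~{gflops_per_watt:.1f} GF/W, ~{watts_per_sqft:.0f} W/sq ft")
```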

“We are very impressed by the efficiency achieved with this installation,” said Christiaan Best, CEO and founder of Green Revolution Cooling, which designed the immersion cooling system. “It is particularly impressive given that it uses zero water. We believe this is a first in the industry.”

Why Liquid Cooling Matters

Liquid cooling can offer clear benefits in managing compute density and may also extend the life of components. The vast majority of data centers continue to cool IT equipment using air, while liquid cooling has been used primarily in high-performance computing (HPC). With the emergence of cloud computing and “big data,” more companies are facing data-crunching challenges that resemble those seen by the HPC sector, which could make liquid cooling relevant for a larger pool of data center operators.

Last fall at the SC14 conference, a panel of HPC experts outlined their expectation of a rapid expansion of liquid cooling beyond its traditional niches. At Data Center Frontier we’ll be tracking this transition, and keeping readers posted on relevant innovations in liquid cooling, such as the water-less implementation in Vienna.

These enclosures house the 2,020 compute nodes of the Vienna Scientific Cluster 3, which are immersed in liquid coolant. (Photo: Green Revolution Cooling)

The Vienna Scientific Cluster combines several efficiency techniques to create a system that is stingy in its use of power, cooling and water.

Water management is a growing priority for the IT industry, as cloud computing concentrates enormous computing power in server farms supported by cooling towers, where water heated by the data center is cooled, with the heat removed through evaporation. Most of the water is returned to the data center cooling system, while some is drained out of the system to remove sediment.
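To put the water question in perspective, here is a hypothetical sketch of what a comparable load might consume with conventional cooling towers; the Water Usage Effectiveness (WUE) figure is a commonly cited industry rule of thumb, not a number from the article or this facility:

```python
# Hypothetical estimate of cooling-tower water use for a VSC-3-sized load.
# Assumption: WUE of ~1.8 liters per kWh of IT energy, a typical published figure
# for cooling-tower-based data centers -- not a measurement from this facility.
it_load_kw = 540.0
hours_per_year = 8760
assumed_wue_l_per_kwh = 1.8

annual_it_kwh = it_load_kw * hours_per_year                  # ~4.7 million kWh
annual_water_liters = annual_it_kwh * assumed_wue_l_per_kwh  # ~8.5 million liters
print(f"~{annual_water_liters / 1e6:.1f} million liters of water per year")
```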

The fluid temperature in the immersion tank is maintained by a pump with a heat exchanger, which is usually connected to a standard cooling tower. The Vienna Scientific Cluster instead uses a closed-loop dry cooler as the final method of heat rejection, requiring no water at all. Energy use may rise slightly in the summer, but the system’s PUE should still remain near the 1.1 to 1.2 level seen among leading hyperscale data centers.
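For a sense of scale, a minimal heat-balance sketch of such a coolant loop, using assumed fluid properties and temperature rise rather than VSC-3 figures:

```python
# Minimal heat-balance sketch for an immersion cooling loop (illustrative only).
# The fluid properties and temperature rise below are assumptions, not VSC-3 data.
heat_load_kw = 540.0        # nearly all of the IT load ends up in the coolant
cp_kj_per_kg_k = 2.0        # approximate specific heat of a mineral-oil-like fluid
density_kg_per_l = 0.85     # approximate density of a mineral-oil-like fluid
delta_t_k = 10.0            # assumed coolant temperature rise across the tanks

# Q = m_dot * cp * dT  ->  required mass and volume flow for the pump loop
mass_flow_kg_s = heat_load_kw / (cp_kj_per_kg_k * delta_t_k)   # ~27 kg/s
volume_flow_l_s = mass_flow_kg_s / density_kg_per_l            # ~32 liters per second
print(f"~{volume_flow_l_s:.0f} L/s of coolant flow to reject {heat_load_kw:.0f} kW")
```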

The novelty of the Vienna design is that it combines a water-less approach with immersion cooling, which has proven effective for cooling high-density server configurations, including high-performance computing clusters for academic computing, seismic imaging for energy companies, and even bitcoin mining.

Breaking the CRAC Habit

While not seen often in today’s enterprise and cloud data centers, liquid cooling isn’t new. If you’ve been around the industry for a few years, you’ll recall the days when water-cooled mainframes were standard in corporate data centers. But that soon shifted to racks of servers cooled by air using the familiar “hot aisle/cold aisle” design seen in most data centers today, with water chilling loops confined to the air handlers and “CRACs” (computer room air conditioners) housed around the perimeter of the data hall.

The alternative is to bring liquids into the server chassis to cool chips and components. This can be done through enclosed systems featuring pipes and plates, or by immersing servers in fluids. Some vendors integrate water cooling into the rear door of a rack or cabinet.

Immersion takes a different approach, sinking the equipment in liquid to cool the components.

Green Revolution Cooling is seeking to build awareness of the benefits of immersion cooling through displays such as this exhibit at SC14 in New Orleans. (Photo: Rich Miller)

Green Revolution has been at the forefront of the recent resurgence of interest in immersion cooling. In addition to supporting extreme power density, immersion cooling offers potential economic benefits by allowing data centers to operate servers without a raised floor, computer room air conditioning (CRAC) units or chillers. It also eliminates the need for server fans, which can be power hogs.
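As a hypothetical illustration of the fan-power point, assuming fans draw roughly 10 to 15 percent of server power in a comparable air-cooled system, a common rule of thumb rather than a figure from this installation:

```python
# Hypothetical fan-power savings for a load this size (all figures assumed).
it_load_kw = 540.0
fan_share_low, fan_share_high = 0.10, 0.15   # assumed share of server power drawn by fans

savings_low_kw = it_load_kw * fan_share_low
savings_high_kw = it_load_kw * fan_share_high
print(f"removing server fans could free roughly {savings_low_kw:.0f}-{savings_high_kw:.0f} kW")
```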

The VSC-3 was installed in 2014, with Green Revolution Cooling working alongside Intel, ClusterVision, and Supermicro. It supersedes the VSC-2 cluster, which used a rear-door cooling solution that achieved a mechanical PUE of 1.18. VSC-3 features 2,020 compute nodes, each with 16 processor cores, housed in the CarnotJet tanks.
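A quick arithmetic check of what those specs imply (the per-node power figure assumes the quoted 540 kilowatts is the IT load):

```python
# Quick arithmetic implied by the VSC-3 specs above.
nodes = 2020
cores_per_node = 16
total_cores = nodes * cores_per_node       # 32,320 cores in the CarnotJet tanks

it_load_w = 540_000                        # assumes the quoted 540 kW is the IT load
watts_per_node = it_load_w / nodes         # ~267 W per compute node
print(f"{total_cores} cores, ~{watts_per_node:.0f} W per node")
```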

The Cost Component of Cooling

Liquid cooling often requires higher up-front costs, which can be offset by savings over the life of a project. Economics were a key driver for the Vienna design.
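A purely illustrative payback sketch of that trade-off follows; every number below is an assumption chosen for the example, not a figure from GRC, ClusterVision, or the article:

```python
# Illustrative simple-payback calculation (all inputs are assumptions).
extra_capex_usd = 200_000          # assumed up-front premium for the liquid cooling system
power_avoided_kw = 80.0            # assumed cooling + fan power avoided versus air cooling
electricity_usd_per_kwh = 0.12     # assumed electricity price
hours_per_year = 8760

annual_savings_usd = power_avoided_kw * hours_per_year * electricity_usd_per_kwh
payback_years = extra_capex_usd / annual_savings_usd   # ~2.4 years with these assumptions
print(f"simple payback ~ {payback_years:.1f} years")
```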

“The value proposition (of the GRC system) was extremely impressive,” said Christopher Huggins, Commercial Director at ClusterVision, a leading European HPC specialist. “The whole data center and cluster was far less expensive than it would have been with any other cooling solution on the market. We are certain we will be using the GRC solution on more projects in the future.”

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.

