The Density Debate: Is Cooling Door Adoption a Sign of Coming Shift?

Dec. 17, 2015
Does an uptick in adoption of water-chilled cooling doors signal a trend towards higher rack densities in data centers? Colovore and LinkedIn are among those implementing rear-door cooling units.

SANTA CLARA, Calif. – Density is coming to the data center. But thus far, it’s been taking its time.

Over the past decade, there have been numerous predictions of the imminent arrival of higher rack power densities. Yet extreme densities remain limited, primarily seen in high performance computing (HPC) and specialty processing such as bitcoin mining.

The team from Colovore believes data centers will get denser, and that the shift will accelerate over the next few years. The colocation specialist sees itself on the front edge of a broader move to denser server cabinets, driven in part by a generational change in IT teams.

Traditional data hall designs will struggle to cool these higher densities. That’s why Colovore is filling its data center in Santa Clara with high-density racks featuring water-chilled rear-door cooling units.

“When we came to market last year, people weren’t buying the density yet,” said Sean Holzknecht, the co-founder and President of Colovore. “This year, everyone’s base requirement is 10kW at a minimum. It’s really flipped.”

Colovore is not alone in adopting rear-door heat exchangers at scale. LinkedIn is implementing a new data center design featuring rear-door cooling units for its new facility near Portland, Oregon. The company said its next-generation design will use “cabinet-level heat rejection” that will double the cabinet densities from its previous data center builds. The new data center will be hosted by Infomart, where LinkedIn has reportedly leased 8 megawatts of space.

As the rear-door cooling unit gains traction in both the colocation and hyperscale markets, some see it as one of several portents that rack power densities are finally starting to edge higher. It’s a trend that offers both challenges and opportunities for data center operators, and it is beginning to drive new data center designs like those at Colovore and LinkedIn.

Healthy Appetite for “Headroom”

How long has the data center industry been talking about the arrival of higher densities? My first story on the topic dates to 2002, when cooling vendors demonstrated water-cooled cabinets at a meeting of the 7×24 Exchange, and predicted a “new paradigm” in cooling. Some predictions, such as an industry norm of 150 watts per square foot by 2005, were premature. Others, like the shift to measuring density in watts per cabinet rather than watts per square foot, came to pass fairly quickly.

If density is a long-awaited problem, it’s also one that data center customers have been bracing for in their capacity planning, seeking “headroom” for denser workloads and often provisioning more cooling than they are likely to need.

“The reality is that everyone says they want 200 to 250 watts per square foot, but almost nobody’s using it,” said Jeff Burges, President and founder of colocation specialist DataSite. “There will be some high density users, but also a lot of low density users.”

The typical enterprise data center user is probably running densities of 3kW to 5kW per rack, according to Shawn Conaway, Director of Cloud Services at FIS. “What I see going on more often is pockets of high-density workloads, especially in internal private cloud, where you can see 10 to 15 kW,” said Conaway. “I think we’ll see more of this.”

Conaway said his firm, which specializes in IT solutions for the financial services industry, runs its own racks at 15kW to 30kW a cabinet. But there are those who are testing the boundaries of even higher densities.

What Density Means for Design

“We are pushing 50kW a rack,” said Richard Donaldson, the Director of Infrastructure Management and Operations at eBay. “We’re on our fifth generation server design, which means you have to have a data center that can support that. The technologies are shifting. We’re driving towards density because we’re driving toward the lowest cost.”

This data hall inside the eBay data center in Phoenix uses Motivair cooling doors. And also cool blue lighting. (Photo: eBay)

eBay was among the first companies to use water-chilled rear door cooling units at scale. In 2014, the company switched from in-row air cooling units to water-chilled rear door cooling units in a data hall in its Phoenix data center. The room housed 16 rows of racks, each housing 30kW to 35kW of servers, which required the use of six in-row air handlers in each row. That meant eBay had to sacrifice six rack positions for cooling gear. Switching to rear-door units from Motivair allowed eBay to recapture those six racks and fill them with servers, boosting compute capacity within the same footprint.

“I would argue that given where the technology is headed, we’re going to be seeing more density,” said Donaldson. “We’re now seeing densities shift from 1kW per rack to 5 kW a rack. That trend is coming. We’re already seeing it in Equinix and Digital Realty.”

That trend is being felt by data center developer DuPont Fabros Technology, which is updating its data center design to accommodate a wider range of rack densities.

“In our portfolio, we see a growing gap between our 3kW (per cabinet) customers and our 15kW customers,” said Scott Davis, Executive Vice President for Data Center Operations at DuPont Fabros. “In our latest product we have the ability to increase the density up to about 300 watts per square foot.”
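For readers trying to reconcile the two metrics that run through this story – watts per square foot and kilowatts per cabinet – the conversion comes down to how much gross floor area each cabinet occupies. The short Python sketch below illustrates the arithmetic; the figure of roughly 28 square feet of gross floor space per cabinet (the cabinet plus its share of aisles and circulation space) is an illustrative assumption, not a DuPont Fabros number.

```python
# Rough conversion between the two density metrics used in this story:
# whole-room watts per square foot and kilowatts per cabinet.
# GROSS_SQFT_PER_CABINET is an illustrative assumption (cabinet footprint
# plus its share of aisles), not a figure cited by DuPont Fabros.

GROSS_SQFT_PER_CABINET = 28


def to_kw_per_cabinet(watts_per_sqft: float, sqft_per_cabinet: float = GROSS_SQFT_PER_CABINET) -> float:
    """Convert a whole-room watts-per-square-foot figure to a per-cabinet load in kW."""
    return watts_per_sqft * sqft_per_cabinet / 1000


def to_watts_per_sqft(kw_per_cabinet: float, sqft_per_cabinet: float = GROSS_SQFT_PER_CABINET) -> float:
    """Convert a per-cabinet load in kW to a whole-room watts-per-square-foot figure."""
    return kw_per_cabinet * 1000 / sqft_per_cabinet


if __name__ == "__main__":
    # The 300 W/sq ft design figure cited by DuPont Fabros:
    print(f"300 W/sq ft is roughly {to_kw_per_cabinet(300):.1f} kW per cabinet")

    # The 3kW and 15kW customers mentioned above:
    for kw in (3, 15):
        print(f"{kw} kW per cabinet is roughly {to_watts_per_sqft(kw):.0f} W/sq ft")
```

At that assumed footprint, a 15kW cabinet works out to more than 500 watts per square foot – well past a 300-watt-per-foot design unless it gets more floor space or shares the room with lower-density cabinets, which is exactly the kind of variability Davis describes.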

DuPont Fabros and its cohorts in the multi-tenant data center space are adapting to increasingly variable requirements in both power density and reliability. For some service providers, higher densities offer the opportunity to specialize and differentiate.

A Look Inside Colovore

As you walk into Colovore’s data center, you get a sense of what a new paradigm in cooling might look like. The noise and rush of air familiar from many data centers are far more subdued here.

The cooling doors allow several significant changes in data center design. They attach to the back of a cabinet, and use the server fans within the rack to provide airflow through the unit, pushing hot air through the door-based coil that cools the air and returns it to the room at close to the same temperature as the air entering the rack.

These units can cool higher densities than air cooling – up to 35kW per rack – and eliminate the need to place CRACs (computer room air conditioners) around the perimeter of the room, making more room for cabinets. They also allow users to run the data hall at a warmer temperature, in this case just below 80 degrees.
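To get a rough sense of the water-side plumbing such a door implies, the sketch below estimates the chilled-water flow needed to carry a given rack load away, using the basic heat-transfer relationship Q = ṁ × c_p × ΔT. The 30kW load and 10 degree Celsius supply-to-return temperature rise are illustrative assumptions, not Liebert or Colovore specifications.

```python
# Estimate the chilled-water flow a rear-door heat exchanger needs to reject
# a given rack load, using Q = m_dot * c_p * delta_T.
# The rack load and water-side temperature rise below are illustrative
# assumptions, not Liebert or Colovore specifications.

WATER_SPECIFIC_HEAT_KJ_PER_KG_K = 4.186   # c_p of water
WATER_DENSITY_KG_PER_L = 1.0              # close enough at chilled-water temperatures
US_GALLONS_PER_LITER = 0.2642


def water_flow_for_load(rack_load_kw: float, delta_t_c: float) -> dict:
    """Return the water flow (L/min and US gal/min) needed to absorb rack_load_kw
    of heat with a supply-to-return temperature rise of delta_t_c degrees C."""
    mass_flow_kg_per_s = rack_load_kw / (WATER_SPECIFIC_HEAT_KJ_PER_KG_K * delta_t_c)
    liters_per_min = mass_flow_kg_per_s / WATER_DENSITY_KG_PER_L * 60
    return {
        "liters_per_min": liters_per_min,
        "us_gal_per_min": liters_per_min * US_GALLONS_PER_LITER,
    }


if __name__ == "__main__":
    # Assumed: a 30kW rack and a 10 degree C water-side temperature rise.
    flow = water_flow_for_load(rack_load_kw=30, delta_t_c=10)
    print(f"{flow['liters_per_min']:.0f} L/min ({flow['us_gal_per_min']:.1f} US gal/min)")
```

Under those assumptions, a 30kW rack needs on the order of 40 liters (about 11 gallons) of chilled water per minute, which is why the supply lines and under-floor piping described below matter as much as the doors themselves.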

“If you’re used to your data center being a meat locker, it’s odd to be in a new experience,” said Holzknecht. “In our facility, there’s no differential between the hot and cold aisle.”

Colovore executives Peter Harrison (left) and Sean Holzknecht inside the company’s data center in Santa Clara. (Photo: Rich Miller)

Instead of cold air, the area under the raised floor houses cooling pipes, which connect with a heat exchanger and cooling tower in the equipment yard. Industrial-strength hoses connect the water lines to the cooling door, which is made by Liebert (Emerson Network Power). The supply lines are integrated into the door hinges for protection. Valves are closed for cabinets that are not in use.

The cooling door seemed bound for great things after it prevailed in the “Chill Off,” the groundbreaking 2008 competition between cooling vendors, where it was judged to be the most efficient approach to cooling a 10kW rack. But the design has since been adopted sparingly as the data center industry shifted focus to free cooling using fresh air. Colovore believes the cooling door is an idea whose time has come.

“It’s an incredibly efficient system, and you can turn it on one cabinet at a time,” said Holzknecht. “Because we are so dense, we can put the same amount of gear in fewer cabinets. It’s less expensive to build densely. It’s cheaper for us as a provider as well. Building denser is always less expensive.”

“The revenue per cabinet is so much more than the additional cost to build per cabinet,” said Peter Harrison, the CTO of Colovore and a co-founder along with Holzknecht and CFO Ben Coughlin. “I think we’ll start seeing more colo providers using this technology. We’re definitely early adopters. The acceptance will be accelerating.”

Colovore’s interest in density was tied to its original plan to create multi-use facilities in urban areas that combined high-density data centers with co-working space. That changed when the Colovore team discovered the Santa Clara property, which had 9 megawatts of power capacity ready to go and easy expansion through an adjacent Silicon Valley Power substation.

Differentiator in a Competitive Market?

In the crowded Silicon Valley market, designing for density turned out to be a differentiator for Colovore.  “You’ve got a lot of good-looking data centers that can only handle a little density,” said Holzknecht. “Everyone’s migrating out of these telcos. A lot of their facilities are 20 years old. A lot of them are realizing they have to retrofit, and they don’t have the appetite for the investment.”

The ability to cool dense workloads has already won the company some deals. One cloud tenant arrived with a 750kW requirement. “We won the deal because we could do that (load) in 50 cabinets and just 2,500 square feet,” said Holzknecht.
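Those deal numbers are easy to check: 750kW across 50 cabinets in 2,500 square feet works out to 15kW per cabinet and 300 watts per square foot, right at the top of the ranges discussed earlier in this story.

```python
# Quick check of the densities implied by the deal described above:
# 750kW of load across 50 cabinets in 2,500 square feet.
total_kw = 750
cabinets = 50
square_feet = 2500

print(f"{total_kw / cabinets:.0f} kW per cabinet")         # -> 15 kW per cabinet
print(f"{total_kw * 1000 / square_feet:.0f} W per sq ft")   # -> 300 W per sq ft
```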

Colovore’s facility was owned by Les Pelio, a veteran developer of data center properties in Silicon Valley, who initially positioned it as a “container colo” center to house data center modules. “The big thing (Pelio) did is that there’s 9 megawatts of power already here,” said Holzknecht. “We can build the next three phases at once if the demand calls for it. We were fortunate to find this place and work with an experienced data center developer.”

The Colovore equipment yard features a heat exchanger and cooling tower. (Photo: Rich Miller)

Colovore has built out the first 2 megawatts of data center space, which is rapidly filling, and will soon add another 2 megawatt data hall. As it continues to grow, Colovore can add two more 2.5 megawatt phases to reach its 9 megawatt capacity. Each time it adds a data hall, it will also install skid-mounted modular power room equipment from Digital Realty.

Digital Realty is both a neighbor and partner. The Colovore facility on Space Park Drive is virtually surrounded by data centers owned by Digital Realty, which is also an investor in Colovore. Scott Peterson, the Chief Investment Officer at Digital Realty, is a member of Colovore’s board.

“We’ve been friends with the (Digital Realty) management team forever,” said Holzknecht. “We’re kind of a petri dish for a company the size of Digital Realty. These guys are in every market we want to go into, so we may want to partner with them. They have properties that are not active facilities, and that’s one of the things we’re interested in. But first we’ve got to execute here in Santa Clara.”

Generational Shift Could Accelerate Density

An interesting wrinkle is that Colovore believes the presence of younger engineers in data center teams is changing attitudes about density and water cooling.

“It’s a generational thing in a lot of ways,” said Holzknecht. “Our selling cycle is about engineers. We’re seeing a lot of 200kW to 1 MW deals, and internal infrastructure is often the requirement. At some of these companies, everything is virtualized and cloudified. Instead of having lazy servers being underutilized, they have grids.”

Harrison, who worked at Google before teaming with Holzknecht and CFO Ben Coughlin to found Colovore, agrees that the company has done particularly well with “cloud native” companies that are comfortable with DevOps principles.

“A lot of them grew up with PC gaming and water cooling right in their living room,” said Harrison. “A lot of the younger companies are buying new equipment, and new equipment needs density. (Density is) a big deal for many of these customers.”

Does the adoption of cooling doors portend a change in density? At Data Center Frontier we will be tracking the latest trends in cooling and data center design.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
