
As Power Densities Rise, Providers Lead With Cooling

By Rich Miller - April 14, 2017

High-density racks inside the Colovore data center in Santa Clara, Calif. (Photo: Rich Miller)

SANTA CLARA, Calif. – If the data center industry had a Facebook profile, its relationship with density would be …. “It’s complicated.”

New hardware for cloud computing and machine learning is bringing beefier workloads into the data center. Meanwhile, overall rack densities are trending slightly higher, but not at the rate all this new data-crunching might suggest.

The disconnect, according to Ben Coughlin, is created by the industry’s conservative approach to managing space and compute power. Coughlin, the co-founder and Chief Financial Officer of Colovore, says many data center managers are spreading out rather than densifying – breaking up workloads across more cabinets rather than trying to manage fewer racks that are fully loaded with IT equipment.

“If you compressed the IT load into those empty spaces (in cabinets), you’d find much higher rack densities,” said Coughlin.

Focus on ‘Footprint Efficiency’

Colovore is seeking to address this problem of “footprint efficiency” with a Santa Clara colocation facility designed to support higher rack densities. “We came to market with the idea of ‘what if people could fill their racks,’ ” said Colovore President and co-founder Sean Holzknecht. “Our goal is to build the density that allows people to fill their racks again. People say ‘the density thing didn’t happen.’ In a lot of cases, it’s because people aren’t filling their racks to the top.”

The Colovore team says the growth of cloud computing and artificial intelligence (AI) is bringing more high-density workloads into data centers, and customers are increasingly willing to put more equipment into their cabinets and to seek out facilities that can cool them. This issue is coming to the fore in markets with a limited supply of data center space, like Santa Clara, placing a premium on getting the most mileage out of every rack.

“We see plenty of opportunity,” said Coughlin, noting that Colovore has filled its first two megawatts of capacity and is about to bring another two megawatts online. The first phase was designed to handle power densities of 20 kilowatts (kW) per cabinet, but the new space will be engineered for 35 kW per cabinet.

“We’ve been pushed on this,” he said. “Some of the highest performing clients are pushing higher densities.”

Most of these customers aren’t pushing Colovore to its limits. Rack densities average 11 kW at present, with some clients hitting 18 kW. Coughlin expects that average to rise to 13 kW in the next phase, and some customer racks may approach 30 kW or more.
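
To put those numbers in perspective, here is a rough back-of-the-envelope sketch. The 2 MW phase size and the per-cabinet densities come from the figures above; the assumption that the full phase capacity is available to IT load (ignoring cooling and other overhead) is ours:

```python
# Back-of-the-envelope capacity math using the figures quoted above.
# Assumes the full 2 MW of each phase is available to IT load
# (cooling and other overhead are ignored for simplicity).

PHASE_CAPACITY_KW = 2_000  # each Colovore phase is roughly two megawatts

densities_kw_per_cabinet = {
    "Phase 1 design density (20 kW)": 20,
    "Phase 2 design density (35 kW)": 35,
    "Current average density (11 kW)": 11,
    "Expected next-phase average (13 kW)": 13,
}

for label, kw in densities_kw_per_cabinet.items():
    cabinets = PHASE_CAPACITY_KW / kw
    print(f"{label:38s} -> ~{cabinets:.0f} fully loaded cabinets per phase")
```

At an 11 kW average, a 2 MW phase is spread across roughly 180 cabinets; at the 20 kW design density the same power fits in about 100. That gap is the “footprint efficiency” Coughlin describes.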

Cloud, AI Changing the Game

Over the past decade, there have been numerous predictions of the imminent arrival of higher rack power densities. Yet extreme densities remain limited, seen primarily in high-performance computing (HPC). The consensus view is that most data centers average 3 kW to 6 kW per rack, with hyperscale facilities running at about 10 kW per rack.

That’s changing with the emergence of power-intensive hardware like graphics processing units (GPUs) and other specialized chips for AI and machine learning workloads.
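
As a rough illustration of why GPU hardware pushes densities up, a rack’s power draw can be estimated by multiplying servers per rack by power per server. The configuration below is hypothetical, not a figure from the article, and the wattages are illustrative assumptions:

```python
# Hypothetical GPU rack density estimate; all figures are illustrative
# assumptions, not values reported in the article.

GPUS_PER_SERVER = 8
WATTS_PER_GPU = 300          # assumed accelerator nameplate power
HOST_OVERHEAD_WATTS = 800    # assumed CPUs, memory, fans, power supplies
SERVERS_PER_RACK = 8         # assumed count of GPU chassis per rack

watts_per_server = GPUS_PER_SERVER * WATTS_PER_GPU + HOST_OVERHEAD_WATTS
rack_kw = watts_per_server * SERVERS_PER_RACK / 1000

print(f"Estimated draw per server: {watts_per_server} W")
print(f"Estimated rack density:    {rack_kw:.1f} kW")
```

Even this modest hypothetical configuration lands around 25 kW per rack, well above the 3 kW to 6 kW that most data centers are built around.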

Colovore is among a small but growing number of multi-tenant data center providers that are optimizing for high-density customers. “As the industry adapts to these designs, customers will use it,” said Peter Harrison, Colovore’s Chief Technical Officer. “The customers don’t value engineering because the (data center) industry isn’t adapting as fast as they are.”

Colovore’s Santa Clara data center uses racks equipped with water-chilled rear-door cooling units from Vertiv (previously Emerson Network Power). LinkedIn is using rear-door cooling units from Motivair for its new facility near Portland, while newcomer Nautilus Data Technologies is planning to offer racks with rear-door heat exchangers from ColdLogik.

Colo Providers Lead With Containment

Several large colocation providers have been working the high-density space for years. These include the Switch SUPERNAPs in Las Vegas (and now Reno and Michigan as well) and CyrusOne, which built its business by hosting seismic exploration data processing for oil and gas firms in the Houston area.

Another colocation company that sees an opportunity in high-density workloads is ViaWest, which operates 29 data centers across the Western U.S. and Canada.

“There are applications that require this kind of high-density cooling, and we know that densities are going to increase,” said Dave Leonard, the Chief Data Center Officer at ViaWest. “We especially see the emerging use of hyper-converged computing architectures, which are capable of extremely dense deployments. We want to provide maximum flexibility for our clients so they can take any path in computing infrastructure that they want to.”

The primary high-density strategy for most multi-tenant providers is containment, which creates a physical separation between cold air and hot air in the data hall. ViaWest is taking a slightly different approach that combines partial containment with instrumentation and monitoring.

The company created a test cage outfitted with load banks (equipment that simulates an electrical load for purposes of testing and commissioning) at its Plano, Texas data center to demonstrate its capabilities.

“The cold aisle air is managed in a few key ways,” said Leonard. “We have standard cabinet blanking in place to avoid hot and cold air mixing through the cabinets. We put a door on the cold aisle end to avoid hot/cold air mixing around the end of the cabinet rows, and we put a two-foot high bathtub (perimeter) around the top of the cold aisle to keep the cold air in the cold aisle, and not allow hot/cold air mixing across the top of the rows. We also used a six-foot cold aisle with floor tile grates to allow enough CFM of cold air into the cold aisle.”

ViaWest leaves the top of the cold aisle open, rather than enclosing it with ducting, as is the case with many high-density solutions. The company monitors air pressures and temperature at multiple points within the environment to ensure the equipment is being cooled properly.

The ViaWest approach supports higher density in each rack, but also uses more floor space than a standard layout, as it features a six-foot-wide cold aisle.
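
Leonard’s point about delivering enough CFM can be approximated with the standard sensible-heat rule of thumb for air at sea level (BTU/hr ≈ 1.08 × CFM × ΔT°F, with 1 W ≈ 3.412 BTU/hr). The sketch below is illustrative only; the 20°F temperature rise across the IT equipment is an assumption, not a figure from the article:

```python
# Approximate airflow needed to remove a rack's heat load with air cooling.
# Rule of thumb: BTU/hr = 1.08 * CFM * delta_T (deg F); 1 W = 3.412 BTU/hr.
# The 20 F rise from cold-aisle supply to hot-aisle return is an assumed value.

DELTA_T_F = 20.0

def required_cfm(rack_kw: float, delta_t_f: float = DELTA_T_F) -> float:
    """Cubic feet per minute of cold air needed to absorb rack_kw of heat."""
    watts = rack_kw * 1000
    return watts * 3.412 / (1.08 * delta_t_f)

for rack_kw in (5, 11, 20, 35):
    print(f"{rack_kw:>2} kW rack: ~{required_cfm(rack_kw):,.0f} CFM of cold air")
```

At 20 kW and above, the airflow requirement runs into the thousands of CFM per rack, far more than a single perforated floor tile typically delivers, which is one reason the wide cold aisle and grate tiles matter.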

Cabinet-Level Cooling Solutions

Several high-density specialists are using custom cabinets, a group that includes San Diego provider ScaleMatrix, which effectively shrinks the data center into a single cabinet by containing and compartmentalizing workloads.

The ScaleMatrix Dynamic Density Control cabinet is an extra-wide enclosure with two compartments: a bottom compartment housing a four-post rack (with extra depth to accommodate HPC servers), and a top compartment containing a heat exchanger. Air is recirculated within the cabinet, with cool air delivered to server inlets via an 8-inch air plenum in the front of the rack. Hot exhaust air exiting the back of the equipment rises to the cooling compartment, where it is cooled and recirculated.

GPU hosting specialist Cirrascale hosts some equipment with ScaleMatrix, but has also patented a design for a vertical cooling technology in which cold air from under the raised floor enters the bottom of the cabinet directly through a perforated tile. It then flows vertically through the cabinet and is exhausted into a ceiling plenum.

Immersion Cooling

Servers immersed in a liquid cooling solution from Green Revolution Cooling. (Photo: Green Revolution)

Over the past year we’ve profiled several companies developing liquid or refrigerant cooling systems for ultra-high densities. The vast majority of data centers continue to cool IT equipment using air, while liquid cooling has been used primarily in high-performance computing (HPC). With the emergence of cloud computing and “big data,” more companies are facing data-crunching challenges that resemble those seen by the HPC sector, which could make liquid cooling relevant for a larger pool of data center operators.

Some recent examples:

  • Ebullient Cooling chills processors using Novec 7000, a liquid coolant from 3M that has proven popular in immersion cooling solutions for the bitcoin market. Instead of dunking servers in a bath, Ebullient delivers the dielectric fluid directly to the processor, using a piping system to bring the liquid inside the server chassis.
  • Green Revolution Cooling submerges servers into a dielectric fluid similar to mineral oil. Servers are inserted vertically into slots in the tank, which is filled with 250 gallons of ElectroSafe fluid, which transfers heat almost as well as water but doesn’t conduct an electric charge.
  • The Aquarius server from Aquila Systems is a new server offering warm-water liquid cooling for hyperscale data centers using Open Compute designs. The Aquarius system uses a cooling design by Clustered Systems, with water piping that cools a flat plate atop the components.

This all adds up to a growing universe of designs and options for data center operators grappling with the growth of high-density workloads. Given the varying use cases and appetites for risk, the industry’s approach is likely to remain diverse and complicated for some time to come.
