The Eight Trends That Will Shape the Data Center Industry in 2020

Jan. 6, 2020
What’s ahead for the data center industry? We’ve identified eight themes that will shape the data center business in 2020. The big theme: data, data everywhere.

What lies ahead for the data center industry in 2020? At Data Center Frontier our eyes are on the horizon, and we’re constantly talking with industry thought leaders to get their take on key trends. Our crystal ball did pretty well last year, so it’s time to look ahead at what’s in store for 2020.

We’ve identified eight themes that will shape the data center business this year. We’ll be writing in more depth about many of these trends in coming weeks, but this list provides a high-level view of the topics that we believe will be relevant this year.

1. The Data Tonnage Challenge Gets Real

In 2020 the explosive growth of data will be felt like never before. We believe this is a sign of things to come, as next-generation technologies transform how we store, manage and move data.

The data center will drive this disruption, and be shaped by it. Machine-to-machine (M2M) technologies will generate enormous volumes of data, which will be expensive to move. Data tonnage creates challenges in both the distribution and concentration of data. As datasets grow larger, they are testing the network requirements for analytics, AI and other data-intensive applications.

“By 2022, 70 percent of data will be created outside the data center or cloud, up from 40 percent today,” said David Cappuccio, Distinguished Analyst at Gartner. “How do we build a network to move that around?”

The answer is two-fold: Bigger and faster networks, along with distributed compute capacity to perform “data thinning” before sending business-critical datasets across the network.
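As a concrete illustration, here is a minimal, hypothetical sketch of edge-side “data thinning” (the function names, thresholds and data are illustrative assumptions, not drawn from any vendor): a large window of raw machine telemetry is reduced to summary statistics plus a handful of outliers before anything crosses the network.

```python
# Hypothetical sketch of "data thinning" at an edge node: a large window of
# raw machine-to-machine telemetry is reduced to compact summary statistics,
# plus any anomalous readings, before being sent to a central data center.
import random
import statistics

def collect_raw_readings(n=10_000):
    """Stand-in for a high-volume sensor stream (values are simulated)."""
    return [random.gauss(50.0, 5.0) for _ in range(n)]

def thin(readings, anomaly_threshold=3.0):
    """Return summary stats plus outliers beyond N standard deviations."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    anomalies = [r for r in readings if abs(r - mean) > anomaly_threshold * stdev]
    return {
        "count": len(readings),
        "mean": round(mean, 3),
        "stdev": round(stdev, 3),
        "min": round(min(readings), 3),
        "max": round(max(readings), 3),
        "anomalies": anomalies,  # only the exceptional raw values travel
    }

if __name__ == "__main__":
    raw = collect_raw_readings()
    summary = thin(raw)
    # 10,000 raw readings shrink to a few summary fields plus a few outliers.
    print(f"raw points: {summary['count']}, anomalies shipped: {len(summary['anomalies'])}")
```

In this sketch the business-critical detail (the anomalies) still crosses the network, while the bulk of the raw stream stays local for on-site analytics.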

At the core of the network, data gravity will create ever-larger concentrations of compute and storage resources – which will mean business growth at major data center hubs.

“In 2020, we’ll see enterprises tackle data gravity by bringing their applications closer to data sources rather than transporting resources to a central location,” said Chris Sharp, CTO of Digital Realty, one of the largest data center providers. “By localizing data traffic, analytics and management, enterprises will more effectively control their data and scale digital business.”

“As the data gets heavier, and denser, it gets harder to move,” said Sharp. “There are data lakes. Now there’s the data ocean.”

It also creates the potential for new extensions of cloud campuses, like CyrusOne’s plan to build a multi-tenant project near a Google data center cluster in Council Bluffs, Iowa.

“We are seeing the black hole and the data gravity effect of the major hubs,” said Jonathan Schildkraut, EVP and Chief Strategy Officer of CyrusOne. “Moving massive datasets around is very difficult. Datasets are going to get so large that they will need to be located next to the compute.”

2. The AI Arms Race Alters the Compute Equation

Artificial intelligence (AI) plays a starring role in this data tsunami. AI is a hardware-intensive computing technology that will analyze data both near and far. That includes everything from algorithm training at cloud campuses to inference engines running on smartphones.

AI can make products and services smarter. Every business yearns for that, which is why AI is emerging as a strategic priority.

“Machine learning is now table stakes for every tech company, large and small,” writes Fred Wilson, a partner at Union Square Ventures. “Using sophisticated machine learning models to personalize and improve your product is not a nice to have. It is a must have.”

A cluster of Cerebras CS-1 systems, which pack 400,000 cores (and up to 20 kW of power) into 15 rack units. (Photo: Cerebras)

That’s driving a hardware arms race, featuring more innovation than the chip sector has seen in years. Intel says AI is creating an “insatiable” demand for faster, more power-efficient computing hardware. It will be a busy year for Intel, and fellow incumbents NVIDIA and AMD.

In 2020 they’ll be joined by a cluster of AI hardware startups bringing their products to market. An early example is Cerebras Systems, which just debuted a system packing 400,000 compute cores into a 15U rackmount chassis.

Another startup called Groq says its new chipset is capable of 1 PetaOp/s performance on a single implementation – equivalent to one quadrillion operations per second.

These eye-popping specs have implications for the data center, including much higher rack densities and more liquid cooling. The Cerebras CS-1 system can require up to 20 kW of power, which at a 15U form factor implies a rack density of roughly 60 kW. It is liquid-cooled, and we can expect to see more liquid-to-the-chip and immersion solutions deployed to cool this new AI gear.
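The implied rack density is simple arithmetic; here is the back-of-the-envelope calculation, assuming a standard 45U rack (the rack height is our assumption, not a figure from Cerebras):

```python
# Back-of-the-envelope rack density implied by the CS-1 figures cited above:
# 20 kW in a 15U chassis. The 45U rack height is an assumed standard size.
system_power_kw = 20   # per CS-1 system, as cited above
system_height_u = 15   # rack units per system
rack_height_u = 45     # assumption: a common full-height rack

systems_per_rack = rack_height_u // system_height_u   # 3 systems
rack_density_kw = systems_per_rack * system_power_kw  # 60 kW per rack
print(f"{systems_per_rack} systems per rack -> ~{rack_density_kw} kW per rack")
```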

3. On-Site Power Generation and Climate Risk

Climate change is creating new challenges for the cost and availability of power. The data center industry will have to reckon with this trend in coming years, and we’re already seeing its impact.

As the 7×24 Exchange convened its fall conference in October, PG&E imposed rotating blackouts across California to de-energize its transmission lines in fire-prone areas. On the conference stage in Phoenix, executives from Equinix and Bloom Energy outlined the implications for the data center industry, predicting that on-site power generation will become more important.

“Having control over power generation locally is something we all have to think about,” said Craig Pennington, the Vice President of Global Design for Equinix. “Looking out at the news today, you can see why we should all be considering this. This is going to be an issue for a decade, and cost billions of dollars to fix.”

Pennington said Equinix anticipates “a significant increase in transmission costs (in California). I think there can be no doubt that’s going to happen.”

In 2019 the impact of climate change on the bottom line became more visible. The Wall Street Journal called PG&E “the first major corporate casualty of climate change,” as the utility filed for Chapter 11 bankruptcy as costs mounted from its role in the state’s deadly wildfires. Climate risk, along with the growing complexity of global energy delivery, will prompt more data center operators to integrate on-site generation or pursue other strategies beyond traditional utility power.

These Bloom Energy Server fuel cells provide power to the Equinix SV 5 data center in San Jose. (Photo: Rich Miller)

Equinix has been deploying Bloom Energy fuel cells in high-cost energy markets like California, New York and Massachusetts. The collaboration began in 2015 in PG&E territory in San Jose. Equinix projects that its footprint of 43 megawatts of fuel cells powered by natural gas will save the company $150 million over the next 15 years, with much of the savings being realized in California.

Cost isn’t the only factor that will prompt data center operators to consider on-site power. Availability and time-to-market are also issues. “In Ireland, there is insufficient available power (in some locations),” Pennington noted. “In other countries in Europe, the time to deliver power can be 3 to 4 years. We don’t want to be delaying our projects that long.”

4. Infrastructure Funds Target Green Power

Wave power. Carbon capture. Solar and wind power at scale. These may all play a larger role in how data centers are powered in the future, as infrastructure funds bring new resources to deploying IT capacity at scale. That capability, along with a growing mandate for corporations to shift to a greener energy footprint, lays the groundwork for new approaches to data center power.

“We have the option to do things in new ways,” said Phill Lawson-Shanks of Aligned Energy, which is backed by Macquarie Infrastructure Partners, the giant global infrastructure fund.

“Macquarie is very keen to support our environmental goals,” said Lawson-Shanks. “They have assets which use waste-to-energy, as well as carbon capture. We’re looking at how we can associate one of these plants with a data center build-to-suit.”

A similar strategy is in the works at Cologix. Its owner, Stonepeak Infrastructure Partners, is a big investor in renewable energy projects, says Bill Fathers, CEO of Cologix.

“We want to partner with these renewable energy companies to build data centers in proximity to those energy facilities,” said Fathers. “We’re looking at current energy use across our portfolio, and will pivot to renewable energy where it is available. We are pursuing this opportunity across a broad set of options.”

At DCF we’ve been tracking how infrastructure funds are driving the industrialization of data center delivery. Renewable energy looms as a key opportunity, as infrastructure funds have the deep pockets and relationships to tackle the challenging economics of deploying data center renewables at scale.

5. Data Center Districts Take Shape

Data center districts will emerge in major markets, driven by both market forces and public policy.

As data tonnage reinforces the importance of proximity, we’ll see more “knock-down” expansion projects, as developers buy adjacent properties and demolish existing buildings to create new data center campuses. This has been going on for years in Santa Clara, and is now happening in Ashburn as well. In major cloud corridors, economics will make data centers the most valuable use for real estate.

Meanwhile, local officials will seek to create data center corridors, concentrating digital infrastructure in locations that support future growth. This trend is driven by the growing awareness of the economic benefits of data centers, as well as the downsides of having a data center as your next-door neighbor.

An aerial view of major facilities in Data Center Alley in Ashburn, Virginia. (Image: Loudoun County)

Data centers are becoming part of the real estate landscape in many communities. DCF has been closely tracking this phenomenon in Loudoun County in Northern Virginia, where cloud clusters provide a massive economic boost. Direct tax revenue from data centers in Loudoun County will reach $320 million this fiscal year.

That’s why the county has launched a branding initiative to raise the profile of its Internet industry, with improved signage identifying the Ashburn data center district. This is a big change for an industry that once prioritized “security by obscurity.”

Loudoun County has also grappled with residents’ desire for better-looking data centers, while other locales have had public controversies about noise from cooling infrastructure. Another concern is power capacity, which is a factor in a temporary data center moratorium in Amsterdam.

Data center districts provide an efficient focal point for infrastructure, and will ease tensions with the communities where they are located. Expect to see more of them.

6. Cloud, Cloud, Cloud and Connectivity

No, I didn’t forget enterprise cloud computing. You no longer need a crystal ball to predict that enterprise cloud will continue to reshape the IT landscape. But if you need a reminder, here it is: The ongoing IT migration from on-premises data centers to colo and cloud providers is a multi-year transition. It’s a huge business, and is still gaining momentum. We’ll see the rational reallocation of enterprise IT resources continue across cloud platforms, colocation facilities and on-premises data centers in 2020.

The same holds true for cloud connectivity. It’s not an accident that our top trends – data tonnage and AI compute – will drive demand for better networks. That’s why hyperscale companies are deploying their own fiber and subsea cables, and SDN-powered cloud connectivity is hot. More of the same in 2020.

7. Edge Focus Shifts to Economics, Decoupling From 5G Hype

The data center industry remains surprisingly divided about the prospects for edge computing. The split is not about whether edge computing will be useful, but whether it will be profitable.

That’s why the economics of edge computing will come into sharp relief in 2020. End users and investors will focus on near-term cost/benefit analyses rather than long-term potential. Edge computing is a trend that will play out over many years, and has been boosted by enthusiasm over technologies with long deployment horizons.

This timeline will be the backstory behind many of the headlines in edge computing this year. We believe there will be successes and disappointments alike in 2020, and both the “edge boom” and “edge hype” scenarios will drive an active M&A landscape.

At DCF we continue to see a large opportunity for edge data centers, a view that is reinforced by many veterans of the data center industry.

“2020 will be a milestone year for edge computing now that so much of the foundational work has quietly been done behind the scenes over the past year,” said Chris Crosby, CEO of Compass Datacenters. “One of the key things to watch for will be the first commercial edge deployments, which will signify we’ve moved into a new phase in the maturation of the industry.”

Alan Boehme, the Global Chief Technology Officer for Procter & Gamble, speaks at Edge Computing World 2019 in Mountain View, CA. (Photo: Rich Miller)

Significantly, large enterprise users are affirming their readiness to invest in edge strategies, establishing a market beyond the telcos and content players.

“Edge computing will reduce our costs because we’re moving less data,” said Alan Boehme, CTO of Procter & Gamble. “The amount of data we deal with continues to grow, and will keep getting larger.”

Some industry watchers say it’s important to separate edge computing from the hype surrounding 5G wireless. This is especially true for the Internet of Things, where many use cases don’t require real-time low-latency connectivity.

“Factory automation won’t be tremendously juiced by 5G,” said Michael Dolbec, Senior Managing Director at GE Ventures, who focuses on industrial IoT. “It’s not being held back by 5G.”

8. Remote Management and Unstaffed Data Centers

There are more servers than people to manage them. To date, this challenge has been addressed primarily through automation, which enables a single sysadmin to manage tens of thousands of servers. But the server-to-admin ratio is becoming problematic: server growth continues to accelerate, while staff development is not keeping pace. A “grey tsunami” of retirements may arrive amid strong growth for edge computing, which will mean data centers in more locations.

That’s why the push to create and manage unstaffed data centers will come to the fore in 2020. This effort will be led by software, including offerings from DCIM specialists, equipment vendors such as Schneider Electric and Vertiv, and edge specialists like EdgeConneX and Vapor IO.

But this effort will also feature new approaches to design and even robotics. We’ve seen the early outline of this in TMGcore’s creation of robotics technology that can “hot swap” servers, removing a failed server from the immersion bath and replacing it with a fresh one. Another cooling company, Submer, is exploring similar concepts.

“With HPC and edge, there is an expectation that we’re nearing an inflection point for change,” said Scott Noteboom, the CTO of Submer. “We’ve begun to hit the wall in terms of machines working well in a human environment. It’s time for version 2 of the data center, which is an environment optimized for machines.”

What Else We’re Watching

Here are a few other trends that were contenders for the list, and that we’ll be watching closely in 2020:

Water Conservation: This is another climate-driven megatrend. Water will become a more valuable commodity in coming years, and data centers’ water use will draw greater scrutiny. This will drive new thinking in data center design and site selection.

Diversity in the Data Center: The industry will continue to grapple with long-running inequities in opportunities for women and persons of color. This reckoning is long overdue, and will be accelerated by staffing challenges in key sectors. There’s been welcome progress on this issue in recent years, but much remains to be done. This important work will continue in earnest in 2020.

Want to stay on top of these trends? Follow us on Twitter and Facebook, connect with me on LinkedIn, and sign up for our weekly newsletter.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
