Welcome to our 18th Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry, and where it is headed. Our First Quarter 2020 roundtable offers insights on four topics: the explosive growth of data and its impact on the enterprise, the pace of the rollout of 5G wireless, how the rise of AI is influencing liquid cooling, and the data center industry’s progress on diversity.
Here’s a look at our distinguished panel:
- Chris Sharp, Executive VP and Chief Technical Officer at Digital Realty.
- Gary Niederpruem, Chief Strategy and Development Officer at Vertiv.
- Kristen Kroll-Moen, Senior Director of Global Marketing at Chatsworth Products.
- Amber Caramella, Chief Revenue Officer at Netrality Data Centers and a member of the Advisory Council of Infrastructure Masons.
- Michael DeVito, SVP of Global Sales & Marketing at Iron Mountain Data Centers.
- Jeff Klaus, General Manager of Intel Data Center Software Solutions.
The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier. Each day this week we will present a Q&A with these executives on one of our key topics. We begin with a look at our panel’s predictions for the data center industry for 2020.
Data Center Frontier: As digital transformation gains momentum, enterprises are managing more data in more places. How will the explosive growth of data (“data gravity”) impact the growth and geography of digital infrastructure in 2020?
Chris Sharp: In 2020, and really for the next decade, we’ll see major cities and metros like New York, Tokyo, San Francisco, Singapore, and London become the world’s leading digital capitals. In fact, we recently reported that AI, IoT, 5G, and blockchain will further accelerate the explosion of data in these major areas, with the data generated expected to increase 60% annually over the next five years. The result is a growing IT imperative to distribute applications closer to this data by creating new centers of data in major metros around the world, creating the data gravity effect.
The accelerating speed of technological innovation is putting immense pressure on organizations and their IT infrastructure. As pointed out by 451’s enterprise IT research, 70% of enterprises plan to expand geographically within the next two years, creating urgency to modernize IT infrastructure to become distributed and decentralized. Companies must support their digital transformation strategies by maintaining centers of data where they do business, rather than working against data gravity and inefficiently transporting data to a central location. Gartner’s recent report predicts that by 2022, 60% of enterprise IT infrastructures will focus on centers of data, rather than traditional data centers.
With PlatformDIGITAL, we at Digital Realty are uniquely able to enable our customers to deploy their IT infrastructure at the centers of data exchange around the world, bringing users, things, applications, clouds, and networks to the data. We’re committed to providing fit-for-purpose infrastructure in 2020 and beyond, so that our customers can power their digital transformation at the scale and speed they require, no matter where they are located.
Michael DeVito: Data is geographically agnostic. It doesn’t care where it’s produced or lives. The people using that data, however, are a different story. These days, we expect things quickly. We want our packages delivered in two hours, our movies streamed without interruption and our dinner delivered within minutes.
Growing right alongside that user demand for immediacy is the explosive growth of data. With IoT, more devices are broadcasting telemetry and other data, requiring more storage and more processing power. Both users and devices are creating expectations that data should be available anytime, anywhere.
Data growth is occurring over a broader geographic area – and it needs to be delivered quickly. Low latency is paramount. This has created more demand for geographically dispersed data centers, and the growth of edge deployment. Edge computing will continue to grow as this demand increases.
Amber Caramella: There’s good news and bad news. The explosion of digital transformation will mean that digital infrastructure must continue to grow, particularly in the enterprise space. Hybrid IT environments are becoming the new normal, with many enterprises managing workloads in data centers, in the public cloud and at the edge. Data is also becoming more and more distributed. A business in the United States that hosts its website on the public cloud might also store customer information and backups across private cloud infrastructures in other countries. Cloud service providers will require more data center space to host their systems, necessitating the expansion of the colocation market.
However, infrastructure capabilities are not currently keeping pace with the business needs of digital transformation. The pace of change will only increase in 2020, as enterprises face continuous pressure to more rapidly create, deploy, manage and govern dynamic application environments. Infrastructure development, meanwhile, is hindered by a number of factors. The number and degree of infrastructure challenges varies by region, but the lack of adequate infrastructure to accommodate digital transformation is a global issue. There are several examples of companies strategically partnering to address the problem with creative go-to-market solutions, improving the digital infrastructure landscape with new technologies and infrastructure in rural markets.
Kristen Kroll-Moen: Digital transformation is pushing a much larger influx of data into the network, and this data needs to be processed instantly to enable new demands – be it in artificial intelligence, machine-to-machine learning or real-time analytics.
In this scenario, the key decisions are what data to keep, where to process it, whether to store it and for how long. Organizations must be capable of deploying their IT strategy through hybrid infrastructure and have a plan for what type of data should be processed and where it should be stored, whether in enterprise data centers, the cloud, edge locations or on-premises locations.
Gary Niederpruem: As digitalization continues to advance, the availability of digital infrastructure close to where data is being generated and consumed will enable new applications that use that data to create richer user experiences and improve business productivity and efficiency. This is the driver behind the growth in edge computing we are seeing today.
When you analyze current and emerging edge use cases, as we did at Vertiv, commonalities emerge in the areas of data management, latency and availability. Those commonalities enabled us to define edge archetypes that encompass multiple use cases. For example, while distributing HD video and supporting smart manufacturing are very different use cases, they share the common challenge of managing high volumes of data. Similarly, smart retail and augmented reality share the challenge of optimizing latency for human use of technology. As enterprises refine their strategies to capitalize on the opportunities digitalization represents, we will see compute and storage continuing to go where the data is through edge deployments.
Jeff Klaus: Most organizations dealing with large repositories of data deploy some form of data lake that then gets consumed by analytic tools. The data lake tends to reside where security, cost, and analytic access offer the best trade-off between insight and overhead. Data gravity becomes a real challenge as analytic tools multiply and the volume of processed data (originating in the data lake) grows, driving the company to search for storage optimization strategies on top of what it already has in the data lake.
It’s very important that IT professionals understand this issue and actively manage the number of analytic tools used in the environment. Left unmanaged, these tools will grow in number and size, overwhelming the capacity of people and resources. Sound cost-benefit analysis should help organizations avoid too much bloat from tools and noise from unfocused data.
Keep pace with the fast-moving world of data centers and cloud computing by following us on Twitter and Facebook, connecting with me on LinkedIn, and signing up for our weekly newsletter using the form below: