Executive Roundtable: Data Centers Power the Digital Transformation
Welcome to our 10th Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry, and where it is headed. In our First Quarter 2018 roundtable, we will examine four topics: The explosion of data and its implications for the data center sector, the rise of artificial intelligence in data center management, the state of server rack power density, and key trends in adoption of renewable power and energy storage.
Here’s a look at our distinguished panel:
- Jeff Klaus, General Manager of Intel Data Center Software Solutions, which provides real-time data on power and thermal conditions for a wide range of data center equipment.
- Erich Sanchack, EVP of Operations at Digital Realty, responsible for overseeing global portfolio operations, global construction, colocation and interconnection service implementation.
- Jack Pouchet, Vice President Market Development at Vertiv, who works to architect and create opportunities for advanced power and cooling technologies that improve day-to-day operational efficiencies.
- Dennis VanLith, Senior Director of Global Product Management at Chatsworth Products (CPI), who is one of the founders of CPI and has held several positions in the company during the past 32 years.
The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier. Each day this week we will present a Q&A with these executives on one of our key topics. We begin our discussion by asking our panel about the explosion of data and its implications for the data center sector.
Data Center Frontier: The digital transformation of our society is creating enormous volumes of data, even ahead of the mainstream adoption of next-generation technologies like 5G wireless, the Internet of Things, augmented reality and autonomous vehicles. As this deluge of data looms on the horizon, what are the implications for the data center industry, and for how and where all this data will be stored and analyzed?
Dennis VanLith: As digital transformation unfolds, the impact on the data center will be significant. According to the latest storage projections, worldwide data is expected to grow roughly eightfold from 2018 to 2025 (from about 20 zettabytes to more than 160 zettabytes), and the edge (data centers located close to the user to minimize latency and improve quality of service) and the core data center will together hold over 65 percent of all data, with the remainder residing on personal endpoint devices.
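As a rough, back-of-the-envelope check on those figures (the 20 ZB and 160 ZB endpoints are taken directly from the projection VanLith cites), eightfold growth over the seven years from 2018 to 2025 works out to a compound annual growth rate of roughly 35 percent:

$$
\left(\tfrac{160}{20}\right)^{1/7} = 8^{1/7} \approx 1.35
$$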
This implies a significant and continued need for increased storage capacity and throughput at the core data center as well as at the edge. Further, as we see the rollout of 5G across the industry, and with increased use of augmented reality, artificial intelligence and deep neural networks, we expect an even greater emphasis on data being stored close to the endpoint.
This will foster new usage models, such as connected vehicles that allow for real-time optimization of traffic flow or live updates of road hazard conditions. This will also transform the behavior of the workforce, allowing people to work where they want and how they want, free from wires or the limitations of their endpoint device.
Jeff Klaus: I think the virtuous cycle of data and technology advancements will lead to some of the most fantastic innovations in the near term. One of the most important of these is the maturing of AI and its use in parsing and connecting trending information from disparate data streams.
This is an area where the data center industry has yet to see the complete picture of how to use even basic machine learning algorithms to optimize processes or drive predictive maintenance effectively. As DCIM and other mainstream tools move into SaaS solutions, vendors will refine these capabilities and likely work to educate customers on the finer uses and results.
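To make that concrete, here is a minimal, hypothetical sketch of the kind of baseline analysis Klaus alludes to: a simple rolling-statistics check (a statistical baseline rather than a full machine learning model) that flags anomalous rack inlet temperatures from DCIM-style telemetry, the sort of signal that could feed a predictive-maintenance workflow. The field names, window size and three-sigma threshold are illustrative assumptions, not a reference to any specific product.

```python
# Hypothetical illustration: flag anomalous rack inlet temperatures from a
# stream of telemetry using a trailing window of "normal" readings.
# The window size and 3-sigma threshold are assumptions for this sketch.
from collections import deque
from statistics import mean, stdev

def flag_anomalies(readings, window=24, threshold=3.0):
    """Yield (index, value) pairs that deviate more than `threshold`
    standard deviations from the trailing window of readings."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value
        history.append(value)

if __name__ == "__main__":
    # Simulated hourly inlet temperatures (degrees C) with one excursion.
    temps = [22.0 + 0.1 * (i % 5) for i in range(48)]
    temps[40] = 31.5  # a spike that might precede a cooling fault
    for idx, temp in flag_anomalies(temps):
        print(f"hour {idx}: {temp} C looks anomalous")
```

A production system would draw on richer models and many more signals, but even a baseline like this hints at where SaaS-based DCIM tools could add the kind of predictive value Klaus describes.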
Erich Sanchack: The insights that enterprises can derive from the data they are gathering from sensors, web systems and other sources can be incredibly valuable. New tools are moving access to that data and those insights well beyond traditional BI and data analytics teams to business managers across disciplines, giving them drag-and-drop analysis and collaboration capabilities and truly democratizing the discipline. With ease of use comes expansion of the systems, the data and the transport necessary to facilitate it.
Data center providers like us have a number of opportunities, and a number of responsibilities, relative to this. It’s incumbent on us to really understand the systems that are generating the data and the uses the data is being put to, so that we can truly act as a partner to our enterprise customers.
It’s also obviously our responsibility to ensure that our facilities are prepared to support the massive amounts of compute and storage our customers require to take maximum advantage of the opportunities they have related to the data. Lastly, we need to ensure maximum uptime, so these systems are always available to the growing numbers and types of workers that interact with them.
Jack Pouchet: The impending deployment of 5G, the rapid growth of IoT and mobile technologies, and the rise of smart vehicles combine to place inordinate demands on our core data centers, cloud and hyperscale data centers, and the network infrastructure bringing them together. These and other technologies (virtual reality, on-demand anywhere) will accelerate the demand for compute, storage and analytics at or near the edge.
Several architectures are likely to emerge based upon the volume and nature of the data and the application (life critical, video, social, business, etc.). Those models will ensure data is captured and acted upon at the right point in the system to deliver the value, service, and SLAs required for the application. Of course data privacy and cybersecurity issues will need to be addressed.
NEXT: The rise of artificial intelligence in data center management.
Keep pace with the fast-moving world of data centers and cloud computing by following us on Twitter and Facebook, connecting with me on LinkedIn, and signing up for our weekly newsletter using the form below: