Today we conclude our Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry, and where it is headed. In today’s discussion, our panel of experienced data center executives – Intel Software’s Jeff Klaus, Erich Sanchack from Digital Realty, Cyxtera’s Mitch Fonseca, and Eric Boonstra of Iron Mountain – share their takes on the future of data center automation.
The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier.
Data Center Frontier: What lies ahead for data center automation? What are the key opportunities in using data (and perhaps even AI) to make data centers more efficient and productive?
Erich Sanchack: Automation is critical to addressing the growing workforce shortage we see in the industry, and most importantly to guiding our customers through the onboarding process in the most efficient and consistent way. We believe that embracing more automated processes will allow data center operators to concentrate on higher-order, high-impact tasks that require human oversight. Power management, service provisioning and predictive maintenance are three areas that are ripe for automation.
Automation technologies like AI and machine learning can manage the cooling and power requirements of the data center – optimizing based on the applications and tenants and improving energy efficiency. AI can help model future power consumption, which reduces energy costs and improves overall system reliability. In this case, automation can deliver efficiency gains of 5 percent or more, with a significant impact on the data center's environmental footprint and profitability.
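The kind of power-demand modeling described above can be sketched in miniature. This is a hypothetical illustration, not any vendor's actual system: it forecasts the next hour's power draw from recent hourly readings using simple exponential smoothing, so cooling and capacity can be staged ahead of demand. The readings and the smoothing factor are illustrative assumptions.

```python
# Hypothetical sketch: one-step-ahead power-demand forecast using
# simple exponential smoothing. Data and alpha are illustrative.

def forecast_power(readings, alpha=0.5):
    """Return a forecast (kW) for the next interval from past readings."""
    forecast = readings[0]
    for actual in readings[1:]:
        # Blend the newest observation with the running forecast.
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

hourly_kw = [410.0, 420.0, 415.0, 440.0, 455.0, 460.0]
print(round(forecast_power(hourly_kw), 1))  # → 450.6
```

A production system would use richer models (seasonality, tenant mix, weather), but the principle is the same: anticipate demand rather than react to it.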
In predictive maintenance, machine learning can also play a big role. Once we have enough data on the facility available, machine learning can take the guesswork out of maintenance operations, correlating equipment types, power usage, performance and incident data and reporting back to prevent failures before they occur. This is a largely untapped use case today, but it can give us huge gains in productivity and uptime industry-wide, while reducing waste and the associated costs.
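At its simplest, the predictive-maintenance idea is to learn each unit's normal behavior from historical data and flag drift before it becomes a failure. The sketch below is a hypothetical, deliberately minimal version: it compares recent power readings for a cooling unit against its historical baseline and flags large deviations. The equipment name, readings, and threshold are all illustrative assumptions.

```python
# Hypothetical sketch of predictive maintenance: flag equipment whose
# recent power draw drifts beyond N standard deviations from its
# historical baseline. All figures are illustrative.
from statistics import mean, stdev

def flag_anomaly(history, recent, threshold=3.0):
    """Return True if recent readings deviate sharply from the baseline."""
    baseline, spread = mean(history), stdev(history)
    return abs(mean(recent) - baseline) > threshold * spread

crah_history = [5.1, 5.0, 5.2, 4.9, 5.1, 5.0, 5.2, 5.1]  # kW, normal operation

print(flag_anomaly(crah_history, [5.1, 5.0]))  # → False (within baseline)
print(flag_anomaly(crah_history, [6.4, 6.6]))  # → True (drifting high)
```

Real deployments replace this z-score heuristic with trained models over many signals (vibration, temperature, incident history), but the workflow – baseline, monitor, intervene early – is the one described above.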
We also see an upward trend in the adoption of data center management platforms that automate management and access for customers. These types of portals allow customers to ‘self-serve’ their virtual connections, reducing the need for trained traffic engineering staff and delivering a more consistent experience across locations.
Eric Boonstra: We are moving from data centers to centers of data. Customer portals are maturing. Real-time visibility into operational performance and automated provisioning are considered baseline. AI will be the key enabler of the move to more effective Data Center Infrastructure Management (DCIM). BMS and AI data will help optimize cooling systems and efficiency in the data centers. AI will enable operators to optimize preventative maintenance and reduce the number of infrastructure failures. It will enable operators to work with dynamic capacity forecasting models as well.
We have found that for most enterprises the current benefits of AI are peripheral rather than central to their business. Yet over the coming years, there will be a growing range of opportunities in the cloud-connected data center.
Sector-specific Machine Learning platforms will be revolutionary, and CIOs should be on top of the latest developments. The greatest areas of opportunity reside in the cloud data center. Collaborative data sharing and processing platforms and hubs, typically deploying third-party analytics, will deliver a range of smart services. Machine learning will enhance efficiency in the data center-based compute and connect infrastructure.
Jeff Klaus: That is a great question. AI is becoming a consultative source that helps operators zero in on issues, and I think that will continue to be its focus for the near future.
The challenge is that the data ingestion and processing required to maintain that capability are substantial, and I think the approach may have to become more distributed to really maximize efficiency.
For example: utilizing out-of-band (OOB) analysis and telemetry from individual servers and their subcomponents can help identify areas of concern or underuse without the need to centralize all the data into a data lake, analyze it, and then build an action plan.
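The distributed pattern described here can be sketched as follows. This is a hypothetical illustration, assuming per-server CPU utilization samples are available from OOB telemetry: each server reduces its own raw samples to a one-line verdict locally, and only exceptions (running hot or sitting idle) are reported centrally, so no raw data needs to land in a data lake. Server names, samples, and thresholds are illustrative assumptions.

```python
# Hypothetical sketch: each server summarizes its own OOB telemetry
# locally; only exceptions are reported upstream. Names, samples, and
# thresholds are illustrative.

def local_summary(server, cpu_samples, low=10.0, high=85.0):
    """Runs on each server: reduce raw samples to a verdict or nothing."""
    avg = sum(cpu_samples) / len(cpu_samples)
    if avg < low:
        return (server, "underused", avg)
    if avg > high:
        return (server, "hot", avg)
    return None  # healthy servers report nothing

fleet = {
    "rack1-node01": [5.0, 7.0, 6.0],     # mostly idle
    "rack1-node02": [60.0, 55.0, 65.0],  # healthy
    "rack2-node09": [90.0, 95.0, 88.0],  # running hot
}
exceptions = [r for srv, s in fleet.items() if (r := local_summary(srv, s))]
print(exceptions)
# → [('rack1-node01', 'underused', 6.0), ('rack2-node09', 'hot', 91.0)]
```

The point of the design is bandwidth and scale: only the small exception list crosses the network, while healthy servers stay silent.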
Mitch Fonseca: Automation plays an important role in a data center. At Cyxtera, we use it to optimize energy efficiency and improve operations staff productivity. As automation and machine learning are increasingly embraced, data center providers have the potential to leverage this technology to help avoid problems as we predict events before they occur.
RECAP: Get links to all our Executive Roundtable stories and transcripts of Executive Insights.
Keep pace with the fast-moving world of data centers and cloud computing by following us on Twitter and Facebook, connecting with me on LinkedIn, and signing up for our weekly newsletter using the form below: