Today we continue our Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry, and where it is headed. In today’s discussion, our panel of experienced data center executives – Jack Pouchet of Vertiv, Intel’s Jeff Klaus, Erich Sanchack of Digital Realty and Dennis VanLith of Chatsworth Products – discuss the growing use of artificial intelligence in data center management.
The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier.
Data Center Frontier: We’ve recently featured headlines about the adoption of artificial intelligence (AI) as a tool in data center management. How do you view the potential for AI to help optimize and automate data centers, and what are the pros and cons of this technology?
Dennis VanLith: Eventually, AI will enable all aspects of the data center. The ideal end state would be data centers that flex to optimize all aspects of operations, with AI dynamically managing cooling, storage and networking to their optimum state. Computers are extremely good at processing data and identifying relationships.
In essence, deep learning can take large amounts of data and use it to determine what combination of relationships is optimal for a repeatable output. This is where we expect to see initial AI integration into the data center. Customers will take data from the data center – cooling, power, performance – and use deep learning to determine what combination of facility settings is optimized for their workloads.
Over time, we expect to see this level of automation (or AI) embedded in real time in the data center. As with all aspects of the data center, security and redundancy will be critical, so we expect customers to dip their toes in the water with small, incremental steps rather than jumping all-in. Eventually, the end state will be a DCIM system that is continually adjusted to optimize for the workload and the current environmental conditions.
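The optimization loop VanLith describes – learn from logged facility data, then recommend the settings that minimize power for a given workload – can be sketched in miniature. The telemetry values, setpoint names, and the simple table lookup below are all hypothetical illustrations; a production system would replace the lookup with a trained model that generalizes across workloads and weather.

```python
# Minimal sketch (hypothetical data): pick the cooling setpoint with the
# lowest observed total facility power from a historical telemetry log.
# A real deployment would train a model (e.g. deep learning) on far richer
# data instead of reading optima straight off a small table.

# Hypothetical log rows: (supply-air setpoint in degrees C, facility power in kW)
history = [
    (18.0, 520.0),
    (20.0, 495.0),
    (22.0, 480.0),
    (24.0, 488.0),
    (26.0, 510.0),
]

def recommend_setpoint(log):
    """Return the (setpoint, power) pair with the lowest observed power."""
    return min(log, key=lambda row: row[1])

best_setpoint, best_power = recommend_setpoint(history)
print(f"Recommended supply-air setpoint: {best_setpoint} C at {best_power} kW")
```

The interesting engineering is everything this sketch omits: validating recommendations against redundancy and safety constraints before a DCIM system acts on them automatically.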
Erich Sanchack: Implementation of AI in the data center will move us well beyond current DCIM systems and their limitations. Using AI, we are able to create an environment in which not only are all of our power and facilities decisions and processes completely optimized, but our resource planning and even advanced functions like dynamic bandwidth and server allocation are fully automated as well.
From the enterprise perspective, management and reporting will change significantly, enabling organizations to focus more on the outcomes and benefits of data center operations than on the process and the environment in which it resides.
Jack Pouchet: True artificial intelligence is a long way from reality, and the cons are all too easy to define. Just think HAL 9000 and Skynet. But we need not even look that far, as real-world examples of automation going awry surface almost weekly. A fine example is when semi-autonomous vehicles with “safety” features designed to automatically apply the brakes under certain conditions misinterpret data, fail to understand there is no danger, and cause accidents by braking unsafely.
On the “pro” side: Machine learning already is beginning to be adopted within advanced data centers as a means to improve the control of cooling systems and act as the vision, hearing, touch, and smell of a super-facilities-manager. These systems keep track of the known operating points and constantly look for the unknown or any deviation from the norm in an effort to optimize performance.
This type of system will emerge to improve the day-to-day performance of the data center while reducing the need for unnecessary preventive maintenance. The data center machine learning functions (or AI in the distant future) can identify potential faults far enough in advance for a technician to schedule a service visit at a time best suited to the operation of the data center.
Jeff Klaus: AI is tremendous, but it is definitely an advisory tool – meaning that, as it stands today, the ability to distill meaningful data from all the noise still requires knowledgeable individuals looking at the results from the AI engine.
What AI offers is speed and continual consumption of the data. What it does not offer is context related to specific associations, and experience to intuit when a constructed model, or the prescribed action, is not appropriate.
NEXT: What’s happening with server rack power density?