The Data Center Frontier Executive Roundtable features insights from industry executives with lengthy experience in the data center industry. Here’s a look at the insights from Jeff Klaus of Intel.
As General Manager of Intel Data Center Software Solutions, Jeff Klaus leads a global team that designs, builds, sells and supports Intel DCM, the only software that provides real-time, server-level data on the power and thermal conditions across a wide range of data center servers and other equipment. Provided as an SDK, Intel DCM middleware is integrated into Data Center Infrastructure Management (DCIM) consoles to increase data center power and thermal efficiency.
Since joining Intel in 2000, Klaus’ accomplishments have been recognized by multiple division awards. With a broad background in software solutions for the channel, client and SMB space, he has served as Director of Media Programs within Intel’s Digital Home Group, Entertainment Content Marketing Manager, Business Operations Manager, and Software Marketing Manager.
An accomplished speaker, Klaus has presented at such industry forums as Gartner Data Center Summit, AFCOM’s Data Center World, the Green IT Symposium, and the Green Gov conference. He has authored articles on data center power management in Data Center Post, IT Business Edge, Data Center Knowledge, Information Management and Data Centre Management. Klaus currently serves on the Board of Directors for the Green IT Council. Klaus earned his BS in Finance at Boston College and his MBA in Marketing at Boston University.
Here’s the full text of Jeff Klaus’ insights from our Executive Roundtable:
Data Center Frontier: The digital transformation of our society is creating enormous volumes of data, even ahead of the mainstream adoption of next-generation technologies like 5G wireless, the Internet of Things, augmented reality and autonomous vehicles. As this deluge of data looms on the horizon, what are the implications for the data center industry, and for how and where all this data will be stored and analyzed?
Jeff Klaus: I think the virtuous cycle of data and technology advancements will lead to some of the most fantastic innovations in the near term. One of the most important of these is the maturity of AI and its use in parsing and connecting trending information from disparate data streams.
This is an area where the data center industry has yet to see the complete picture of how to use even basic machine learning algorithms to optimize processes or drive predictive maintenance effectively. As DCIM and other mainstream tools move into SaaS solutions, they will refine these capabilities and likely work to educate customers on effective usage and results.
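As a rough illustration of the "basic machine learning algorithms" mentioned here, predictive maintenance can start with something as simple as flagging telemetry readings that drift away from historical norms. The sketch below uses a z-score test on server inlet temperatures; the data and threshold are illustrative assumptions, not part of Intel DCM's actual interface.

```python
# Hypothetical sketch: flag thermal anomalies in server telemetry with a
# simple z-score test -- a stand-in for a basic predictive-maintenance
# signal, not Intel DCM's actual API.
from statistics import mean, stdev

def thermal_anomalies(readings, threshold=2.5):
    """Return indices of readings that deviate more than `threshold`
    standard deviations from the historical mean."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [i for i, t in enumerate(readings)
            if abs(t - mu) / sigma > threshold]

# Mostly stable inlet temperatures (deg C) with one runaway reading.
history = [22.1, 22.3, 21.9, 22.0, 22.2, 22.1, 35.0, 22.0, 22.3, 21.8]
print(thermal_anomalies(history))  # -> [6]
```

In practice the baseline would be learned per server and per season rather than from a single window, but the shape of the problem is the same: separate meaningful signal from noise, then hand the flagged cases to a knowledgeable operator.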
Data Center Frontier: We’ve recently featured headlines about the adoption of artificial intelligence (AI) as a tool in data center management. How do you view the potential for AI to help optimize and automate data centers, and what are the pros and cons of this technology?
Jeff Klaus: As I stated above, AI is tremendous, but it is definitely an advisory tool, meaning that as it stands today, the ability to refine and clarify meaningful associated data from all the noise still requires knowledgeable individuals looking at the results from the AI engine. What AI offers is speed and continual consumption of the data. What it does not offer is context related to specific associations, and the experience to intuit when a constructed model or a prescribed action is not appropriate.
Data Center Frontier: For some time we have seen predictions that rack power density would begin to increase, prompting wider adoption of liquid cooling and other advanced cooling techniques. What’s the state of rack density in 2018, and is density trending higher at all?
Jeff Klaus: Increased density is always desired, but it faces a lot of headwinds in the form of physical constraints. These constraints in existing power and cooling infrastructure are hard to overcome. It is also still appealing to only a subset of tenants, because liquid cooling usually requires more prescribed rack configurations.
Finding ways to optimize the hardware in the racks through real-time monitoring of power and thermals, and through power control at the server level, can drive density just as much as other, more costly techniques. This ties into the AI data streams and identifying actionable data on usage and capacity as well.
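The server-level power control Klaus describes can be sketched as a simple budgeting loop: poll per-server power draw, and when the rack exceeds its provisioned budget, cap the highest-draw servers first. Everything below is a hypothetical illustration; the telemetry values, server names, and the assumption that a cap holds a server to 60% of current draw are invented for the example, and real deployments would pull readings through a DCIM console or middleware such as Intel DCM.

```python
# Hypothetical sketch of rack-level power budgeting: decide which servers
# to cap so the rack stays within its provisioned power budget.

def servers_to_cap(power_by_server, rack_budget_w):
    """Given {server_id: watts}, greedily pick the highest-draw servers
    to cap until the projected rack total fits the budget."""
    total = sum(power_by_server.values())
    capped = []
    # Cap the hungriest servers first until the rack fits its budget.
    for server, watts in sorted(power_by_server.items(),
                                key=lambda kv: kv[1], reverse=True):
        if total <= rack_budget_w:
            break
        # Assumption for illustration: a cap holds a server to 60% of
        # its current draw, saving 40%.
        total -= watts * 0.4
        capped.append(server)
    return capped

rack = {"srv-01": 450, "srv-02": 520, "srv-03": 380, "srv-04": 610}
print(servers_to_cap(rack, rack_budget_w=1700))  # -> ['srv-04', 'srv-02']
```

The point of the sketch is the economics in the answer above: with real-time readings and a control knob per server, an operator can pack more hardware into an existing power envelope without resorting to costlier cooling retrofits.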
Data Center Frontier: Data center companies are some of the largest consumers of renewable energy. Are these initiatives by large data center operators making clean energy more available and affordable? Will energy storage become part of the solution anytime soon?
Jeff Klaus: I think there are a lot of discussions about how to use captive power to offset peak consumption, but it still needs better instrumentation and control to help shape workload and power consumption where the tradeoffs show ROI.
To get to that mode of operation as a more broadly accepted practice, we'll need more successful use cases that show how to retrofit data centers, leverage environmental and power data feeds, and apply machine learning algorithms to assess when the model shows ROI.
Check out our entire Data Center Frontier Executive Roundtable for 1Q 2018.