Roundtable: IoT Sensors and Data Center Software Decisions

Nov. 14, 2017
Our Executive Roundtable panel ponders the state of data center instrumentation, and how it influences decisions on data center management software. (Photo: Google)

Today we continue our Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry, and where it is headed. In today’s discussion, our panel of experienced data center executives – IO Chief Operating Officer Rick Crutchley, Marvin Rowell of BASELAYER, Vertiv VP of marketing & strategy Amy Johnson and Jeff Klaus, GM of Intel Data Center Software Solutions – discuss the state of instrumentation in the data center, and how it is influencing the software decisions end users are making to manage their IT environments.

The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier.

Data Center Frontier:  Advanced data centers were early adopters of the Internet of Things, using sensors to detect and manage temperature and humidity. What’s the current state of instrumentation in most data centers, and how is that influencing the software decisions that end users are making to manage their IT environments?

AMY JOHNSON, Vertiv

Amy Johnson:  The early adoption of sensors and environmental monitoring in the data center delivered a coarse overview of the data center’s thermal profile. It was better than nothing, but still part of an outdated shotgun approach to thermal management. It wasn’t uncommon for data center managers to make cooling decisions based on data from a single sensor in a room with hundreds of racks. This was consistent with the old approach of lining cooling units around the perimeter of the data center and blasting cold air indiscriminately into the room. Today’s thermal management strategies, sensors and systems bear little resemblance to those early deployments.

Now we often have dozens of sensors – environmental and otherwise – in a single rack, delivering detailed data on temperature, humidity, power usage, capacity and more. This sort of information can enable a much more elegant approach to data center management, but only if the data can be analyzed and acted upon quickly and intelligently. Information without context or direction has little value. Even well-intentioned automatic break/fix alerts aren’t sufficient. Our customers expect their data centers not just to tell them something is broken, but to tell them what to do to fix it or, better yet, how to address it before it breaks.

That’s where the industry is today. Increased connectivity across the network has shifted sensors from operating as isolated, single-function devices to being connected to all the other sensors and devices in the data center. These devices contain intelligence and integrated technologies that allow them to gather insights and feed them into a broader management tool for analysis. Sensor data aggregated through a gateway allows businesses to capture and view the entire IT ecosystem and make intelligent – even predictive – operational decisions.
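
As a rough illustration of that pattern (a minimal sketch, not a depiction of Vertiv’s software), the example below aggregates hypothetical rack-level readings on the gateway side and turns them into an actionable recommendation rather than a bare break/fix alert. The rack names, thresholds and wording are assumptions for illustration only.

# Hypothetical gateway-side aggregation and recommendation logic (illustrative only).
from statistics import mean

def aggregate(readings):
    """readings: list of dicts like {'rack': 'R12', 'inlet_c': 24.1, 'kw': 5.6}."""
    return {
        "avg_inlet_c": mean(r["inlet_c"] for r in readings),
        "max_inlet_c": max(r["inlet_c"] for r in readings),
        "total_kw": sum(r["kw"] for r in readings),
    }

def recommend(summary, inlet_limit_c=27.0):
    """Suggest an action instead of only reporting that a threshold tripped."""
    if summary["max_inlet_c"] >= inlet_limit_c:
        return "Hot spot: check blanking panels and increase airflow to the affected aisle."
    if summary["max_inlet_c"] >= inlet_limit_c - 1.5:
        return "Trending toward the thermal limit: schedule an airflow audit before adding load."
    return "Within the thermal envelope: no action needed."

summary = aggregate([{"rack": "R1", "inlet_c": 24.2, "kw": 5.1},
                     {"rack": "R2", "inlet_c": 26.1, "kw": 7.4}])
print(summary)
print(recommend(summary))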

One final note: An important piece of this monitoring and management puzzle is usability. Customers demand easy-to-use tools for managing their data centers, and it’s up to us to develop tools that quickly collect and analyze data and deliver easy-to-use alerts and recommendations that simplify complex, difficult decisions.

JEFF KLAUS, Intel

Jeff Klaus: IoT inside the data center has amassed enough adoption to be considered mainstream. Instrumentation that reports performance, health and operating data is considered a necessity. Most operators are now considering how to combine that information with artificial intelligence to drive faster decisions or surface relationships between data points that would otherwise go unnoticed.

For a while, I suspect that combination of data and analysis will remain in an advisory role, with the operator making the final decisions.

Going further, future discussion will center on the types of algorithms that work best for data center operations, and how to manage the analysis that goes into generating those decisions.
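
To make that advisory role concrete, here is a minimal sketch of how such a workflow could look (an assumption for illustration, not Intel’s software): a simple statistical check scores a reading and proposes an action, while the final decision is explicitly left to the operator. The metric name and threshold are hypothetical.

# Illustrative advisory analytics: the code suggests, the operator decides.
from statistics import mean, pstdev

def anomaly_score(history, latest):
    """Z-score of the latest reading against recent history."""
    sigma = pstdev(history) or 1.0  # guard against zero variance
    return abs(latest - mean(history)) / sigma

def advise(metric, history, latest, threshold=3.0):
    """Return a suggestion; no automated action is taken here."""
    score = anomaly_score(history, latest)
    if score >= threshold:
        return {"metric": metric, "score": round(score, 2),
                "suggestion": "investigate deviation from recent baseline",
                "decision": "pending operator approval"}
    return {"metric": metric, "score": round(score, 2), "suggestion": "none"}

print(advise("pump_current_a", [4.1, 4.0, 4.2, 4.1, 4.0], 5.9))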

RICK CRUTCHLEY, IO

Rick Crutchley: The data center infrastructure management (DCIM) market is continuing to evolve, and adoption has continued at a steady pace. But the gap between efficient and inefficient data centers is widening, forcing many corporate data centers to either significantly improve their operations or reduce their data center footprint in favor of colocation.

Driving DCIM growth are the pressures to reduce energy usage and cost, simplify capacity planning and management, and improve availability. Above all, DCIM provides needed visibility into data center operations.
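
Two of the simplest derived metrics behind those pressures are power usage effectiveness (PUE) and remaining rack power headroom. The short sketch below works through both with made-up figures (not from IO), just to show the arithmetic a DCIM dashboard automates.

# Illustrative arithmetic only; the numbers are hypothetical.
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness = total facility power / IT power."""
    return total_facility_kw / it_load_kw

def rack_headroom_kw(rack_budget_kw, measured_kw, safety_margin=0.9):
    """Usable capacity left after holding back a safety margin."""
    return max(0.0, rack_budget_kw * safety_margin - measured_kw)

print(f"PUE: {pue(1500.0, 1000.0):.2f}")                  # 1.50
print(f"Headroom: {rack_headroom_kw(10.0, 6.8):.1f} kW")  # 2.2 kW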

Self-contained data center modules – where physical infrastructure is pre-installed and ready for IT integration – are increasingly becoming the gold standard for data center deployment because of their easy scalability and manageability. Many of these modules come with a DCIM solution built in.

What’s new is the ability to give colocation customers visibility into their own data center module, with integrated module views, global visibility, and transparency features that support advanced monitoring of their IT stack, automated control of their operations, improved metrics tracking, and more.

Securely monitoring, tracking, and maintaining the core aspects of their infrastructure can even be done remotely, from any device, via a single unified console. With this kind of control, IT managers may find it easier to sign on to colocation than to continue to maintain legacy operations that don’t provide the same visibility.

MARVIN ROWELL, BASELAYER

Marvin Rowell: In many enterprise data centers, we see a large number of disparate or custom Industrial Internet of Things (IIoT) systems deployed to provide traditional data center metrics (space, power, and cooling). However, just as in the broader IoT market, I do not envision a “winner take all” platform across all use cases; layouts ranging from traditional raised floor to modular will call for different sensor packages for different needs.

As a result, software decisions should focus on selecting tools that combine data from multiple sources seamlessly. These tools should also bring in third-party external feeds, giving operators the ability to correlate what is happening inside the four walls of a data center with the macro events outside of it.
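
As a sketch of what that correlation could look like in practice (the feeds, field names and values below are assumptions for illustration, not any particular product), internal readings can be joined to an external feed on timestamps for side-by-side analysis:

import datetime as dt

# Illustrative data only: an internal DCIM-style feed and an external
# (e.g., weather or utility) feed sampled on the same hourly timestamps.
internal = [
    {"ts": dt.datetime(2017, 11, 14, 12, 0), "cold_aisle_c": 23.8},
    {"ts": dt.datetime(2017, 11, 14, 13, 0), "cold_aisle_c": 24.6},
]
external = [
    {"ts": dt.datetime(2017, 11, 14, 12, 0), "outdoor_c": 18.0},
    {"ts": dt.datetime(2017, 11, 14, 13, 0), "outdoor_c": 21.5},
]

def correlate(internal_rows, external_rows):
    """Join the two feeds on timestamp so conditions can be compared side by side."""
    outdoor_by_ts = {row["ts"]: row["outdoor_c"] for row in external_rows}
    return [{**row, "outdoor_c": outdoor_by_ts.get(row["ts"])} for row in internal_rows]

for row in correlate(internal, external):
    print(row["ts"].isoformat(), row["cold_aisle_c"], row["outdoor_c"])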
