Jeff Klaus, General Manager of Intel Data Center Management Solutions, explores why, for remote facilities at the edge of the network, DCIM solutions are key to supporting enterprise Internet of Things (IoT) and big data.
Although the lion’s share of media attention has focused largely on IoT consumer applications, some of the most exciting innovations of late have occurred within the business sector where the combination of sensor data and advanced analytical algorithms has allowed enterprises to streamline business operations, boost productivity and develop leading-edge products. According to Gartner, the confluence of the IoT, Artificial Intelligence (AI) and 5G is creating a surge in data production that could see more than 7.5 billion connected devices in use in enterprises by 2020.
Many of these devices will be embedded in industrial equipment, machinery, manufacturing, health monitoring systems, and supply chain and logistics applications. Take, for example, one aerospace manufacturer whose employees now use IoT wearables and augmented reality (AR) tools on wiring-harness assembly lines, resulting in productivity improvements of up to 25%.
These types of industrial IoT applications will require extending connectivity from the enterprise and the cloud to devices at the edge of the network. In fact, it's common for an industrial IoT deployment to involve tens of thousands of sensors and devices deployed across multiple sites.
While 91% of today’s data is created and processed inside centralized data centers, Gartner forecasts that by 2022 approximately 75% of data will need analysis and action at the edge. Yes, edge computing offers the ability to bring data processing and storage closer to the growing number of connected devices, thus reducing latency. However, without the right tools at hand, enterprises will struggle to maintain control over their vast and complex data centers, making edge computing strategies more difficult to adopt.
Maxed Out in Capacity and Desperately Needing Data
Just recently, Intel and Schneider Electric conducted a survey of 250 IT decision makers across the United States and the United Kingdom, including but not limited to IT directors and system administrators, data center and operations managers, and application and systems engineers. What emerged is a picture of data center managers in an increasingly precarious situation: expected to protect rapidly expanding volumes of data and a growing number of mission-critical applications, while managing an expanding set of remote locations and implementing a variety of emerging environmental initiatives.
What also became clear is that the days of having dedicated staff at remote sites to run local services and applications are now in the rear-view mirror. In fact, 59% of respondents are managing more than five remote locations, and 19% are managing 50-plus. Many organizations are moving a small number of servers into a remote location to process larger workloads closer to where the data originates.
Today, many data centers are maxed out in power capacity, and poor thermal design leads to hot spots that limit rack loading. These constraints make insight into power and thermal monitoring, data center health and energy efficiency, as well as capacity planning, critical to IT administrators. The survey found that 87% of respondents are collecting data from the IT, power and cooling assets in their data center, and 79% are performing efficiency-focused analytics on their facility. Immediate alerting on component failures, nightly reporting on drive capacity, and component health trend data for predictive analytics are also essential.
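The alerting and capacity-planning insights described above can be sketched in a few lines of code. The snippet below is a minimal illustration only, not any vendor's API: the rack names, sensor readings, and limits are all hypothetical stand-ins for the telemetry a DCIM tool would actually collect.

```python
# Minimal sketch of threshold alerting and capacity headroom on
# power/thermal telemetry. All readings and limits are hypothetical.

POWER_LIMIT_W = 8000    # assumed per-rack power budget
INLET_LIMIT_C = 27.0    # assumed inlet-temperature ceiling

def check_racks(readings):
    """Return an alert string for every reading that breaches its limit."""
    alerts = []
    for rack, (power_w, inlet_c) in readings.items():
        if power_w > POWER_LIMIT_W:
            alerts.append(f"{rack}: power {power_w} W exceeds {POWER_LIMIT_W} W budget")
        if inlet_c > INLET_LIMIT_C:
            alerts.append(f"{rack}: inlet {inlet_c} C exceeds {INLET_LIMIT_C} C limit")
    return alerts

def capacity_headroom(readings):
    """Remaining power headroom per rack, for capacity planning."""
    return {rack: POWER_LIMIT_W - power_w
            for rack, (power_w, _) in readings.items()}

# Hypothetical snapshot: rack -> (power draw in W, inlet temperature in C)
snapshot = {"rack-01": (7600, 24.5),
            "rack-02": (8300, 28.1)}   # a hot spot limiting rack loading

print(check_racks(snapshot))
print(capacity_headroom(snapshot))
```

In practice the same threshold logic would run continuously against live feeds, with the nightly trend data the survey mentions layered on top for predictive analysis.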
While 62% of respondents use a Data Center Infrastructure Management (DCIM) solution, and 67% built their own analysis tools to review the collected data, no fewer than four out of ten respondents indicated it is somewhat difficult to get data from the IT, power and cooling assets in their data center. Another 7% of respondents indicated that it is very difficult to do so.
With 94% of respondents using insights on the health and efficiency of their data center when making decisions, it’s clear that without tools to help these teams gather critical information such as power and thermal consumption, costs and complexity will increase at the same or an even greater rate. The 38% of respondents who indicated they were not using a DCIM tool are hamstrung in their decision-making by a lack of insight into power and thermal monitoring, data center health and energy efficiency, as well as capacity planning.
Data Center Modernization Made Easy
As the survey makes clear, IT managers and directors are struggling to meet the challenge of processing, analyzing, and making data actionable and meaningful. Moreover, as data moves to the edge, acting on it in real time becomes even more important.
The good news is that of the 87% of respondents collecting data from the IT, power and cooling assets in their data center, 64% are collecting data at the edge. Smaller organizations are the least likely to deploy an edge solution, with 27% indicating that they have yet to do so.
When it comes to the edge, centralizing the work is too costly, so it’s cheaper and faster to process data at the edge where it is collected. Currently, 54% of respondents use one to three tools to collect data from the edge, and 32% use four or more. With so many tools in play, the need for a combined solution at the edge that can scale and automate is self-evident. Tools such as Data Center Infrastructure Management software are the easiest way to modernize data centers and support growing volumes of IoT- and big data-generated information.
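One reason a combined solution helps with the tool sprawl described above is that it acts as a normalization layer over each tool's output. The sketch below illustrates the idea under invented assumptions: the tool names, record formats, and device identifiers are all hypothetical, not taken from any real product.

```python
# Sketch of merging telemetry from several collection tools into one
# normalized view. Tool names and record shapes are hypothetical.

def normalize(source, record):
    """Map each tool's record format onto one common schema."""
    if source == "power_tool":        # e.g. {"dev": ..., "watts": ...}
        return {"device": record["dev"], "metric": "power_w",
                "value": record["watts"]}
    if source == "cooling_tool":      # e.g. {"sensor": ..., "temp_c": ...}
        return {"device": record["sensor"], "metric": "inlet_c",
                "value": record["temp_c"]}
    raise ValueError(f"unknown source: {source}")

# Two hypothetical feeds from two different collection tools
feeds = [("power_tool",   {"dev": "edge-site-7/rack-01", "watts": 7600}),
         ("cooling_tool", {"sensor": "edge-site-7/rack-01", "temp_c": 24.5})]

unified = [normalize(src, rec) for src, rec in feeds]
print(unified)
```

With every tool's output reduced to one schema, reporting and analytics can be written once against the unified stream rather than once per tool.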
Jeff Klaus is General Manager of Intel Data Center Management Solutions.