Executive Insights: Gary Niederpruem of Vertiv

The Data Center Frontier Executive Roundtable features insights from three industry executives with lengthy experience in the data center industry. Here’s a look at the insights from Gary Niederpruem of Vertiv (previously Emerson Network Power).

GARY NIEDERPRUEM of Vertiv

Gary Niederpruem is Vice President, Global Marketing and Strategy for Vertiv, where he is responsible for leading the global marketing effort, including devising strategic programs, identifying M&A opportunities, driving a consistent brand strategy and communicating the overall value proposition of the business.

Gary began his career at Emerson as a Product Specialist in 1996 and has held increasingly responsible positions, including Product Manager, Director of Account Management and Director of Product Management. In 2010, he became Vice President of Marketing and General Manager of the Integrated Modular Solutions business within Energy Systems. In 2014, he was named Vice President of Global Marketing for Emerson Network Power, and in mid-2016 he assumed oversight of the strategy function.

Gary attended John Carroll University, where he received a Bachelor’s degree in Marketing and Logistics, and holds a Master’s degree in Business from the University of Notre Dame. Gary serves on the board of TIA (Telecommunications Industry Association), the leading industry association for communications networks.

Here’s the full text of Gary Niederpruem’s insights from our Executive Roundtable:

Data Center Frontier: Discussion of industry trends is dominated by the rise of the cloud computing model. How has this cloud-driven disruption impacted the broader data center industry? What are the pros and cons?

Gary Niederpruem: The rise of the cloud is real, and while there is a general “cloud computing model,” I’d caution against thinking about it in a static, cookie-cutter fashion. The reality is no two cloud providers are operating the same way. Each one has different applications, different infrastructure architectures and different needs. That’s driving innovation across the industry, and we’re seeing that cloud facilities have become the breeding ground for new data center technologies and designs.

Increased cloud adoption is having an undeniable impact on traditional enterprise data centers, but it’s far too early to sound the death knell. There still is a tremendous amount of investment going into the enterprise segment, but certainly the growth rate of cloud computing is much more robust. That’s not to say private networks are a thing of the past; quite the contrary. But company-owned computing looks different in a cloud-dominated world. Hybrid cloud architectures that leverage private and public cloud resources are increasingly common. And more and more businesses are deploying distributed networks that localize computing at remote sites while maintaining some organization-wide resources in the cloud. This edge computing phenomenon is really just getting started.

Even private, centralized data center facilities are evolving and illustrating the influence of the cloud. The cloud model offered faster access to computing capacity, scalable capacity and reduced capital costs, so data center design had to change to keep pace. We’re seeing more and more prefabricated facilities, deployed significantly faster and more economically, and scaled to meet immediate needs while delivering easy scalability. This is in direct response to the advent of cloud and colocation providers.

The bottom line is simple: Businesses will choose computing solutions that deliver reliability, security, modularity and speed. The model – cloud, colo, enterprise or something else – depends on the unique needs of that company and its customers.

Data Center Frontier: There has been a lot of merger and acquisition (M&A) activity recently in the data center industry. How are these M&A deals influencing the development of the data center industry? Are we likely to see more M&A activity?

Gary Niederpruem: There typically are two inflection points in any industry when M&A occurs. One is early on, when it’s unclear which technology or model will emerge in the market, and companies acquire or merge with others to hedge their bets. The other time is when the industry is more mature, the market has been established, and a company is trying to grow. It can be difficult in that environment to grow organically, so they often choose to grow inorganically through acquisition.

The data center industry has been around for a while, but with the advent of cloud computing, colocation, virtualization, the Internet of Things, software-defined networks and so many other emerging trends, it has in no way reached a state of maturity. It’s in constant change. That makes it fertile ground for M&A activity, and we’re seeing that among colocation companies and technology providers, among others.

However, because the industry is some 30 years old, we’re also seeing M&A activity consistent with a more established industry – companies seeking growth through geographic expansion, customer expansion or technology acquisition. While the volume of M&A activity in the data center industry isn’t as large as some other industries, I would expect it to continue along those two paths.

Data Center Frontier: The market for tools to monitor, manage and automate data centers continues to evolve. What are the significant trends in this ecosystem and how effectively are customers using these tools?

Gary Niederpruem: There is a lot of necessary activity in this space. Traditional management tools at times have been difficult and time-consuming to install and have done a better job of collecting data than they have of analyzing it and making it actionable. To be fair, the early generations of data center management tools were designed with traditional enterprise data centers in mind, and they have been adjusting to a new world order ever since.

Today’s management tools are much more sophisticated while striving to be less complex and easier to install. Most importantly, they should not only collect data, but analyze and react to it in real time. That’s critical for today’s distributed and software-defined networks, and for enabling and leveraging the Internet of Things.

There has been a fundamental and ongoing shift in the approach to network management. In the past, management tools have been add-ons – hardware and software systems installed across the data center to monitor and manage existing equipment. Today, these capabilities are built into much of the critical infrastructure, and management tools need to continually evolve so that connecting with those systems is easy and one can take advantage of improved machine-to-machine intelligence and communication. The software that unlocks these capabilities needs to be increasingly modular and easy to deploy, and the result is a much more streamlined, efficient – and faster – network management process.
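As a rough illustration of that collect-analyze-react pattern, the short Python sketch below polls a device, checks the reading against a threshold and responds within the same loop rather than merely logging data for later review. The device names, metrics and thresholds are hypothetical and are not drawn from any specific Vertiv product or API.

    # Minimal sketch of a collect-analyze-react management loop.
    # Device names, metrics and thresholds are hypothetical, not a real product or API.
    import random
    import time
    from dataclasses import dataclass

    @dataclass
    class Reading:
        device_id: str
        inlet_temp_c: float   # inlet temperature, degrees Celsius
        power_kw: float       # instantaneous power draw, kilowatts

    def collect(device_id: str) -> Reading:
        """Poll a device for its current readings (simulated here with random values)."""
        return Reading(device_id, random.uniform(18.0, 32.0), random.uniform(2.0, 8.0))

    def analyze_and_react(reading: Reading, temp_limit_c: float = 27.0) -> None:
        """Act on the data as it arrives instead of only storing it."""
        if reading.inlet_temp_c > temp_limit_c:
            # A real system might adjust cooling setpoints or open a service ticket here.
            print(f"ALERT {reading.device_id}: inlet {reading.inlet_temp_c:.1f} C "
                  f"exceeds {temp_limit_c:.1f} C; requesting additional cooling")
        else:
            print(f"OK    {reading.device_id}: {reading.inlet_temp_c:.1f} C, "
                  f"{reading.power_kw:.1f} kW")

    if __name__ == "__main__":
        for _ in range(5):          # short demo; real tools run continuously
            analyze_and_react(collect("rack-07"))
            time.sleep(1)

In practice the reaction step would tie into cooling controls, power management or a ticketing system; the point of the sketch is simply that analysis and response happen in the same loop as collection.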

Data Center Frontier: The largest cloud platforms are seeking to deploy data center capacity at an accelerated rate. What has this meant to the supply chain for data center delivery?

Gary Niederpruem: Cloud providers are deploying data center capacity in a number of ways. Sometimes they’re building in greenfield sites. Sometimes they retrofit existing facilities. Some are using colocation. A growing trend is the deployment of computing capacity at the edge, creating a distributed cloud network that can reduce network latency by managing loads closer to consumers.

That sort of unpredictability presents challenges across the supply chain. You can manage that unpredictability in three ways:

  • Strive to know your customers and their plans intimately. That allows you not only to react quickly when they make a decision, but to be involved in that decision and help them plan and grow in ways that meet their needs today and prepare them for the unexpected tomorrow.
  • Design solutions with modularity, flexibility and speed in mind. This allows technology providers to move as fast as – and in many cases faster than – their customers.
  • Streamline the supply chain and service capabilities to make them flexible and nimble. For example, Vertiv is a large, global company, and we work daily to make sure that reach is an asset rather than an impediment to speedy delivery. When customers have urgent needs anywhere in the world, well-prepared global service teams can meet those needs with the appropriate resources and expertise almost immediately.

See the entire Data Center Frontier Executive Roundtable for insights from additional executives and topical stories on the latest data center trends.