The Data Center Frontier Executive Roundtable features insights from industry executives with lengthy experience in the data center industry. Here’s a look at the insights from Erich Sanchack of Digital Realty.
Erich Sanchack is EVP of Operations at Digital Realty, responsible for overseeing global portfolio operations, global construction, colocation and interconnection service implementation as well as supply chain operations. Sanchack has extensive experience in operational and business development roles in the technology and telecommunications industries, including building and operating data centers. He previously served as Senior Vice President, IT Solutions and New Market Development at CenturyLink where he was responsible for global commercial managed services and IT service offerings. At CenturyLink, he also held the title of Senior Vice President and General Manager, Federal, where he was responsible for the company’s government sector portfolio encompassing network operations, cybersecurity and security operations, managed hosting with cloud and data management services, and a wide array of communications and IT services. Prior to CenturyLink, Sanchack served as an executive at Lockheed Martin for 12 years, where he held various Vice President roles.
Here’s the full text of Erich Sanchack’s insights from our Executive Roundtable:
Data Center Frontier: This year we have seen strong demand for data center space in international markets. What are the biggest opportunities and challenges for data center companies in operating at global scale and working with multi-national clients?
Erich Sanchack: As more colocation data center companies look to scale globally, the ability to access data center resources in a way that is consistent for the customer will become increasingly important. From the customer perspective, large multi-national companies are starting to look at the advantages of working with one colocation provider across several regions.
As organizations look to scale, they need a provider that can grow with them to allow them to reach their full potential and keep up with the evolution of their business models. Providers that offer a range of solutions across edge, hyperscale and interconnection capabilities will win.
There are challenges when implementing a global model. For example, we see security and privacy requirements becoming more specific to each region, making data and network compliance more complex. Take a regulation like GDPR: It is increasingly difficult to guarantee that certain data is only stored and used in particular geographies when that level of geographic specificity runs counter to the core benefit of interconnectivity.
Additionally, when certain tools and data sets are legally constrained to a particular country or region, promoting global interconnection can pose a challenge. Data center providers must now look at security holistically to ensure we meet both global and local mandates and can act as a consultant to our multi-national clients. We are now tasked with ensuring we can protect their assets and data effectively, without risking the gains clients want to achieve from a globally connected network.
Data Center Frontier: There’s currently huge interest in interconnection and network services. What are the most significant trends in the network features customers are seeking, and how are providers delivering these services?
Erich Sanchack: As every business becomes a digital business, point-to-point connections don’t cut it – businesses need near real-time interactions with a wide ecosystem of business partners and cloud providers, with greater security and reduced latency. Part of this trend is a growing realization that business innovation cannot happen solely within an organization and will likely need an evolving group of business partners and suppliers to establish future digital business models.
With this shift, we’re seeing a bigger trend away from dedicated connections to the outside world toward flexible SDN connections from customer cabinets through to systems in the meet-me room. By using connections such as our Service Exchange, customers can make virtual connections to a wide range of clouds, established quickly through a software portal.
As organizations look at different cloud environments for different applications, software-defined ways to interconnect become key to optimizing the cloud traffic to reduce latency, simplify connectivity and provide real-time, reliable access to disparate data sources.
Connecting to multiple clouds across a traditional hub-and-spoke WAN infrastructure is costly and can be unreliable. Using architectures such as Service Exchange for interconnection enables the use of multiple clouds securely and with low latency – optimal for emerging applications like gaming or financial services.
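To make the contrast concrete, here is a minimal, purely illustrative sketch of what a software-defined virtual connection order might look like. The class and function names are hypothetical, not the actual Service Exchange API; the point is that the connection is validated and activated entirely in software, with no physical cabling step.

```python
# Hypothetical model (not the actual Service Exchange interface) of a
# software-defined virtual connection order, activated through a portal
# rather than a physical cross-connect.

from dataclasses import dataclass

@dataclass
class VirtualConnection:
    source_cabinet: str     # customer cabinet identifier (assumed naming)
    cloud_provider: str     # target cloud on the exchange
    bandwidth_mbps: int     # requested bandwidth
    status: str = "pending"

def provision(order: VirtualConnection) -> VirtualConnection:
    """Stand-in for the portal workflow: validate the request and
    activate the connection in software."""
    if order.bandwidth_mbps <= 0:
        raise ValueError("bandwidth must be positive")
    order.status = "active"
    return order

conn = provision(VirtualConnection("CAB-114", "aws", 500))
print(conn.status)  # active
```

Because activation is a software operation, a customer can stand up or tear down connections to multiple clouds in minutes instead of waiting on a cabling work order.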
The interest in interconnectivity is also driving a change in service-level agreements. Uptime alone is no longer good enough. End-users want the increased speed that interconnectivity delivers, because a slow network has effectively the same result as one that is down entirely.
Finally, we’re seeing an interest from clients in interconnectivity driven by the proliferation of virtualization. This year, analysts expect $1 of every $10 spent on software to be devoted to virtualization. However, a network with a cloud interconnection foundation must be in place to ensure reliable access to bandwidth for virtualized applications.
Data Center Frontier: The speed of data center deployment is accelerating, with innovation in the supply chain and how facilities are built and leased. What do you see as the most important issues to address to keep pace with the rapid growth of digital infrastructure?
Erich Sanchack: We see two areas we need to address to keep up with the growth in the market. The first is the definition of high-density. With emerging applications like AI and machine learning becoming a mainstay in today’s enterprise, we have to rethink what makes a data center AI-ready and whether it can support high-density workloads. From power management to the new cooling requirements needed to keep the facility stable and running, many facilities today are not up to the task. This challenge is leading many to look to their data center partners for a purpose-built infrastructure ready for advanced computing applications.
Another factor to consider is that many digital-native businesses are looking to minimize their environmental footprint and want access to sustainable energy in the facilities they select. Digital Realty has already taken a number of key steps on this path and is fueling its US Retail Colocation business with renewable power.
Finally, we can’t ignore the people aspect of our industry. Today the talent required to build advanced facilities is incredibly scarce. The staffing needs of data centers are very complex, ranging from facility managers who oversee the power supply and cooling to those who manage the networking technology that enables better data flow. With many who started at the advent of the industry now retiring, providers need to invest in training to skill up incoming workers, and in automation to handle routine tasks and augment the abilities of the existing workforce to keep up with demand.
Data Center Frontier: What lies ahead for data center automation? What are the key opportunities in using data (and perhaps even AI) to make data centers more efficient and productive?
Erich Sanchack: Automation is critical to addressing the growing workforce shortage we see in the industry, and most importantly to help guide our customers through the onboarding process in the most efficient and consistent way. We believe that embracing more automated processes will allow data center operators to concentrate more on higher-order, high impact tasks that require human oversight. Power management, service provisioning and predictive maintenance are three areas that are ripe for automation applications.
Automation technologies like AI and machine learning can manage the cooling and power management requirements of the data center – optimizing based on the applications and tenants and improving energy efficiency. AI can help model future power demand, reducing energy costs and improving overall system reliability. In this case, automation can deliver efficiency gains upwards of 5 percent, having a big impact on the data center’s environmental footprint and profitability.
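As a simplified illustration of the power-demand modeling described above, the sketch below fits a least-squares trend line to recent hourly power readings and extrapolates forward. The data and function are hypothetical; a production system would use a far richer model, but the shape of the problem – learn from historical draw, forecast ahead, plan capacity – is the same.

```python
# Hypothetical sketch (assumed data, not Digital Realty's tooling):
# forecasting facility power draw by fitting a least-squares trend line
# to recent hourly kW readings and extrapolating.

def forecast_power(readings, steps_ahead):
    """Fit y = a + b*t to hourly kW readings, then extrapolate."""
    n = len(readings)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_y = sum(readings) / n
    cov = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, readings))
    var = sum((t - mean_t) ** 2 for t in ts)
    slope = cov / var
    intercept = mean_y - slope * mean_t
    # Forecast the next steps_ahead hours past the end of the history
    return [intercept + slope * (n - 1 + k) for k in range(1, steps_ahead + 1)]

# Example: load rising steadily by 2 kW per hour over the last 24 hours
history = [500 + 2 * h for h in range(24)]
print(forecast_power(history, 3))  # next three hours: ~[548, 550, 552]
```

Even a crude forecast like this lets an operator pre-position cooling capacity or shift flexible workloads before a demand peak, rather than reacting after it arrives.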
In predictive maintenance, machine learning can also play a big role. Once we have enough data on the facility available, machine learning can take the guesswork out of maintenance operations, correlating equipment types, power usage, performance and incident data to flag failures before they occur. This is a largely untapped use-case today, but it can give us huge gains in productivity and uptime industry-wide, while reducing waste and the associated costs.
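A minimal version of this idea is anomaly detection against each unit's own baseline: flag any piece of equipment whose latest sensor reading drifts several standard deviations from its history. The unit names and readings below are invented for illustration; real predictive maintenance would combine many signals, but this shows the basic mechanism.

```python
# Hypothetical sketch (invented data, not a production system): flag
# equipment whose latest reading is far outside its historical baseline,
# a simple precursor to ML-driven predictive maintenance.

from statistics import mean, stdev

def flag_anomalies(history_by_unit, latest, z_threshold=3.0):
    """Return unit IDs whose latest reading sits more than z_threshold
    standard deviations from that unit's historical mean."""
    flagged = []
    for unit, history in history_by_unit.items():
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(latest[unit] - mu) / sigma > z_threshold:
            flagged.append(unit)
    return flagged

# Example: CRAH-2's supply-air temperature has jumped well above baseline
history = {"CRAH-1": [18.0, 18.2, 17.9, 18.1],
           "CRAH-2": [18.0, 18.1, 17.9, 18.2]}
latest = {"CRAH-1": 18.0, "CRAH-2": 24.5}
print(flag_anomalies(history, latest))  # ['CRAH-2']
```

Catching that drift early turns an unplanned outage into a scheduled maintenance window, which is where the productivity and uptime gains come from.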
We also see an upward trend in the adoption of data center management platforms that automate management and access for customers. These types of portals allow customers to ‘self-serve’ their virtual connections, reducing the need for trained traffic engineering staff and delivering a more consistent experience across locations.