It’s Time to Reimagine and Reinvent the Data Center

May 7, 2018
Colocation allows businesses to shift staff and resources from data center management to focus on IT infrastructure and business priorities.  In this edition of Voices of the Industry, Randy Rowland, President of Data Center Services at Cyxtera, explores the evolving data center and what’s in store for the future. 


Randy Rowland, President of Data Center Services at Cyxtera

Regulated industries like healthcare and financial services, as well as manufacturing, retail and the public sector, typically need full control over their infrastructure and data. For these organizations, infrastructure colocation makes sense.

Colocation allows businesses to shift staff and resources from data center management to focus on IT infrastructure and business priorities. These businesses leverage a colocation provider’s expertise and economies of scale while gaining improved uptime and availability. It’s also more cost-effective, as organizations avoid large capital outlays or long-term leases for building and refreshing data center infrastructure. In addition, they get better access to network bandwidth than what’s available in an in-house data center. Capacity is scaled as needed, so building or leasing plans can shift from a 10-to-15-year horizon to a one-to-five-year colocation service contract.

Since the start of the colocation industry in the 1990s, IT equipment has advanced significantly, roughly following Moore’s Law and doubling in performance every 12 to 18 months. Network bandwidth has also grown exponentially. However, with the exception of incremental energy-efficiency improvements, the colocation service itself – space, power and cooling – and how it’s provisioned have changed very little in decades.

The Achilles’ Heel of Colocation

While colocation is an ideal option for many, there are some limitations.

Unfortunately, planning, procuring and deploying new IT infrastructure is a time-consuming process. It can take three to six months to deploy a single application: provisioning circuits, building out infrastructure in a colocation cage, installing and configuring the hypervisor, and finally loading and testing the application. It simply takes too long. Enterprise IT should be focused on supporting and driving the business, not spending time with cable crimpers and screwdrivers in hand, installing and configuring IT infrastructure.

Traditional IT infrastructure builds in colocation require organizations to design and provision capacity for peak loads on day one. Quickly scaling up and down isn’t possible. If there isn’t enough storage capacity, IT needs to buy additional disks and storage controllers. If the workload is memory- or processor-bound, additional servers or blades are required. These periodic large capital outlays and step-function expansions make for a CapEx-heavy model that contributes to high operating costs.
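To see why peak-sizing is so costly, consider a rough back-of-envelope model. Every figure below is an illustrative assumption, not a number from any specific deployment or price list; a minimal sketch of the arithmetic:

```python
# Back-of-envelope model of peak-provisioned capacity vs. actual use.
# All figures are illustrative assumptions for discussion only.

peak_demand_kw = 100        # assumed draw needed at the busiest hour
average_demand_kw = 35      # assumed typical draw across the year
cost_per_kw_month = 150.0   # assumed all-in colocation cost per kW per month

# Traditional build: capacity is sized for the peak on day one.
provisioned_kw = peak_demand_kw
monthly_cost = provisioned_kw * cost_per_kw_month

utilization = average_demand_kw / provisioned_kw
idle_spend = monthly_cost * (1 - utilization)

print(f"Provisioned capacity: {provisioned_kw} kW")
print(f"Monthly cost: ${monthly_cost:,.0f}")
print(f"Average utilization: {utilization:.0%}")
print(f"Monthly spend carrying idle headroom: ${idle_spend:,.0f}")
```

Under these assumed numbers, roughly two-thirds of each month’s spend pays for headroom that sits idle outside peak periods.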

These limitations lead many organizations to turn to the public cloud. But while the cloud offers many benefits, it has drawbacks of its own.

Cloud Isn’t Always the Answer

With public cloud, businesses avoid capital outlays and can rapidly deploy and scale as needed. There is no need for onsite data center staff, and often there are no long-term contracts. But there are issues here as well.

One of the main concerns is the risk inherent in shared multi-tenant IT infrastructure and the lack of control over it. Despite its obvious benefits, public cloud’s shared infrastructure remains a point of contention for many because of fears about host data breaches, outages or service degradation that are outside their control. Additionally, some applications simply are not ready for the cloud or properly architected for it.

Costs come into play too. While the cloud can be cost-effective in the beginning, cloud-bursting capability comes at a premium. So if you’re simply running a steady-state workload in the cloud, the financial benefits aren’t nearly as compelling.
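The same point can be made with simple arithmetic. Every rate in the sketch below is an assumed placeholder rather than a quoted price from any cloud or colocation provider:

```python
# Rough comparison of a steady-state workload on public cloud vs. a
# fixed dedicated environment. All rates are assumed placeholders.

hours_per_month = 730

cloud_rate_per_hour = 4.00      # assumed hourly rate for an always-on fleet
steady_state_cloud = cloud_rate_per_hour * hours_per_month

dedicated_monthly = 2200.0      # assumed flat monthly cost for dedicated gear

print(f"Steady-state workload on cloud:   ${steady_state_cloud:,.0f}/month")
print(f"Equivalent dedicated environment: ${dedicated_monthly:,.0f}/month")

# Elasticity still wins when demand is genuinely spiky: if the same fleet
# only runs during bursts, the cloud bill shrinks accordingly.
burst_hours = 120               # assumed hours of real demand in the month
print(f"Cloud fleet used only for bursts: "
      f"${cloud_rate_per_hour * burst_hours:,.0f}/month")
```

With these assumptions, an always-on workload costs more in the cloud than on dedicated infrastructure, while a genuinely bursty one does not; the economics hinge on the usage pattern.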

It’s Time for On-Demand Colocation

What if dedicated hardware and network infrastructure could be more cloud-like in its deployment and consumption model? Enterprises would no longer have to choose between control and agility.

It’s time for groundbreaking innovation in data center colocation. Leveraging a software-powered architecture, the time it takes to deploy dedicated IT environments can be shortened dramatically – from three to six months down to a matter of days. This covers everything from hardware design and procurement to colocation space selection and build-out, network provisioning, and hyperconverged infrastructure and hypervisor installation.
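As a sketch of what this looks like from the customer’s side, the snippet below builds a declarative request for a dedicated environment. The field names, the values and the idea of submitting the payload to a provider API are illustrative assumptions, not a description of Cyxtera’s or any other vendor’s actual interface:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class EnvironmentRequest:
    """Hypothetical declarative spec for a dedicated colocation environment."""
    site: str             # target data center or metro
    nodes: int            # dedicated hyperconverged nodes
    hypervisor: str       # hypervisor image to preinstall
    bandwidth_mbps: int   # provisioned network capacity
    term_months: int      # contract term

request = EnvironmentRequest(
    site="example-metro-1",
    nodes=4,
    hypervisor="example-hypervisor-7.x",
    bandwidth_mbps=1000,
    term_months=12,
)

# In an on-demand model, a payload like this would be submitted to the
# provider's provisioning API in place of a months-long procurement and
# build-out cycle.
print(json.dumps(asdict(request), indent=2))
```

The point of the declarative shape is that capacity decisions become parameters in a request rather than projects in their own right.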

In this new normal, all colocation costs are OpEx instead of CapEx, which helps eliminate the complexities of procurement, logistics and capital equipment management. Organizations can take advantage of on-demand deployment and consumption, speed time to market and run “headless” data center operations while avoiding the risks and loss of control associated with public cloud platforms. Finally, on-demand colocation eliminates the need to over-purchase capacity for peak loads, so organizations don’t pay for expensive unused infrastructure.

As you decide which of your applications belong in the public cloud and which are best operated out of the data center, you shouldn’t have to compromise agility, flexibility or control of your infrastructure.

It’s time to improve the colocation experience by making it much more on-demand, elastic and software-enabled.

Randy Rowland is President of Data Center Services at Cyxtera. Connect with Randy on LinkedIn.

About the Author

Voices of the Industry

Our Voices of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
