The Blurring Boundaries Between Wholesale and Colo

Jan. 6, 2016
The boundaries between colocation and wholesale data center offerings are blurring. Our Executive Roundtable experts discuss why this is happening, and what it means for the multi-tenant data center industry.

Today we continue our Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry, and where it is headed. In today’s discussion, our panel of three experienced data center executives – Jakob Carnemark of Aligned Data Centers, Rob McClary of FORTRUST, and James Leach of RagingWire Data Centers – will examine trends in the business models for multi-tenant data centers. The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier.

Data Center Business Models: Wholesale Meets Colo

Data Center Frontier: Analysts say the boundaries between colocation and wholesale data center offerings are blurring. Is this trend real, and if so, is it likely to continue? How are customers choosing between the different data center procurement models?

Rob McClary: It's very simple. Wholesale is a real estate-based, landlord-to-tenant model. Colocation is a service-provider model. The lines are blurred because each wants to be the other without fully adopting the other's model. You have wholesale real estate offerings that don't understand the service-provider model, and service-provider offerings that don't understand the real estate model. They are two different things.

Both service providers and wholesale providers want the best of both worlds. The picture blurs further, in large part, because end users and customers expect a service-provider experience but want wholesale pricing. This trend is real, and it is likely to continue.

The problem is that customers' expectations are not aligned with the model they are seeking. It's ultimately an education issue.

James Leach: We are entering the era of “Big Colo” where both data center providers and buyers will be the winners.

For the data center provider, the primary drivers of Big Colo are economies of scale. For example, large generators and UPS systems can support multiple megawatts of load, and the incremental cost goes down as capacity increases. From a financial perspective, the capital expense required to build a data center is significant, often exceeding $100 million. Data center providers can spread these costs over time using phased build designs and long-term depreciation schedules. The result is that Big Colo providers can offer a better product at a lower price.
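
To make the scale economics concrete, here is a minimal Python sketch of how a fixed site cost gets amortized as a facility grows. All of the dollar figures are hypothetical, chosen only to illustrate the shape of the curve Leach describes; they are not any provider's actual numbers.

```python
# Hypothetical illustration of data center economies of scale: fixed
# site costs (land, shell, engineering) are spread across more megawatts
# as a facility grows, so the unit cost of capacity falls.
# All dollar figures below are invented for illustration only.

FIXED_SITE_COST = 40_000_000        # land, shell, design (hypothetical)
COST_PER_MW_INSTALLED = 7_000_000   # generators, UPS, cooling per MW (hypothetical)

def cost_per_mw(total_mw: float) -> float:
    """Total build cost divided by total capacity."""
    return (FIXED_SITE_COST + COST_PER_MW_INSTALLED * total_mw) / total_mw

for mw in (2, 10, 40):
    print(f"{mw:>3} MW facility -> ${cost_per_mw(mw) / 1e6:.1f}M per MW")
# Output:  2 MW -> $27.0M/MW, 10 MW -> $11.0M/MW, 40 MW -> $8.0M/MW
```

Under these assumed inputs, a 40 MW build delivers capacity at less than a third of the unit cost of a 2 MW build, which is the scale advantage the Big Colo argument rests on.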

For data center buyers, Big Colo can deliver scalable power from a few hundred kilowatts to multiple megawatts, sophisticated configurations of dedicated and shared infrastructure, and flexible deployments from racks to cages to suites. Big Colo sites become hub locations for telecommunications and cloud providers, offering customers an integrated platform for enterprise systems and internet apps. These locations also become job sites for a broad set of data center services providers delivering on-site moves, adds, changes, repairs, and maintenance. The result is that data center buyers can lease a portion of a superior facility, paying only for what they use and avoiding both the large up-front capital expense to build the facility and the operational expense to run it.

Jakob Carnemark: Customers are looking for flexibility and control. For too long they have been forced to take capacity according to an artificial, fixed-ramp schedule that left them paying for more capacity than they needed or used. This model is nearing the end of its useful life.

Customers have become accustomed to the cloud model of pay-for-use and are starting to demand a similar approach for their colocation requirements. In terms of retail and wholesale, we don’t believe in drawing boundaries. The customer’s needs are changing, IT hardware is evolving, and the cloud is offering them options they have never had before.
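
A minimal sketch of the gap between the two billing models Carnemark contrasts: a fixed-ramp commitment versus cloud-style pay-for-use. The ramp schedule, usage figures, and rate below are all invented for illustration; no provider's actual pricing is implied.

```python
# Hypothetical comparison of a fixed-ramp colocation contract versus a
# pay-for-use model. Every number here is invented for illustration.

RATE_PER_KW_MONTH = 150  # assumed all-in $/kW/month

# Committed kW per quarter under a fixed-ramp contract (hypothetical)
fixed_ramp = [500, 1000, 1500, 2000]
# kW the customer actually deploys each quarter (hypothetical)
actual_use = [300, 450, 700, 1100]

# Each quarter covers three months of billing.
committed_cost = sum(kw * RATE_PER_KW_MONTH * 3 for kw in fixed_ramp)
metered_cost = sum(kw * RATE_PER_KW_MONTH * 3 for kw in actual_use)

print(f"Fixed-ramp spend:         ${committed_cost:,}")   # $2,250,000
print(f"Pay-for-use spend:        ${metered_cost:,}")     # $1,147,500
print(f"Paid for unused capacity: ${committed_cost - metered_cost:,}")
```

With these assumed figures, roughly half the customer's first-year spend goes to capacity it never deployed, which is the mismatch the pay-for-use model is meant to eliminate.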

Our data centers are engineered to accommodate requirements at any scale. We believe in a utility model for compute infrastructure. Customers want this flexibility and control over how they consume capacity, and often their needs do not fit neatly into a retail or wholesale definition.

NEXT: Progress on Aisle Containment for Data Center Cooling

Keep pace with the fast-moving world of data centers and cloud computing by following us on Twitter and Facebook, connecting with me on LinkedIn, and signing up for our weekly newsletter.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
