Colocation: The Business & Technical Factors Driving Adoption

April 8, 2020
Businesses increasingly depend upon secure, reliable availability and high-speed connectivity as they pursue digital transformation. The third entry in a new special report series focuses on the business and technical drivers behind the move to colocation. 

In a new special report from CoreSite and Data Center Frontier, we take a look at how colocation can serve as the “nervous system” of today’s modern digital businesses. This third entry in the special report series focuses on the business and technical drivers behind the move to colocation.


Businesses increasingly depend upon secure, reliable availability and high-speed connectivity as they pursue the goal of digital transformation. Downtime is no longer tolerable. One recent study found that the cost of an hour of downtime exceeds $300,000 in most cases and can run over $5 million for some businesses.

Poor performance also carries a penalty. Another study reported that a one-second delay in webpage loading time results in 11% fewer page views, a 16% drop in customer satisfaction ratings and 7% fewer conversions. Customers said they expect a webpage to load within two seconds on average before they consider abandoning a site. 
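
To put those figures in context, here is a minimal back-of-the-envelope sketch in Python that turns the numbers quoted above into annual dollar impact. The annual downtime hours and online revenue are illustrative assumptions, not data from either study.

    # Back-of-the-envelope impact estimate using the figures quoted above.
    # The annual downtime hours and online revenue are hypothetical inputs.

    HOURLY_DOWNTIME_COST = 300_000      # USD, low end of the "exceeds $300,000" finding
    ANNUAL_DOWNTIME_HOURS = 4           # assumption: roughly 99.95% availability

    CONVERSION_DROP_PER_SECOND = 0.07   # 7% fewer conversions per one-second delay
    ANNUAL_ONLINE_REVENUE = 50_000_000  # assumption: hypothetical e-commerce revenue

    downtime_cost = HOURLY_DOWNTIME_COST * ANNUAL_DOWNTIME_HOURS
    slow_page_cost = ANNUAL_ONLINE_REVENUE * CONVERSION_DROP_PER_SECOND

    print(f"Estimated annual downtime cost:       ${downtime_cost:,.0f}")    # $1,200,000
    print(f"Estimated cost of a one-second delay: ${slow_page_cost:,.0f}")   # $3,500,000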

IT is also no longer exclusively a back-office function. Many companies now conduct large amounts of business online, meaning that the speed and availability of their production systems directly impact relationships with suppliers and customers. With e-commerce expected to be a $740 billion market by 2023, high-performing, highly available infrastructure is simply a cost of doing business.

Digital business places new demands on IT infrastructure in other ways as well. 

  • Website performance and availability are critical to establishing customer confidence as well as articulating brand value and mission. In the early days of the Internet, customers learned to tolerate occasional outages to get the information and services they needed. Today they simply leave. 
  • Organizations trade data with business partners as part of the management of their supply chains. Data must frequently be exchanged in near real time to expedite deliveries, track shipments and identify fraudulent transactions. Latency and downtime are big problems.
  • Marketing organizations subscribe to a multitude of data services that provide information that can be used for customer profiling and personalization. The data volumes can be very large and multiple streams often have to be merged into a single database.
  • Online advertising requires moving and analyzing massive amounts of data for split-second programmatic decision-making.
  • Fleet management, package tracking and logistics systems often require up-to-the-second data about the location of assets to enable operational efficiency.
  • In some industries, like financial services, latency is the difference between success and bankruptcy.

Digital business is also redefining relationships that were once a patchwork of point connections into an integrated mesh in which organizations may transact business with each other constantly, occasionally or sporadically. Connections are unpredictable and opportunistic. Arbitrage systems compare prices and availability from multiple suppliers to enable transactions to be performed at the lowest cost.

Applications built on microservices or serverless platforms must access data and services from a multitude of sources to work properly. Cloud workloads and data need to be shifted seamlessly between platforms to give customers the best balance of performance and cost. These use cases and many others require fast, fluid connections that can be set up and torn down in seconds.

At the same time, digital transformation is prompting organizations to carefully assess core competencies and choose which operations to keep in house and which to outsource. SaaS, which at $85 billion is by far the largest segment of the overall cloud market, is emblematic of this trend. Organizations that adopt SaaS recognize that installing, patching and upgrading software isn’t a core competency and are eagerly outsourcing that function to specialty vendors.

The custom applications that enterprises create are increasingly collections of SaaS services woven together in unique ways. Reliable, high-bandwidth connections are essential to supporting this new functionality.

The Technical Drivers of Colocation

As noted earlier, the IT world is increasingly multi-cloud and hybrid cloud. IT leaders want maximum flexibility to deploy workloads where they make sense, to provision additional infrastructure for peak load periods and to shift workloads easily between on-premises and multiple cloud platforms.

Point-to-point connections between entities over the public Internet or privately provisioned lines are erratic and expensive. Shared connections are vulnerable to traffic bursts that may impact performance for everyone as well as anomalous events such as denial of service attacks. Cost is also an issue. The faster and more reliable the connection needs to be, the higher the price tag. Latency is inherent in connections that traverse long distances, making the public Internet a risky place to do business where time is a factor.
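
As a rough illustration of why distance alone imposes a latency floor, the sketch below estimates propagation delay over long-haul fiber. The 6,000 km route is a hypothetical transatlantic path; light in fiber travels at roughly two-thirds the speed of light in a vacuum.

    # Propagation-delay sketch: latency that no amount of bandwidth can remove.
    FIBER_SPEED_KM_PER_S = 200_000   # ~2/3 of c, typical for optical fiber
    ROUTE_KM = 6_000                 # hypothetical transatlantic route length

    one_way_ms = ROUTE_KM / FIBER_SPEED_KM_PER_S * 1_000
    round_trip_ms = 2 * one_way_ms

    print(f"One-way propagation delay: {one_way_ms:.0f} ms")    # ~30 ms
    print(f"Minimum round-trip time:   {round_trip_ms:.0f} ms") # ~60 ms before routing and queuing add more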

In some applications, conventional Internet connections can never provide the reliability and performance that is required. For example, advertising networks thrive on high-speed communications. In many cases, brokers can deliver ads on a page before the rest of the content even loads. The faster the throughput, the more quickly ads can be matched to available inventory and served on a timely basis.

Another throughput-intensive application is the transfer and processing of media such as video and high-resolution images. Film editors, for example, work with multi-terabyte-sized files that may need to be shared with partners in other parts of the world to enable “follow the sun” editing. In many cases it is physically impossible to reliably transfer those files over a standard Internet connection at the speed the business demands. Architectural firms, ad agencies, photo bureaus and mapping firms are among the many other types of businesses that work with extremely large file sizes.
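
To see why, consider a simple transfer-time estimate. The file size, link speed and throughput efficiency below are illustrative assumptions, but the arithmetic shows how quickly “standard” connectivity becomes the bottleneck.

    # Transfer-time estimate for a multi-terabyte media file.
    FILE_TB = 5          # hypothetical multi-terabyte project file
    LINK_GBPS = 1        # a typical 1 Gbps business Internet connection
    EFFICIENCY = 0.8     # assumption: ~80% of line rate usable end to end

    bits = FILE_TB * 8 * 10**12
    seconds = bits / (LINK_GBPS * 10**9 * EFFICIENCY)

    print(f"Transfer time: {seconds / 3600:.1f} hours")   # ~13.9 hours, per direction, per revision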

A network architecture that best reflects the structure of modern business uses a distributed model in which data is processed close to the point of origin or decision. A good analogy is commercial aviation. There are more than 5,000 airports serving the general public in the U.S. alone. If all travel were point-to-point between them, the cost and complexity would make flying prohibitively expensive. The airline industry long ago adopted a network model based on large hubs that connect to smaller airports. While it may take longer for travelers to get to their destination, they have more scheduling options, greater flexibility to re-route around adverse weather conditions and much lower costs.
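
The same economics show up in simple connection counts. A minimal sketch, using the airport figure from the analogy, compares a full point-to-point mesh with a hub model; the hub count is simplified to one link per site.

    # Link-count comparison behind the hub-and-spoke analogy.
    n_sites = 5_000   # the ~5,000 U.S. public airports mentioned above

    full_mesh_links = n_sites * (n_sites - 1) // 2   # every site paired with every other
    hub_links = n_sites                              # simplified: one link per site to a hub

    print(f"Full point-to-point links: {full_mesh_links:,}")   # 12,497,500
    print(f"Hub-and-spoke links:       {hub_links:,}")         # 5,000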

Fortunately, distributed networks can have the best of both worlds. High-speed backbones between network hubs ensure that traffic flows smoothly and quickly. Each hub can house multiple customers who can connect with each other directly as well as with business partners in other locations over the high-speed backbone. Messages routed through central hubs can be directed more efficiently and less expensively to their destinations because all traffic is managed centrally. However, customers aren’t precluded from connecting to each other independently of the hub. They can move processing to the edge and use the high-speed backbone only when needed.

As a result, costs are lower because traffic doesn’t have to traverse leased lines or the public Internet. A hub enables customers to connect to each other, to value-added service providers and to major cloud platforms. The result is a cost-effective, secure and high-performance alternative to long-distance dedicated connections from one on-premises data center to another.

Catch up on the first and second entries in this special report series.

And stay tuned. This special report from CoreSite and Data Center Frontier will conclude next week with the following post:

  • Understanding interconnection + why/when to use a colocation provider

Download the full report, How Colocation Can Be the Nervous System of Digital Business, courtesy of CoreSite and Data Center Frontier, which explores how colocation might just be the answer to remaining competitive in today’s markets.

About the Author

Paul Gillin

Paul Gillin is a speaker, writer and technology journalist who has written five books and more than 400 articles on the topic of social media and digital marketing. A technology journalist for 25 years, he has served as Enterprise Editor of the tech news site SiliconAngle since 2014. He was founding editor-in-chief of B2B technology publisher TechTarget, and served as editor-in-chief and executive editor of the technology weekly Computerworld.
