Ecosystems at the Edge: Cloud vs. Colocation
Last week in our special report series, we looked at why interconnection is key for edge data centers. We also explored some of the factors to consider when selecting a colocation provider. In this final article in the series, we compare cloud and colocation.
Cloud vs. Colocation
The major cloud computing providers are all aggressively deploying edge infrastructure in similar ways. Their objective is to extend their existing platforms to core and edge data centers so that customers have identical functionality regardless of location. To do this, the big providers are pursuing a two-pronged strategy of installing their infrastructure inside customers’ data centers and in local telecommunications facilities. There are several advantages to this approach, chief among them the ability for cloud providers to quickly provision infrastructure that is fully compatible with their existing cloud regions.
However, limitations on size, bandwidth and compute capacity may make it difficult for some providers to deliver the full range of capabilities customers need to build scalable edge architectures. A breakdown of services by region shows that availability can vary significantly by location.
Most of the major public cloud providers don’t operate a network of wholly-owned edge colocation facilities, so they must rely on partnerships and workforces they don’t control. These relationships will take some time to develop and can be fragile.
Edge infrastructure that isn’t designed to be managed remotely or that must be administered by contract personnel can be a risky proposition in far-flung locations. And self-contained miniature data centers are prone to theft, weather damage and unplanned outages. Most companies building out edge networks will want infrastructure that can’t be taken offline by a traffic accident.
Over time, cloud providers will no doubt address these structural limitations, but some customers may not want to wait. The fastest, safest and most flexible option for them is to locate edge infrastructure in established regional colocation facilities that are fully equipped and staffed by trained personnel and that already support local ecosystems. The advantages include:
- ROI is faster because regional colocation providers already have the facilities and relationships to support edge buildouts.
- Regional colocation centers can scale to meet capacity demands and, in most cases, are fully compliant with relevant regulations.
- Staff are trained on and familiar with the equipment they work with every day.
- Facilities are secure and reliable with power, environmental and seismic controls that already meet local requirements.
- The staff speaks the local language and understands the culture and expectations of the ultimate end-users of edge services.
- Connections to nearby services can be quickly facilitated through peering and interconnection.
Most colocation providers also have existing ecosystems of customers and partners to whom new customers can connect. For example, Iron Mountain Data Centers’ rapidly growing ecosystem of hundreds of third-party partners provides specialized network connectivity, access to fiber networks, software-defined networks, streaming services, specialized peering solutions, vertical market expertise and access to data centers in specific geographic and remote locations.
Many established colocation providers have also adopted public application programming interfaces (APIs) for rapid onboarding of customers and partners, and they maintain back-end integrations with a wide variety of other providers, including telcos, cloud platforms and colocation partners.
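To make the onboarding idea concrete, the sketch below shows roughly what ordering a virtual interconnection through a colocation provider’s public API might look like. It is a minimal illustration only: the host api.colo-provider.example, the /interconnections endpoint and the payload fields are hypothetical placeholders, not any specific provider’s interface.

```python
# Minimal sketch of programmatic onboarding through a colocation provider's
# public API. The host, path and payload fields below are hypothetical
# placeholders -- consult your provider's API documentation for real values.
import os
import requests

API_BASE = "https://api.colo-provider.example/v1"                 # hypothetical endpoint
API_TOKEN = os.environ.get("COLO_API_TOKEN", "replace-with-token")  # issued during onboarding

headers = {
    "Authorization": f"Bearer {API_TOKEN}",
    "Content-Type": "application/json",
}

# Request a virtual interconnection from a cage in one metro to a cloud on-ramp.
order = {
    "location": "AMS-1",                    # provider-assigned site code
    "service": "virtual-cross-connect",
    "bandwidth_mbps": 1000,
    "remote_party": "public-cloud-onramp",
}

resp = requests.post(
    f"{API_BASE}/interconnections", json=order, headers=headers, timeout=30
)
resp.raise_for_status()
print("Order accepted:", resp.json().get("order_id"))
```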
Download the full report, “The State of Data Center Cooling: A Key Point in Industry Evolution and Liquid Cooling,” courtesy of TMGcore, for an exclusive interview with Paul Gillin, Rich Miller, and Mark Lewis and Mark Kidd of Iron Mountain.