Edge Computing is Poised to Remake the Data Center Landscape

Nov. 19, 2018
Data center leaders are investing in edge computing and edge solutions and actively looking at new ways to deploy edge capacity to support evolving business and user requirements.

Data is everywhere. As a digital transformation sweeps across our society and economy, Internet infrastructure must follow the data. Enter edge computing, which extends data processing and storage closer to the growing universe of devices and sensors at the edge of the network.

The goal of edge computing is to process data and deliver services as close to the end user as possible. It’s an architecture that allows compute and content delivery to happen within 10 milliseconds or less of the user. The trends driving the edge computing model are the increased use of consumer mobile devices, especially for video and virtual reality content, and the growth of sensors as part of the Internet of Things.
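
To get a sense of how tight a 10 millisecond budget is, the rough sketch below (a back-of-envelope illustration, assuming signals travel through fiber at about 200,000 km per second and ignoring routing, queuing and processing time) estimates round-trip propagation delay at various distances.

```python
# Back-of-envelope estimate of round-trip propagation delay vs. distance.
# Assumes ~200,000 km/s signal speed in fiber and ignores routing, queuing
# and processing time, so real-world latency will be higher.

FIBER_KM_PER_MS = 200.0  # roughly two-thirds the speed of light, per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a given one-way distance."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (50, 200, 500, 1000, 2000):
    print(f"{km:>5} km away -> ~{round_trip_ms(km):.1f} ms round trip (propagation only)")
```

Even under these ideal assumptions, a user roughly 1,000 km from the nearest compute consumes the entire 10 millisecond budget on propagation alone, which is why the compute has to move closer to the user.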

Today’s data center leaders are looking for new ways to deploy edge capacity and support users, including businesses and consumers.

One of our core theses at Data Center Frontier is that the convergence of cloud computing, Big Data and the Internet of Things will require the data center industry to expand well beyond the “Big Six” markets – Northern Virginia, New York/New Jersey, Chicago, Dallas, Silicon Valley and Los Angeles – that have traditionally been magnets for third-party service providers.

So why are we hearing about “edge” everywhere we go these days? The answer is simple: because users are demanding it.

According to a recent AFCOM State of the Data Center Industry study, edge solutions are one of the top areas of focus for data center end users. Forty-four percent of respondents have already deployed some form of edge computing capacity or say they will do so over the next 12 months, according to the study. Looking out over the next three years, another 17 percent of respondents have edge computing in their business plans.

This growth is also evident in the power densities the industry is seeing in edge data centers themselves. According to the AFCOM report, the estimated average power density in edge compute deployments is 6.7 kW per rack. Edge computing is making waves and changing the way data centers are built and where they are deployed.

Recent research from Cisco outlined some of the biggest factors behind the edge boom, many of which have to do with connectivity and mobility:

  • 4G networks are already becoming constrained.
  • Smartphone traffic will exceed PC traffic by 2021.
  • Traffic from wireless and mobile devices will account for more than 63 percent of total IP traffic by 2021.
  • Global mobile data traffic grew 63 percent in 2016.
  • Almost half a billion (429 million) mobile devices and connections were added in 2016.
  • Mobile network (cellular) connection speeds grew more than 3-fold in 2016.

And the introduction of 5G wireless connectivity will accelerate the trend of edge data center networks extending their reach closer to end users than ever before.

The Intersection of Edge and ‘Connected Devices’

Much of the shift to the edge can be attributed to users and IT platforms being so distributed, as well as the rise of the Internet of Things. Data center and business leaders are actively investing in IoT solutions, specifically those that support IoT devices and all of the users accessing that data.

Edge computing will take data everywhere, including the floor of the ocean, as is the case for Microsoft’s Project Natick deployment in Scotland. (Photo by Scott Eklund/Red Box Pictures for Microsoft).

Edge is also being used to address the latency challenges of today’s highly distributed environments. Latency is the time required to transmit a packet across a network; to a business, it can also mean lost revenue or a lost competitive edge. Amazon, for example, found that every 100 ms of latency cost it 1 percent in sales.

You don’t need to be the size of Google or Amazon to feel the impact of network slowness. Latency needs to be addressed, and edge computing helps remove network barriers and tackle those challenges.
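
As a minimal way to put a number on that slowness, the sketch below (an illustrative probe, not a production tool; the endpoint shown is a placeholder) times a TCP handshake to a host of your choosing as a rough proxy for network round-trip time.

```python
# Minimal latency probe: times a TCP handshake to a given host and port as a
# rough proxy for round-trip time. The endpoint below is a placeholder.
import socket
import time

def tcp_connect_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Return the time, in milliseconds, taken to establish a TCP connection."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; closed immediately on exit
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    host = "example.com"  # placeholder; substitute the service you care about
    samples = [tcp_connect_ms(host) for _ in range(5)]
    print(f"{host}: min {min(samples):.1f} ms, avg {sum(samples)/len(samples):.1f} ms")
```

Running a probe like this against a distant region and then against a nearby point of presence makes the case for edge capacity concrete: the difference you measure is the latency an edge deployment is designed to remove.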

Edge computing can also help with a variety of tasks, since edge solutions revolve around the specific use case. From delivering applications to enabling virtual desktops to delivering data to users and their systems, the edge can help. Here’s a rundown of some of the processes edge computing can assist with:

  • Software-defined solutions that can be provisioned based on the needs of your application
  • Branch and micro data centers
  • Hybrid cloud connectivity
  • IoT processing
  • Firewall and network security
  • Internet-enabled devices and sensors collecting and analyzing real-time data
  • Connecting entire networks of devices
  • Asset tracking
  • Streamlining research
  • Pharmaceutical, manufacturing and corporate inventory
  • Reducing latency for specific services
  • Supporting delivery requirements for latency-sensitive data points and applications

Edge Computing Flexibility & Ops Challenges

It turns out that edge computing is evolving in tiers, with opportunities in regional data hubs and small cities, at telecom towers, and on devices. Think of it in terms of “edge, edgier and edgiest.”

These distinctions are still evolving, and are being closely tracked by data center providers, who are seeking to clarify their strategies for emerging opportunities in edge computing. Companies are deciding where they’re going to play, and when the business will arrive. Just as there are many edges, there are many ways to compete and capture business.

In many ways, edge computing is entering a phase similar to the “cloud muddle” of 2012-14, with dueling notions of how to define the technology. We believe that as with cloud computing, there will be many flavors of success with edge computing, writ both large and small.

A shift to more distributed infrastructure, even if it occurs gradually, presents both opportunity and disruption. Data center providers, tower companies and mobile operators are deciding how best to build and monetize their edge computing infrastructure. Edge computing is evolving into four segments, which interact and overlap:

  • Data centers in regional markets and smaller cities
  • Micro data centers at telecom towers
  • On-site IT enclosures and appliances to support IoT workloads (often referred to as the “fog” layer)
  • End-user devices, including everything from smart speakers to drones and autonomous cars.

The entire concept of edge is to “meet the users where they are.” To do that, flexibility is key, and edge computing offers tremendous potential benefits in how you deploy edge solutions and manage data.

For example, you can now deliver modular edge data center infrastructure solutions that provide standardized deployment options and give you the flexibility to meet today’s data center demands. From the customer’s perspective, edge computing can be any service or architecture that helps simplify and localize the delivery of applications, data sets and services. This means that edge design is flexible and caters specifically to high-performance and latency-sensitive applications.

Getting Started with Edge

As edge computing creates demand for different types of facilities in different places, the design and capacity of local infrastructure will be guided by the workloads. As the volume of data grows, and that data moves across the network, the growth of edge computing creates a ripple effect that could generate business hundreds of miles from the edge facilities.

An SRP DataStation housed at the intersection of utility transmission lines offers one vision of edge infrastructure. (Photo: SRP)

Edge computing is not a single technology, but a phrase that describes several layers of infrastructure, some of which are refinements of existing models. Edge infrastructure is a response to new technologies – such as autonomous vehicles and distributed AI applications – that require low latency and close proximity to users. These technologies are shaping the future of Internet infrastructure.

For edge computing to be deployed everywhere at scale, it will need to be more compact and less expensive, and it will need to take the network to new places. The key components of edge computing include mobile devices, wireless networks, telecom towers, small cells and distributed antenna systems (DAS), as well as data centers and cloud platforms. It’s a universe that transcends traditional business models built on specialization.

For many, this will be a bit of a shift, but one that could be beneficial, as the edge isn’t like a traditional data center. These are agile solutions capable of scaling with demand, providing more flexibility for modern businesses.

For the latest and most in-depth coverage of edge computing, and for more on how the edge is impacting the colocation and data center industry, see below for Data Center Frontier’s latest coverage, including articles and our special report series.

About the Author

Bill Kleyman

Bill Kleyman is a veteran, enthusiastic technologist with experience in data center design, management and deployment. Bill is currently a freelance analyst, speaker, and author for some of our industry's leading publications.
