Edge Computing Trends for 2020 & the Next Decade

Jan. 6, 2020
Phillip Marangella, CMO for EdgeConneX, explores how edge data centers can help us rearchitect the internet in a way that will support the flood of data and massive traffic flows generated by emerging technologies like AI, cloud gaming, VR/AR and multi-cloud deployments, and more. 

Phillip Marangella, Chief Marketing Officer, EdgeConneX

The internet was not constructed to handle the traffic flows of today, and it’s only going to get more congested in the coming months and years. Traditionally, traffic flows on the internet have largely been download-centric, and networks have been built out to support those flows. However, the gravity of data and compute has shifted from the core to the edge as a result of technologies like the Internet of Everything (IoE), artificial intelligence and machine learning, cloud gaming, HD streaming and virtual reality. Far more content and data is now being created, stored and processed at the edge. This dynamic is creating huge network and traffic bottlenecks.

As a result, the future of our digital economy depends on edge data centers, which alleviate these bottlenecks by enabling multi-directional traffic flows through peering and smart routing of traffic at the edge. This, in turn, will deliver successful outcomes to businesses and end-users. What are these successful outcomes, you might ask?

In this article, we answer this question by examining the top five trends through which we believe edge data centers will help determine our online future for years to come.

1. The Need for Even More Speed

Current network architectures were not built for the localized compute and distributed insights on which current and future connected devices and their latency-sensitive applications will depend. As Chetan Sharma Consulting recently noted in its report, Edge Internet Economy: The Multi-Trillion Dollar Ecosystem Opportunity, “in order to serve the data, computing and communications demand of objects, sensors and people, resources, compute and intelligence has to move to the edge.” Although core networks and centralized cloud architectures already support the use of smart devices to an extent, they will be unable to handle the vast amounts of data created at the edge by the more than 75 billion connected devices predicted to be active by 2025. After all, smart devices and other real-time, cloud-based collaboration and communication applications frequently require low latencies that traditional core network architectures and legacy data centers cannot provide.
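
To see why distance alone imposes a latency floor, consider a rough back-of-the-envelope calculation; the distances and the two-thirds-of-light-speed fiber propagation figure are illustrative assumptions, not measurements from any specific network:

```python
# Rough estimate of round-trip propagation delay over fiber.
# Light in fiber travels at roughly two-thirds of its speed in a vacuum.
FIBER_KM_PER_MS = 300_000 / 1_000 * (2 / 3)  # about 200 km per millisecond

def round_trip_ms(distance_km):
    """Theoretical round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

# Illustrative distances: a distant core data center vs. a local edge facility.
for label, km in [("Core data center, ~1,500 km away", 1500),
                  ("Edge data center, ~50 km away", 50)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip (propagation only)")
```

Real-world latency adds routing, queuing and serialization delays on top of this floor, but the distance-driven component is the part that proximity removes entirely.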

Highly localized and proximate edge data centers thus alleviate network bottlenecks, reducing latency and improving performance. They do this by enabling peering at the edge and acting as local gateways for high-speed connections not only to the core, but also to other edges. In this way, the edge and the core work together, allowing companies to optimize traffic flows and choose for themselves where data computing should occur according to latency, cost and performance requirements. The edge and the core are interoperable: together, they satisfy enterprise needs for the high-speed connectivity that latency-sensitive applications demand. The intersecting capabilities of the edge and the core will combine to help rearchitect the internet in favor of enterprises now and into the future.
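
As a purely illustrative sketch of that placement decision, here is one way such a policy could be expressed; the site names, latencies and cost figures are hypothetical and not drawn from any EdgeConneX service:

```python
# Hypothetical policy: pick the cheapest location that meets a latency budget.
SITES = [
    {"name": "core-region", "round_trip_ms": 45.0, "cost_per_cpu_hour": 0.04},
    {"name": "edge-metro",  "round_trip_ms": 6.0,  "cost_per_cpu_hour": 0.06},
]

def place_workload(sites, max_latency_ms):
    """Return the cheapest site whose measured latency fits the budget."""
    eligible = [s for s in sites if s["round_trip_ms"] <= max_latency_ms]
    if not eligible:
        raise ValueError("No site satisfies the latency requirement")
    return min(eligible, key=lambda s: s["cost_per_cpu_hour"])

print(place_workload(SITES, max_latency_ms=20)["name"])   # -> edge-metro
print(place_workload(SITES, max_latency_ms=100)["name"])  # -> core-region
```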

2. The Cloud Grows at the Edge

It’s no secret that a growing number of businesses are interested in using multiple clouds for different workloads – decisions based not only on cost, but also on the capabilities and complementary services of each cloud provider. Yet with traditional network architectures, where enterprises must come to the cloud, these benefits are often muted. To get to the cloud, businesses typically must pay for network transit to Tier III data centers in Tier 1 and 2 markets or directly to cloud provider server farms potentially hundreds of miles away. Worse still, when attempting to connect themselves directly to the cloud, most businesses run into failed or delayed migrations, cost overruns and vendor lock-in due to their inherent lack of in-house cloud expertise. Edge data centers can help enterprises overcome these challenges by partnering with cloud enablers, like Rackspace for example, to make it faster and more economical to connect to the cloud locally.

For many businesses, being able to leverage hybrid and multi-cloud strategies is a key goal. We examine how to do this right in our latest EdgeBook: Empower Your Edge – For Cloud. In this regard, hyperconnected edge data centers are critical in bringing a company’s multi-cloud strategy to fruition in an affordable manner. By enabling high-speed, low-latency, local onramps to cloud providers and enablers, they bring the cloud to the customer. In turn, this proximate cloud enables the use of computationally intensive applications at the edge on all types of devices. At EdgeConneX, we recognize that the edge is where the customer needs it to be and is not necessarily bound to a specific location. As a result, it only makes sense that access to the cloud should also be brought locally to end-users. You can read how we have done this in Detroit, enabling the expansion of the Sprint Curiosity™ IoT platform into our data center through our partnership with Packet, a bare metal cloud service provider.

3. Gaming is Cloudified

Internet gaming is already a favorite pastime for individuals and gaming communities around the world. Cisco recently predicted that, by 2022, internet gaming traffic will have grown nine-fold since 2017. Put another way, internet gaming will account for 4% of the world’s total IP traffic by 2022. Now, however, from Microsoft’s Project xCloud and Google’s Stadia to NVIDIA’s GeForce NOW, the world is quickly embracing cloud gaming. In fact, the industry is projected to show 10X growth over the next few years.
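
For context, a nine-fold increase over the five years from 2017 to 2022 implies a compound annual growth rate of roughly 55 percent, as a quick calculation shows; the multiples below simply restate the forecasts cited in this article and are not new data:

```python
# Implied compound annual growth rate (CAGR) from a growth multiple over N years.
def cagr(multiple, years):
    return multiple ** (1 / years) - 1

print(f"Gaming traffic, 9x from 2017 to 2022: ~{cagr(9, 5):.0%} per year")
print(f"VR/AR traffic, 12x from 2017 to 2022: ~{cagr(12, 5):.0%} per year")
```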

What is cloud gaming? It is best to conceive of it as another move away from the download-centric paradigm legacy network architectures were built to support. Put simply, instead of downloading all of a game’s files onto their devices, end-users send their inputs to a game running on remote servers in a data center and receive a rendered audio and video stream in return.
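
As a rough illustration of that shift, compare a one-time download with continuous streaming; the game size and stream bitrate here are assumptions made for the sake of the arithmetic, not figures from any particular service:

```python
# Compare a one-time game download with continuous cloud-gaming streaming.
GAME_SIZE_GB = 80    # assumed install size of a modern AAA title
STREAM_MBPS = 20     # assumed bitrate of a 1080p/60fps cloud-gaming stream

def stream_gb_per_hour(mbps):
    """Convert a bitrate in megabits per second into gigabytes per hour."""
    return mbps * 3600 / 8 / 1000

gb_per_hour = stream_gb_per_hour(STREAM_MBPS)
print(f"Streaming transfers ~{gb_per_hour:.0f} GB per hour of play")
print(f"~{GAME_SIZE_GB / gb_per_hour:.0f} hours of play equals one full download")
```

The point is not that streaming necessarily moves more total data; it is that the traffic becomes continuous, interactive and highly latency-sensitive, which is exactly the flow pattern edge facilities are positioned to serve.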

That said, all games need to provide gamers with high-quality experiences. For cloud gaming, proximity and connectivity are key. On one hand, to get their games to consumers, game developers need access to native and virtual cloud onramps, global CDNs, the largest MSOs, global IP providers, dark fiber and lit network service providers. On the other hand, they also need all of this within data centers that are proximate to the gamers they serve. An edge data center provider with a widespread, distributed footprint can meet both needs. In other words, edge facilities can provide the low-latency experiences gamers need while fulfilling the surging connectivity demands of developers.

4. Artificial Intelligence & Machine Learning Go Mainstream

If cloud gaming emphasizes connectivity, rising corporate investments in Artificial Intelligence (AI) and Machine Learning really shine a light on a growing global need for high power density data center solutions. Legacy data centers were built for low-density applications that required in the range of 5 to 10 kW of power per rack. By contrast, high performance computing (HPC) applications involving AI may require much higher power densities of 30 to 35 kW per rack.
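
To make the difference concrete, consider how many racks a fixed power budget supports at each density; the 1 MW budget is an arbitrary illustrative figure and the densities are the midpoints of the ranges above:

```python
# How many racks does a fixed IT power budget support at different densities?
POWER_BUDGET_KW = 1_000  # illustrative 1 MW of IT power

for label, kw_per_rack in [("Legacy, low-density", 7.5),
                           ("AI/HPC, high-density", 32.5)]:
    racks = POWER_BUDGET_KW / kw_per_rack
    print(f"{label} ({kw_per_rack} kW/rack): ~{racks:.0f} racks")
```

The same power envelope collapses from roughly 130 racks to about 30, which is why facilities engineered for high-density power and cooling can fit AI workloads into far smaller physical footprints.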

Edge data centers are a perfect vehicle for supporting AI/ML workloads because they enable high levels of computing power on smaller physical footprints. Thanks to technological advances in computing equipment, businesses can use high-density deployments to leverage new, more energy-efficient processors and servers that individually require less power than previous models. In this way, they can do more in less space. With AI and Deep Learning applications only gaining importance across more industries and verticals, business demand for the high-density power available in proximate edge data centers will only grow.

5. The Streaming Wars Heat Up

If there is one trend that will continue to have a significant impact on the future of edge computing, it is streaming media. Our internet first began to change dramatically when vloggers started uploading content to YouTube in 2005 and Netflix started offering streaming services in 2007. For years now, content providers have been leveraging content distribution networks (CDNs) to cache data reasonably close to end-users, thereby improving their access to a host of online content from streaming media to downloadable objects and social media sites. Now, thanks to consumer demand for everything from 4K to Virtual/Augmented Reality (VR/AR) content, content providers need to further optimize delivery and distribution of content in order to stay competitive.

Globally, Cisco forecasts that by 2022, video traffic will account for 82% of all business and consumer IP traffic. Simultaneously, global VR/AR traffic will grow twelve-fold between 2017 and 2022. With newer platforms like HBO Max and Disney+ coming into their own, surging consumer demand for content means that it is once again time to rearchitect the internet to mitigate current bottlenecks. This is where edge data centers have a vital role, creating a type of enhanced brick-and-mortar CDN. By pushing both the content and the intelligence behind streaming services closer to consumers, edge computing and data centers will permit content creators to deliver their products faster and more reliably than ever before, opening up new application possibilities in the process. The edge can help satisfy our need to access content whenever and wherever we want it – at home, in the office or on the go.
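
As a simple illustration of why caching at the edge matters, consider one metro’s worth of streaming demand; the viewer count, bitrate and cache hit ratio below are hypothetical:

```python
# Estimate how much long-haul traffic an edge cache keeps off the backbone.
VIEWERS = 100_000        # concurrent streams in one metro (hypothetical)
BITRATE_MBPS = 15        # assumed 4K-capable stream bitrate
CACHE_HIT_RATIO = 0.90   # share of requests served from the local edge cache

total_gbps = VIEWERS * BITRATE_MBPS / 1_000
backbone_gbps = total_gbps * (1 - CACHE_HIT_RATIO)
print(f"Total traffic delivered to viewers: ~{total_gbps:,.0f} Gbps")
print(f"Traffic still traversing the backbone: ~{backbone_gbps:,.0f} Gbps")
```

Serving 90 percent of that demand from local caches keeps all but a fraction of the traffic off long-haul routes, which is the “enhanced brick-and-mortar CDN” idea in practice.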

Our Key Takeaway

In short, our future depends on edge data centers. In many ways, the edge is necessary for the IoE, where intelligent devices and a myriad of sensors require networks without bottlenecks for high-speed compute and communications activities. However, without edge data centers, achieving this leap will prove difficult. Under the legacy approach of centralized data centers and core networks, too many challenges around capacity, latency and cost remain. Edge data centers can solve these issues by moving computing and data storage closer to the end-user, thus enabling higher capacity, lower latency and reduced expenditures.

In other words, edge data centers can help us rearchitect the internet in a way that will support the flood of data and massive traffic flows generated by emerging technologies like AI, cloud gaming, VR/AR and multi-cloud deployments. It isn’t a stretch to say that edge facilities, and the networks that they connect, are as vital to the future of our digital economy as roads, railways and factories were to previous generations. For enterprises to fully tap the potential of the IoE, it is safe to say: “The world needs Edge Data Centers.”

Phillip Marangella is CMO for EdgeConneX.

