Layers and Location of Edge Computing

Aug. 19, 2020
As edge computing takes shape, we are approaching the brink of a new phase for global Internet infrastructure. Edge computing is evolving in tiers, with opportunities in regional data hubs, cities, telecom towers and on devices. Think of it in terms of “edge, edgier and edgiest.” 

The next entry in a new special report from Data Center Frontier explores how edge computing is taking shape, focusing on where connectivity and coverage are growing at a rapid rate.

Get the full report.

Key trends include the densification of large cities and improved connectivity and coverage in second-tier markets. Says one C-suite executive: “What we’re actually doing is building data centers at aggregation points. We look at locations based on population density, access to cell towers and cable head-ends.”

Here’s a look at the layers of infrastructure and their chief characteristics:

Micro Data Centers in Major Cities

Supporting 5G networks and other edge technologies will require denser wireless infrastructure. Early movers in edge computing are looking to deploy small clusters of modular data centers in primary data center markets like Chicago, Atlanta and Dallas, and connect to colocation facilities in the major telco hubs (carrier hotels) in these markets.

Data Centers in Regional Markets 

A growing number of data center providers have targeted “second-tier” cities that have active business communities but are outside the “Big Six” primary data center markets (Northern Virginia, Silicon Valley, Greater New York/New Jersey, Dallas, Chicago and Phoenix).

Micro Data Centers at Telecom Towers 

Many edge computing models see tower sites as key points to connect end-user devices to the core network. Some edge strategies include the deployment of data storage and compute capacity at tower sites, which will require enclosures that reside at the base of the tower.

On-Site IT Enclosures and Appliances 

The edge network will need to extend to office campuses, factories, warehouses, hospitals and logistics centers to support data collection from IoT devices and sensors. Some analysts refer to this as the “fog” layer. These installations will likely feature IT cabinets or server appliances, and this is a key area of focus for cloud computing providers.

Street Furniture and Lighting 

You may not think of street lights as edge computing infrastructure. But “smart street lights” are a key enabler of Smart Cities strategies, and can include built-in wireless connectivity (Bluetooth and Wi-Fi), high-definition digital cameras, and sensors to monitor weather and air quality. Most smart street lights come with a control network that can connect a city-wide array of sensors and analytics packages. 5G will also boost demand for low-power wireless infrastructure such as small cells and distributed antenna systems (DAS), which can be mounted on utility poles, buildings and street furniture.

End-User Devices

This includes everything from smartphones to smart speakers (“Alexa, tell me a joke”) to drones and autonomous cars. Some of these devices will have the on-board horsepower to run AI and other compute-intensive applications, while others will operate as terminals that send data to the core or cloud.

Read more: Edge Computing and Why It Matters: In 2020 and Beyond.

A key concept is to use edge infrastructure to drive a fundamental shift in Internet architecture, moving network interconnection points – the key intersections that allow data to move between networks – from the core of the Internet to its outer perimeter.

Interconnection has historically been concentrated in the largest carrier hotel buildings in major business markets. This can force data traffic onto longer routes, creating latency and performance issues that will grow as workloads move further from central hubs. A distributed network of edge interconnection centers would transform the performance of the Internet, creating local connections that dramatically reduce latency.

In practice, this means data will be processed at the edge and won’t have to travel all the way to the core of the network. Since network capacity is expensive, a distributed network will need to prioritize the type and value of data.

As more data is generated by edge devices and applications, distributed edge computing capacity will also perform “data thinning” – running an initial round of analysis before sending business-critical datasets across the network.
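
As a rough illustration of what this data thinning might look like in practice, here is a minimal Python sketch in which an edge node reduces a window of raw sensor readings to a compact summary plus any out-of-range values before anything crosses the network. The field names, the temperature threshold and the forward() stub are illustrative assumptions, not details drawn from the report.

# A minimal sketch of edge-side "data thinning" (assumed example):
# aggregate raw readings locally, forward only a summary plus anomalies.

from statistics import mean
from typing import Dict, Iterable, List

TEMP_LIMIT_C = 75.0  # hypothetical alert threshold for a temperature sensor

def thin(readings: Iterable[float]) -> Dict[str, object]:
    """Reduce a window of raw readings to a summary plus business-critical outliers."""
    values: List[float] = list(readings)
    outliers = [v for v in values if v > TEMP_LIMIT_C]
    return {
        "count": len(values),
        "mean": round(mean(values), 2),
        "max": max(values),
        "outliers": outliers,  # only the out-of-range points travel on
    }

def forward(payload: Dict[str, object]) -> None:
    """Stand-in for sending the thinned payload to a core or cloud endpoint."""
    print("forwarding:", payload)

if __name__ == "__main__":
    window = [68.2, 69.0, 70.4, 77.9, 68.8, 69.3]  # one window of raw samples
    forward(thin(window))  # a handful of values instead of the full stream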

Catch up on the first article in the series here.

In the coming weeks, this special report series will also explore the following topics:

  • Edge Computing Business Cases
  • Three Considerations when Deploying Edge Computing

Download the full report, “Edge Computing: A New Architecture for a Hyperconnected World,” courtesy of Chatsworth Products, which explores the possibilities of the edge data center and how edge computing is changing today’s colocation and data landscape.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
