How AWS Cloud Customers Are Using Local Zones for Edge Computing

Sept. 4, 2020
Amazon Web Services is adding a second Local Zone edge node in Los Angeles, providing single-digit millisecond latency to customers. AWS says users are tapping Local Zones to run hybrid environments and support latency-sensitive tasks like game rendering.

For the world’s largest cloud computing players, the edge is taking shape around big cities and specific use cases. As it extends its Internet architecture beyond its massive server farms, Amazon Web Services is focused on Los Angeles and its concentration of film and gaming companies.

AWS is now adding a second Local Zone in Los Angeles, providing single-digit millisecond latency to AWS customers. Last year LA was the site of the first AWS Local Zone, which features Amazon’s cloud computing infrastructure deployed at local colocation centers, rather than a huge Amazon data center campus. As it doubles down in LA, AWS is providing some insights into how its customers are using Local Zones.

Amazon operates a massive global network of data centers to power its cloud computing platform, with most of its capacity focused on clusters of large campuses in key network hubs like Northern Virginia. With Local Zones, AWS is creating a more distributed infrastructure to support edge computing and low-latency applications.

The building block for Amazon’s edge ambitions is AWS Outposts, racks filled with turn-key AWS cloud infrastructure. Outposts were introduced to allow enterprises to deploy hybrid clouds in their on-premises data centers, but they will also drive Amazon’s push into edge computing in Local Zones.

It’s not an accident that Los Angeles is the first market for the AWS edge rollout. The AWS Local Zone in Los Angeles targets the market for computer animation and rendering for games and movies. These applications require fast connections between data storage and compute, and the Local Zone allows developers to shift capacity off-premises to AWS and retain low-latency access.

The Los Angeles Local Zones can also be used to transform TV and film production, changing the way huge video files are managed and shared during the filming and editing process.

How AWS Customers Are Using Local Zones

“You can now design your applications to run in both Local Zones in LA to achieve high availability and even greater fault-tolerance,” AWS Senior Technical Evangelist Steve Roberts said in a blog post. “Both Local Zones in LA are interconnected with high-bandwidth, low-latency networking, over fully redundant, dedicated metro fiber providing high-throughput, low-latency networking between them.”

An Amazon Web Services Outposts rack, which will serve as the building block for extending the AWS cloud to provide edge computing at on-premises data centers, colocation facilities and telco networks. (Image: Amazon Web Services)

Roberts said customers who combine Local Zones with AWS Direct Connect are achieving sub-1.5 ms latency for communication between their AWS infrastructure and applications in on-premises data centers in the LA metro area.

“These enterprises have workloads that currently run in existing on-premises data centers in the Los Angeles metro area and it can be daunting to migrate these application portfolios, many of them interdependent, to the cloud,” said Roberts, who added that Local Zones allow these customers to “establish a hybrid environment that provides ultra-low latency communication between applications running in the Los Angeles Local Zone and the on-premises installations without needing a potentially expensive revamp of their architecture. As time progresses, the on-premises applications can be migrated to the cloud in an incremental fashion, simplifying the overall migration process.”

Another use case for the AWS edge is virtual desktops for rendering and animation workloads. “For these workloads, latency is critical and the addition of a second Local Zone for Los Angeles gives additional failover capability, without sacrificing latency, should the need arise,” Roberts writes.

Digital Delivery as an LA-Centric Opportunity?

AWS doesn’t offer details on its Local Zone customers. But it’s no secret that Netflix is a major AWS customer.

“Netflix uses Amazon Web Services (AWS) for nearly all its computing and storage needs, including databases, analytics, recommendation engines, video transcoding, and more—hundreds of functions that in total use more than 100,000 server instances,” AWS says in a case study.

We recently reported on Netflix’s interest in using edge computing to bring new efficiencies to TV and film production, changing the way huge video files are managed and shared. Like many production companies, Netflix often winds up using courier services to move digital files, including “dailies” – the unedited raw footage shot during the making of a motion picture or TV series. The video is shot in high resolution, creating large files that are difficult to move across a network. Footage is often transported on storage tape by specialized courier services or even FedEx, with a significant volume of that activity focused on Los Angeles and the Hollywood entertainment industry.

Edge computing can perform “data thinning” to distill large datasets down to smaller files to be sent across the network for review. In TV and film production, that means transcoding, which converts large files into a format more suitable for digital transport. Edge computing can bring processing power onto remote sets, allowing transcoding to take place on location.

“We are shooting content at 8K, and that’s about 12.5 gigabytes a minute compressed,” said Dave Temkin, Vice President of Networks at Netflix, in a January presentation at PTC. “The ability to transcode something quickly on location, to go from 8K down to a daily – which doesn’t even need to be HD – that you can get off the set or off location very quickly, that’s really important to us.”
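Temkin’s 12.5 GB/min figure makes it easy to see why data thinning matters. The back-of-the-envelope sketch below uses that number from the quote; the one-hour shoot length, 1 Gbps site uplink, and ~10 Mbps proxy bitrate are illustrative assumptions, not figures from Netflix.

```python
# Why transcode on location? Compare moving raw 8K footage over the
# network against moving a small "daily" proxy of the same shoot.
# Only the 12.5 GB/min figure comes from the article; the rest is assumed.

GB_PER_MIN_8K = 12.5      # compressed 8K footage (from Temkin's quote)
SHOOT_MINUTES = 60        # assume one hour of raw footage
UPLINK_GBPS = 1.0         # assume a 1 Gbps uplink from the set
DAILY_MBPS = 10           # assumed sub-HD proxy bitrate

raw_gb = GB_PER_MIN_8K * SHOOT_MINUTES                # 750 GB of raw 8K
raw_transfer_hours = raw_gb * 8 / UPLINK_GBPS / 3600  # GB -> gigabits -> hours

daily_gb = DAILY_MBPS / 8 / 1000 * 60 * SHOOT_MINUTES # Mbps over the shoot -> GB
daily_transfer_minutes = daily_gb * 8 / UPLINK_GBPS / 60

print(f"raw 8K:      {raw_gb:.0f} GB, ~{raw_transfer_hours:.1f} h to upload")
print(f"daily proxy: {daily_gb:.1f} GB, ~{daily_transfer_minutes:.1f} min to upload")
```

Under these assumptions the raw footage is roughly 750 GB and takes well over an hour and a half to upload, while the transcoded daily is a few gigabytes and moves in under a minute – the gap that makes on-set edge transcoding attractive.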

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.

