How AWS Cloud Customers Are Using Local Zones for Edge Computing

Sept. 4, 2020
Amazon Web Services is adding a second Local Zone edge node in Los Angeles, providing single-digit millisecond latency to customers. AWS says users are tapping Local Zones to run hybrid environments and support latency-sensitive tasks like game rendering.

For the world’s largest cloud computing players, the edge is taking shape around big cities and specific use cases. As it extends its Internet architecture beyond its massive server farms, Amazon Web Services is focused on Los Angeles and its concentration of film and gaming companies.

AWS is now adding a second Local Zone in Los Angeles, providing single-digit millisecond latency to AWS customers. Last year LA was the site of the first AWS Local Zone, which features Amazon’s cloud computing infrastructure deployed at local colocation centers, rather than a huge Amazon data center campus. As it doubles down in LA, AWS is providing some insights into how its customers are using Local Zones.

Amazon operates a massive global network of data centers to power its cloud computing platform, with most of its capacity focused on clusters of large campuses in key network hubs like Northern Virginia. With Local Zones, AWS is creating a more distributed infrastructure to support edge computing and low-latency applications.

The building block for Amazon’s edge ambitions is AWS Outposts, which are racks filled with turn-key AWS cloud infrastructure. Outposts were introduced to allow enterprises to deploy hybrid clouds in their on-premises data centers, but will also drive Amazon’s push into edge computing in Local Zones.

It’s not an accident that Los Angeles is the first market for the AWS edge rollout. The AWS Local Zone in Los Angeles targets the market for computer animation and rendering for games and movies. These applications require fast connections between data storage and compute, and the Local Zone allows developers to shift capacity off-premises to AWS and retain low-latency access.

The Los Angeles Local Zones can also be used to transform TV and film production, changing the way huge video files are managed and shared during filming and editing.

How AWS Customers Are Using Local Zones

“You can now design your applications to run in both Local Zones in LA to achieve high availability and even greater fault-tolerance,” AWS Senior Technical Evangelist Steve Roberts said in a blog post. “Both Local Zones in LA are interconnected with high-bandwidth, low-latency networking, over fully redundant, dedicated metro fiber providing high-throughput, low-latency networking between them.”
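To run in both LA Local Zones as Roberts describes, you extend a VPC in the parent us-west-2 region with one subnet pinned to each zone. A minimal sketch of the parameters involved is below; the VPC ID and CIDR blocks are illustrative placeholders, and each dict would be passed to boto3's `ec2.create_subnet(**params)` after opting in to the LA zone group.

```python
# Sketch: spreading a VPC across both LA Local Zones for fault tolerance.
# The VPC ID and CIDR blocks are illustrative; in practice each dict is
# passed to boto3's ec2.create_subnet(**params) once the "us-west-2-lax-1"
# zone group has been opted into for the account.

def subnet_params(vpc_id, zone, cidr):
    """Parameters for an EC2 CreateSubnet call pinned to one zone."""
    return {"VpcId": vpc_id, "AvailabilityZone": zone, "CidrBlock": cidr}

# The two Los Angeles Local Zones attached to the us-west-2 region.
LA_ZONES = ["us-west-2-lax-1a", "us-west-2-lax-1b"]

subnets = [
    subnet_params("vpc-0123", zone, f"10.0.{i}.0/24")
    for i, zone in enumerate(LA_ZONES)
]
```

With a subnet in each zone, instances (or an auto scaling group) can be spread across both, giving the failover behavior Roberts describes while keeping all traffic on the metro fiber between the two sites.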

An Amazon Web Services Outposts rack, which will serve as the building block for extending the AWS cloud to provide edge computing at on-premises data centers, colocation facilities and telco networks. (Image: Amazon Web Services)

Roberts said customers who combine Local Zones with AWS Direct Connect are achieving sub-1.5ms latency between their AWS infrastructure and applications in on-premises data centers in the LA metro area.

“These enterprises have workloads that currently run in existing on-premises data centers in the Los Angeles metro area and it can be daunting to migrate these application portfolios, many of them interdependent, to the cloud,” said Roberts, who added that Local Zones allow these customers to “establish a hybrid environment that provides ultra-low latency communication between applications running in the Los Angeles Local Zone and the on-premises installations without needing a potentially expensive revamp of their architecture. As time progresses, the on-premises applications can be migrated to the cloud in an incremental fashion, simplifying the overall migration process.”
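Latency figures like the sub-1.5ms number above are straightforward to sanity-check from the on-premises side. A minimal sketch, assuming you have some endpoint already listening in the Local Zone (the host and port below are hypothetical), times TCP connection setup and reports the median:

```python
import socket
import time

def tcp_rtt_ms(host, port, samples=5):
    """Median TCP connect time to (host, port), in milliseconds.

    Connection setup takes one network round trip, so this approximates
    the round-trip latency between this machine and the endpoint.
    """
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass  # connection established; close immediately
        times.append((time.perf_counter() - start) * 1000)
    times.sort()
    return times[len(times) // 2]

# Example (hypothetical private endpoint in the LA Local Zone):
# print(f"{tcp_rtt_ms('10.0.0.10', 443):.2f} ms")
```

Comparing the result against the same probe aimed at an endpoint in the parent us-west-2 region (hundreds of miles away in Oregon) makes the Local Zone's metro-distance advantage concrete.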

Another use case for the AWS edge is virtual desktops for rendering and animation workloads. “For these workloads, latency is critical and the addition of a second Local Zone for Los Angeles gives additional failover capability, without sacrificing latency, should the need arise,” Roberts writes.

Digital Delivery as an LA-Centric Opportunity?

AWS doesn’t offer details on its Local Zone customers. But it’s no secret that Netflix is a major AWS customer.

“Netflix uses Amazon Web Services (AWS) for nearly all its computing and storage needs, including databases, analytics, recommendation engines, video transcoding, and more—hundreds of functions that in total use more than 100,000 server instances,” AWS says in a case study.

We recently reported on Netflix’s interest in using edge computing to bring new efficiencies to TV and film production, changing the way huge video files are managed and shared. Like many production companies, Netflix often winds up using courier services to move digital files, including “dailies” – the unedited raw footage shot during the making of a motion picture or TV series. The video is shot in high resolution, creating large files that are difficult to move across a network. Footage is often transported on storage tape by specialized courier services or even FedEx, with a significant volume of that activity focused on Los Angeles and the Hollywood entertainment industry.

Edge computing can perform “data thinning” to distill large datasets down to smaller files to be sent across the network for review. In TV and film production, that means transcoding, which converts large files into a format more suitable for digital transport. Edge computing can bring processing power onto remote sets, allowing transcoding to take place on location.

“We are shooting content at 8K, and that’s about 12.5 gigabytes a minute compressed,” said Dave Temkin, Vice President of Networks at Netflix, in a January presentation at PTC. “The ability to transcode something quickly on location, to go from 8K down to a daily – which doesn’t even need to be HD – that you can get off the set or off location very quickly, that’s really important to us.”
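Taken at face value, Temkin’s 12.5 GB/min figure makes the case for on-set data thinning. A back-of-the-envelope sketch, assuming an illustrative 1 Gbps uplink from the set and a 10 Mbps bitrate for an HD proxy daily (both numbers are assumptions, not from Netflix):

```python
# Back-of-the-envelope sizes based on the 12.5 GB/min figure quoted above.
# The 1 Gbps uplink and 10 Mbps proxy bitrate are illustrative assumptions.

GB = 10**9  # decimal gigabyte

def transfer_seconds(size_bytes, link_bps):
    """Time to move size_bytes over a link of link_bps bits per second."""
    return size_bytes * 8 / link_bps

raw_per_hour = 12.5 * GB * 60      # 750 GB of compressed 8K per hour shot
proxy_per_hour = 10e6 / 8 * 3600   # ~4.5 GB at a 10 Mbps HD proxy bitrate

link = 1e9  # assumed 1 Gbps uplink from the set
print(f"raw 8K:   {raw_per_hour / GB:.0f} GB -> "
      f"{transfer_seconds(raw_per_hour, link) / 60:.0f} min to upload")
print(f"HD proxy: {proxy_per_hour / GB:.1f} GB -> "
      f"{transfer_seconds(proxy_per_hour, link):.0f} s to upload")
```

Under these assumptions an hour of raw 8K takes longer to upload than it took to shoot, while the transcoded proxy moves in under a minute – which is why pushing the transcode to compute on or near the set, rather than shipping tapes, is attractive.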

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
