
Data Center Frontier

Charting the future of data centers and cloud computing.


From Data Center to Cloud and Back: Cloud Repatriation & the Edge

By Doug Mohney - November 24, 2021


The network meet-me room inside the NTT Global Data Centers Americas facility in Hillsboro, Oregon. (Photo: NTT)


Last week we launched a special report series on the hybrid cloud. This week, we’ll look at how cloud repatriation and edge computing are impacting enterprise infrastructure needs and requirements.


Over decades of IT practice, the pendulum has swung many times between centralized and distributed resources. Lower computing costs shifted workloads from a centralized approach to more distributed resources, while the availability of increased bandwidth enabled a more distributed workforce with centralized resources for software licensing, data security, and today's work-from-home/work-from-anywhere (WFH/WFA) world.

Enterprises are now faced with two different trends impacting infrastructure needs and requirements. Cloud repatriation, bringing back workloads from public clouds into enterprise-controlled resources, is a growing issue as organizations assess underlying issues of cost, security, availability, and in-house skills. Edge computing continues to be an evolving concept with the basic principle of moving compute power closer to where the data is for faster responsiveness.

Cloud Repatriation: Bringing Workloads Home

While “Cloud First” was a good mission slogan, many companies that have reviewed their public cloud usage have seen IT costs increase, performance drop, and/or compliance issues arise. CIOs and IT leaders are examining public cloud investments to see if they have delivered lower costs and a solid return on investment, especially when evaluating surge purchases of cloud services made in response to the COVID-19 pandemic.

In 2019, IDC predicted up to 50 percent of public cloud workloads would be moved to on-premises infrastructure or private cloud due to security, performance, and cost reasons, while a 451 Research report stated 20 percent of companies it surveyed said cost drove them to move workloads from public clouds to private ones. Business applications moved to a public cloud environment may not perform at scale, especially when the economics of storage requirements come into play.

Dropbox is a prominent example of an early cloud repatriation. The company started in the cloud using Amazon Web Services and shifted to its own data centers in 2015, relocating nearly 600 petabytes of its customer data to its own network of data centers. Today the company has precision control of its IT costs as it continues to grow, taking advantage of the latest technologies to cost-effectively increase storage density at a pace it controls.

Repatriation can deliver substantial benefits, providing an IT-managed infrastructure stack that scales with more predictable costs. Infrastructure can be optimized for performance based on workload, application, and user needs. In addition, workloads may need to be moved back to private clouds to meet security and compliance requirements.

High-end colocation provides a “clean slate” approach to repatriated workloads. Existing data center operations can continue without the potential for disruption while colocation provides the ability to scale physical infrastructure on an as-needed basis, especially if hyperscaling is anticipated.

Edge Computing: Different Definitions, Different Requirements

Today’s IT environment includes more tools than ever before, with edge computing the latest option to improve performance. Essential compute functions requiring rapid processing are performed closer to the end-user, typically in cases where results are time-critical, where milliseconds of delay may affect user experience or processing, or where a large volume of data contains mostly non-essential information, so it makes more sense to filter at the source and send along only what’s needed than to consume bandwidth transmitting everything.

Whatever the scenario, edge computing today is a tailored solution for a specific requirement. Edge doesn’t replace existing cloud and data center resources but off-loads them, rapidly processing data close to the source with filtered and analyzed information flowing back to the cloud and data center for storage, backup, and analysis in aggregate. Updates and management of edge resources will always take place through IT and the data center, since the data center, clouds, and edge complement each other.
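The filter-then-forward pattern described above can be sketched in a few lines. This is a hypothetical illustration, not from the report: the function name, the anomaly threshold, and the payload fields are all invented for the example. The idea is that a window of raw readings is reduced at the edge to a compact summary, and only that summary (plus any time-critical anomalies) flows back to the cloud or data center for storage and analysis in aggregate.

```python
# Hypothetical sketch of the edge-offload pattern: process raw data
# locally, forward only a compact summary upstream. Names and the
# threshold value are illustrative, not from any real deployment.

from statistics import mean

ANOMALY_THRESHOLD = 90.0  # e.g., temperature in degrees C; illustrative


def edge_summarize(readings):
    """Reduce a window of raw readings to a small payload.

    Keeps only what the central tier needs: summary statistics plus
    any anomalous samples, instead of the full raw stream.
    """
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,  # time-critical samples kept verbatim
    }


# A 1,000-sample window shrinks to a handful of fields before upload.
window = [70.0 + (i % 25) for i in range(1000)]
payload = edge_summarize(window)
print(payload)
```

The bandwidth saving comes from the asymmetry: the edge node sees every sample, but the central tier only ever receives the summary dictionary, however large the raw window grows.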

There’s no single model for how edge computing should be implemented. Cell phone operators may implement edge computing simply by placing a couple of racks of servers in a controlled-environment shelter at the base of a tower. Verizon has teamed with Microsoft Azure to place edge computing services directly at the customer site to enable low-latency, high-bandwidth applications involving computer vision, augmented and virtual reality, and machine learning.

The Microsoft Project Natick module takes edge computing to the bottom of the ocean. (Image: Microsoft)

For greater compute requirements in remote locations, Microsoft has deployed shipping containers full of densely packed racks and demonstrated sealed undersea data centers designed to be “lights out, hands off” resources capable of operating for 5 years or more. Such deployments place computing resources in close physical proximity to users to provide low latency and high responsiveness for workloads in such areas as mining, oil and gas production, and military operations.

An edge computing solution doesn’t necessarily have to locate servers physically at the customer location but can take advantage of high-speed broadband and short distances between data inflow and compute capacity. Lumen, formerly CenturyLink, has invested several hundred million dollars to build over 60 edge compute locations worldwide to provide low latency computing resources for its customers, with fiber providing the connectivity between customers and the closest edge center. The carrier believes the “metro edge compute” design is ideal for supporting smart manufacturing, point-of-sale transactions, video analytics, retail robotics, and various IoT use cases.

Enterprises are utilizing the resources at high-end colocation facilities to create their own edge solutions, leveraging low-latency fiber connectivity through multiple carriers and on-premises meet-me facilities for gigabit-speed Ethernet access to carriers and cloud providers. Combined with bespoke server and storage solutions designed by the enterprise, a colocation facility can deliver an edge computing solution to a metro area or specific geographic region, with the ability to be physically replicated in other areas as needed.

Download the full report, Hybrid Cloud, courtesy of NTT, to learn more about how workloads continue to shift between data center, cloud, and colocation. In our next article, we’ll look at the benefits and limits of data center, cloud, and colocated solutions.


Tagged With: Cloud, Edge Computing, Hyperscale data center, NTT


About Doug Mohney

Doug Mohney has been working in and writing about the IT and satellite industries for over 20 years. His real-world experience includes stints at two start-ups: a commercial internet service provider that went public in 1997 for $150 million, and a satellite internet broadband company that didn't. Follow Doug on Twitter at @DougonIPComm
