
How to Leverage True Edge Flexibility and Overcome Operational Challenges

By Bill Kleyman - November 5, 2018

Autonomous vehicles will impact the edge. (Source: Ford Motor Co.)

This is the third entry in a four-part Data Center Frontier special report series that explores edge computing from a data center perspective. This post covers the potential and benefits of true edge flexibility and how to tackle the challenges involved. 

Download the full report.

Many organizations are now looking for better ways to deliver rich content to users who are widely distributed. We’re seeing even more companies push applications, desktops, and various services out to rural locations. A big challenge here revolves around performance and user experience. After all, just because an application can be delivered doesn’t mean it’s being delivered efficiently.

This isn’t limited to rural locations. Edge can also live in urban environments where network resources are constrained or slow (i.e., where the number of peering hops is too high), such as content providers in Boston needing to peer into New York City to reach a major hop point.

Either way, this has been the ultimate challenge for cloud computing. As more data and services hit traditional cloud systems, real inefficiencies emerged in trying to stream and work with all of this data from a centralized cloud ecosystem. Organizations needed a better way to process this data.

As more users connect to the cloud and request content that is heavy in richness and size, utilizing the edge for fast delivery makes complete sense. Gartner recently stated that emerging technologies require revolutionizing the enabling foundations that provide the volume of data needed, advanced compute power, and ubiquity-enabling ecosystems. The shift from compartmentalized technical infrastructure to ecosystem-enabling platforms is laying the foundations for entirely new business models that form the bridge between humans and technology.

“When we view these themes together, we can see how the human-centric enabling technologies within transparently immersive experiences — such as smart workspace, connected home, augmented reality, virtual reality and the growing brain-computer interface — are becoming the edge technologies that are pulling the other trends along the Hype Cycle,” said Mike J. Walker, research director at Gartner.

Edge flexibility, design, and overcoming challenges

The entire concept of edge is to serve users and services based on proximity. Because of that, edge computing offers tremendous benefits in how you deploy solutions and manage data.

Today, you can deliver modular edge data center infrastructure solutions that provide standardized deployment options. This gives you the flexibility and capability to meet the demands of compute today and beyond.

From the customer’s perspective, edge computing can be any service or architecture that helps you simplify and localize the delivery of applications, data sets, and services. These services help you gain more control over your WAN, your bandwidth requirements, and how rich content is delivered. The future absolutely looks to be more interconnected, with users more widely distributed. And with the influx of new data, edge will be even more important.

This means that edge design is flexible and caters specifically to high-performance and latency-sensitive applications. The really cool part here is that you can control how data flows throughout your entire edge ecosystem, secure the processing of that data, and still positively impact the user experience.

That said, deploying edge can have its challenges. Remember, edge solutions aren’t just ‘another data center site.’ They’re smaller, use-case specific, and designed as dense environments that help you process more services and user data. With that, there are three challenges to be aware of when working with edge design:

  1. Use-case definition: This is actually a major stopping point for edge projects. There may be a great idea or concept, but defining the use case hits a barrier. This usually happens when there’s misalignment between IT, business requirements, and management. In these situations, it’s important to take a step back and look at the long-term strategy of your own organization. Are you growing? Will you be supporting remote users? Are you trying to deliver new types of connected services? If you see that edge is a fit, take the next steps to write up a good business plan and technology strategy to support it. You don’t have to be an edge expert to clearly define your own use case, and there are great providers who can help you on this journey. However, it’s important to align infrastructure and business to ensure that your strategy can take off. From there, it’s key to work with the right people who can bring that vision to life, which brings us to the next point.
  2. Lack of expertise: If you’ve tried to deploy edge solutions in the past but found yourself on a support island, you’re not alone. Until fairly recently, there was a serious lack of expertise when it came to deploying edge solutions. Plus, these aren’t inexpensive projects where you can just ‘wing it.’ So even if an organization was able to define a use case, it might get stuck when it came to finding good partners who could help implement the vision. Again, edge is not like a typical data center. There are different considerations around space, density, power, management, connectivity, redundancy, and much more. This is why working with the right people can make the entire process so much easier. The good news is that today there are great organizations, partners, and data center providers that are ready and able to help with edge solutions. Don’t let this be a stopping point; work with partners that can help you scale and build out your own edge solution.
  3. Concerns around data management: This is a big one, and it adds a key complication to deploying edge. Basically, ‘what happens to my data?’ You’re going to have to take some extra time to define your data requirements and management policies. Is the data transient, or will it be stored at the edge? What is the data that’s being processed? What is the connectivity control method around the data? Again, all of this will need to be defined and integrated into your own edge solution. That said, you can absolutely still build compliance and regulation into an edge architecture. However, you’ll need to take extra precautions to ensure data security and control. Take into consideration the location of the edge, storage systems at the edge, how the data will be processed, and who will have access. The cool part is that software-defined solutions allow you to integrate with core data center systems and support powerful data locality policies (a minimal sketch of such a policy follows this list). This can really impact industries like pharma, healthcare, and other regulated organizations.
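
To make those data-management questions concrete, here is a minimal, purely illustrative sketch of how they might be captured as an explicit policy. The dataset names, fields, and function are hypothetical and not drawn from any specific platform; real software-defined solutions expose their own policy models.

```python
# Hypothetical example only: encoding edge data-management decisions
# (transient vs. stored, locality, encryption) as an explicit policy table.
# All names and fields are made up for illustration.

EDGE_DATA_POLICIES = {
    "sensor-telemetry": {
        "transient": True,              # processed at the edge, never persisted
        "allowed_regions": ["us-east", "us-central"],
        "encryption_at_rest": True,
    },
    "patient-records": {
        "transient": False,             # persisted, so locality rules apply
        "allowed_regions": ["us-east"], # data-locality requirement
        "encryption_at_rest": True,
    },
}

def can_store_at(dataset: str, edge_region: str) -> bool:
    """May this dataset be persisted at an edge site in the given region?"""
    policy = EDGE_DATA_POLICIES.get(dataset)
    if policy is None:
        return False                    # no policy defined: default deny
    return (not policy["transient"]) and edge_region in policy["allowed_regions"]

print(can_store_at("patient-records", "us-central"))  # False: outside allowed region
print(can_store_at("sensor-telemetry", "us-east"))    # False: transient data is not stored
```

The point isn’t the specific format; it’s that answering ‘what happens to my data?’ up front turns into rules you can actually enforce at every edge site.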

Understanding and managing the latency budget

To an end user, latency is the reason that downloading a movie “takes so long,” but to a content provider, the number of milliseconds it takes to complete a function can be measured in customer dissatisfaction and cost.

Furthermore, to a business, latency can also mean the loss of business or a competitive edge.

Even at the speed of light, the round trip to a central data center, such as a facility located in a Tier I market, can mean the accumulation of transmission costs. A study conducted by ACG Research estimated that caching content locally in a metro population can save approximately $110 million over a five-year period. If we were to apply this same logic to a company running an IIoT parts-tracking application, the hard costs of transmission could be measured, but the associated cost of degraded application performance would be incalculable.
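
To see how quickly a latency budget gets spent, consider a back-of-the-envelope calculation using only textbook constants: light in optical fiber covers roughly 200,000 km per second, or about 1 ms of round-trip time per 100 km of path. The path lengths, hop counts, and per-hop delay below are illustrative assumptions, not figures from the ACG study.

```python
# Back-of-the-envelope latency budget. The only physical constant used is the
# speed of light in fiber (~200,000 km/s, i.e. ~1 ms of RTT per 100 km of path);
# path lengths, hop counts, and per-hop delay are illustrative assumptions.

FIBER_KM_PER_MS_RTT = 100   # ~100 km of fiber path adds roughly 1 ms round trip

def round_trip_ms(fiber_km: float, hops: int, per_hop_ms: float = 0.5) -> float:
    """Estimate RTT as fiber propagation plus per-hop switching/queuing delay."""
    return fiber_km / FIBER_KM_PER_MS_RTT + hops * per_hop_ms

central = round_trip_ms(fiber_km=1200, hops=12)  # user -> distant Tier I facility
edge    = round_trip_ms(fiber_km=60,   hops=3)   # same user -> local metro edge site

print(f"central: {central:.1f} ms RTT, edge: {edge:.1f} ms RTT")

# An application making 20 sequential round trips per transaction multiplies
# that gap 20x, which is where the latency budget actually gets spent.
print(f"20 round trips: central {20 * central:.0f} ms vs. edge {20 * edge:.0f} ms")
```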

It’s important to have your data reside closer to your users and to the applications or workloads being accessed.

Consider the supply chain for physical goods as an example, such as Walmart or Amazon Prime Same-Day Delivery. In any supply chain system, as traffic increases, transportation costs go up. As a result, distribution moves closer to end users to decrease transport costs and increase throughput. The same concept applies to edge and data delivery.

Edge computing will take data everywhere, including the floor of the ocean, as is the case for Microsoft’s Project Natick deployment in Scotland. (Photo by Scott Eklund/Red Box Pictures for Microsoft).

With the increase of traffic moving through the edge, there is greater demand for more bandwidth and less latency. As discussed earlier, it’s important to have your data reside closer to your users and to the applications or workloads being accessed. Where data demands may not have fluctuated much in the past, current demands are much different.

  • Bandwidth Bursts. Many providers now offer something known as bandwidth bursts specifically for edge solutions. This allows the administrator to temporarily increase the amount of bandwidth available to the environment based on immediate demand, which is useful for seasonal or highly cyclical industries. There will be periods of business operation when more bandwidth is required to deliver the data. In those cases, look for partners who can dynamically increase that amount and then de-provision those resources when they are no longer being used.
  • Network Testing. Always test your network and the network of the edge provider. Examine their internal speeds and see how your data will behave on that network (a minimal measurement sketch follows this list). This also means taking a look at the various ISP and connectivity providers offered by the colocation provider. Many times, a poor networking infrastructure won’t be able to handle a large organization’s ‘Big Data’ needs despite having a fast Internet connection. Without good QoS and ISP segmentation, some edge data centers can actually become saturated. Look for partners with good, established connections providing guaranteed speeds.
  • Know Your Applications. One of the best ways to gauge edge data requirements is to know and understand the underlying application or workload. Deployment best practices dictate a clear understanding of how an application functions, the resources it requires, and how well it operates on a given platform. By designing around the application’s needs, there is less chance that improper resources are assigned to that workload.
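
As a starting point for that kind of network testing, here is a minimal sketch that times repeated TCP handshakes against an endpoint the provider hosts, which gives a rough round-trip latency figure using nothing beyond the Python standard library. The hostname is a placeholder, not a real provider endpoint; substitute whatever test host your candidate edge provider exposes.

```python
# Minimal latency probe: time repeated TCP handshakes to a candidate edge
# endpoint and summarize the results. Standard library only; the hostname
# below is a placeholder for whatever test host the provider exposes.

import socket
import statistics
import time

def tcp_rtt_ms(host: str, port: int = 443, samples: int = 10) -> dict:
    """Time `samples` TCP connections to host:port and report stats in ms."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass                      # handshake complete once the connection opens
        timings.append((time.perf_counter() - start) * 1000)
    timings.sort()
    return {
        "median_ms": round(statistics.median(timings), 2),
        "p95_ms": round(timings[int(0.95 * (len(timings) - 1))], 2),
        "max_ms": round(timings[-1], 2),
    }

if __name__ == "__main__":
    # Placeholder endpoint for illustration only.
    print(tcp_rtt_ms("edge-site.example.net"))
```

A handshake timer like this only measures latency; throughput, jitter under load, and behavior across each ISP the provider offers still need their own tests.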

There are a lot of benefits and use-cases around edge and connected systems. Take the time to think about your own strategies and whether your current infrastructure is capable of supporting these initiatives.

This Data Center Frontier series, focused on edge computing, will also cover the following topics over the coming weeks:

  • New Ways to Deploy Edge Capacity for Data Center Leaders
  • Understanding the Edge and the World of ‘Connected Devices’
  • Deploying Real-World Edge Solutions: A Lego-Inspired Design Approach

Download the full Data Center Frontier Special Report on Edge Computing, courtesy of BASELAYER. 

Explore the evolving world of edge computing further through Data Center Frontier’s special report series and ongoing coverage.

Tagged With: BASELAYER, Data Center Infrastructure, Data Center Network, edge computing special report

About Bill Kleyman

Bill Kleyman is a veteran, enthusiastic technologist with experience in data center design, management and deployment. Currently, Bill works as the Executive Vice President of Digital Solutions at Switch.
