IoT and Latency Issues Will Guide Edge Deployments

June 21, 2018
Our Data Center Executive Roundtable gazes into the future shape of edge computing, examining how IoT and latency issues will guide edge deployments:

Today we conclude our Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry. In today’s discussion, our panel of experienced data center executives – Randy Rowland of Cyxtera, Dana Adams of Iron Mountain, Joel Stone of RagingWire, Samir Shah of BASELAYER, and Eric Ballard of Stream Data Centers – discusses how the rise of the Internet of Things and edge computing will impact data center infrastructure.

The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier.

Data Center Frontier: All our “Things” are getting smarter and more powerful, bringing the computing power of the cloud into devices in our pockets. As more workloads begin to move to the edge of the network, how will it change how – and where – we deploy infrastructure?

Samir Shah, VP of Product Management, BASELAYER

Samir Shah, BASELAYER: As workloads move closer to the edge, data center capacity will be needed anywhere users connect with wireless signal sources. In today’s 4G LTE networks, round-trip latency ranges from 60 to 70 ms. In the post-5G world, the only way to achieve “1G to your device” is to reduce that latency to less than 8 ms. Physical limitations will result in data center resources being colocated at signal sources to eliminate delays caused by peering hops.

Another way to look at this issue is to compare internet transport traffic to a supply chain distribution system (e.g., Amazon one-day shipping). In any supply chain model, as transit time is reduced, the system realizes improved throughput. In the 5G use case, achieving a 1-8 ms transit time increases throughput from the 18-21 Mb/s of a 4G network to as much as 1,288 Mb/s.
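Those figures follow the familiar window-over-RTT relationship for window-limited transfers: the shorter the round trip, the higher the throughput ceiling. As a rough, illustrative sketch (the ~160 KB in-flight window below is an assumed value chosen only so the resulting ceilings land near the numbers cited above; it is not something stated in the roundtable):

```python
# Illustrative only: the throughput ceiling of a window-limited transfer is
# roughly window_size / round_trip_time. The 160 KB window is an assumption
# chosen so the ceilings land near the 4G and 5G figures cited above.

WINDOW_BYTES = 160 * 1024  # assumed data in flight per round trip

def throughput_mbps(rtt_ms: float, window_bytes: int = WINDOW_BYTES) -> float:
    """Approximate throughput ceiling in Mb/s for a given round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000.0) / 1e6

for label, rtt_ms in [("4G, ~65 ms RTT", 65.0),
                      ("5G target, 8 ms RTT", 8.0),
                      ("5G target, 1 ms RTT", 1.0)]:
    print(f"{label}: ~{throughput_mbps(rtt_ms):.0f} Mb/s ceiling")
```

With that assumed window, a 65 ms round trip caps out around 20 Mb/s while a 1 ms round trip allows roughly 1,300 Mb/s, the same inverse relationship behind the supply chain analogy.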

In terms of how these deployments will look, traditional data center building blocks (UPS, batteries, generators, etc.) will not translate “as-is” to edge environments. Manufacturers and technology providers will have to take a hard look at their form factors, features, and flexibility in the early adoption phase of edge deployments.

Eric Ballard, Vice President, Network & Cloud, Stream Data Centers

Eric Ballard, Stream Data Centers: For the last 20 years, the vast majority of data center consumption has been in aggregated markets around the globe, creating major data center hubs like Ashburn, New York, Dallas, Chicago, and Silicon Valley in the USA, and locations like London, Amsterdam, Frankfurt, Singapore, Hong Kong, and Tokyo in other parts of the world. The exceptions were enterprises that built their own data centers in locations that made geographic sense to them, often at or near their headquarters.

The major data center hubs have continued to grow and expand, with no sign of slowing down, but as we create more and more data and content, and as latency requirements evolve based on applications such as autonomous vehicles and augmented reality, this data will need to live closer to consumers and their devices. We have seen a lot of movement in the edge data center market in the past couple of years, and we will continue to see innovation and a reimagining of what the edge is.

Over the next year or two you will see the edge move into smaller cities and towns, but after that it will start to move toward the true edge, which is the neighborhood, the base of cell phone towers, or office buildings. These facilities will be smaller than many of the data centers we know today, and there will be further evolution around automation in these facilities. Today we are used to having a team of data center engineers inside a building, but at the edge you will have a team that takes care of a large number of facilities spread across a larger geography. We as an industry will continue to learn how to remotely manage and maintain facilities, and will continue to innovate and develop new technologies to assist those teams.

Joel Stone, RagingWire

Joel Stone, RagingWire Data Centers: The methods, the locations, and the technologies we use to deploy infrastructure are changing.

For example, the advent of 5G, which focuses on mobile broadband, low-latency communications, and massive machine technologies such as autonomous vehicles, is pushing us to change our paradigm for edge, fog, and cloud computing. Everyone is impacted, from data center providers focused on building a high degree of network and interconnection systems, down to the basic software architectures that will allow for more resilient consumer applications on devices like your car, TV, or home thermostat.

More specifically, edge computing in its most raw form exists in the palm of your hand. True edge computing pushes the computational load down to ubiquitous smartphone and tablet devices, because the edge is where the internet meets the physical world.

This same model holds true for the Internet of Things. Our “Things” are the edge – they are the devices that touch and interact with the physical world. In the data center industry, this may be as simple as a device that measures temperature and humidity changes, then meshes with other devices to decide how to normalize or stabilize the environment. Semi-autonomous networks and systems have been programmatically or procedurally doing this for quite some time. Machine learning makes them more efficient but may require more compute horsepower than our individual “Things” currently possess.

In the data center industry, we’re looking at a fundamental change in the way people design and implement their applications, network, and compute environments. We see a higher degree of focus on resilient, self-healing, and interconnected, low-latency communication out to these edge devices, and we’re seeing a growth in the amount of compute that exists outside our “four-walls.”

At the same time, we’re seeing a shift in the amount of analytical data processing, or “big data” being converted to “small data”… that is, data that may be collected from a large mesh of these edge systems, analyzed and made useful to an individual.

And there’s your paradigm shift: the shift to collecting “big data” from a huge number of endpoint devices and making it hyper-useful to an individual person, or to a single node on the system. Consider the GPS system in your car deciding to re-route you based on traffic data from thousands of other cars and handsets – in real time. Now imagine you are a first responder: how critical is the infrastructure that allows you to receive these updates?

The shift in infrastructure is partially driven by the required resiliency of these systems. Some of that infrastructure will necessarily be installed logically, and in some cases physically, closer to the edge devices and with a higher degree of interconnectedness.

Randy Rowland, President of Data Center Services at Cyxtera

Randy Rowland, Cyxtera: The demand for workloads to sit as close as possible to the edge of the network is increasing capacity demand in core data centers, as well as for storage and data analytics. What has changed may be where the edge is now – with users on their devices – but data center resources are still needed to handle data wherever it is generated. And in many cases this data still needs to be routed back to core facilities.

CDN, content, and private cloud providers are coping with the shift to edge computing by deploying instances in more data centers across a broader geography, placing services closer to end users. This helps reduce latency and long-haul network traffic.

Dana Adams, Vice President and GM of Data Centers, Iron Mountain

Dana Adams, Iron Mountain: We are very excited about the edge data center market and have spent a lot of time really trying to understand where this space is headed. While demand for cloud and core data centers will continue to grow, we also see big potential for edge data centers that will be driven by a range of use cases, including content delivery, IoT, virtual reality, AI and the evolution of 5G to name a few.

We also think that modular or containerized solutions will make the most sense as far as providing the infrastructure to support these applications. The edge data center industry is still nascent and there is much more to learn as the uses for edge continue to be developed, but we are excited about the future opportunities that edge computing will create and the value that Iron Mountain can bring to this market.

RECAP: Review all the stories in our Data Center Frontier Executive Roundtable 

Keep pace with the fast-moving world of data centers and cloud computing by following us on Twitter and Facebook, connecting with me on LinkedIn, and signing up for our weekly newsletter.

Explore the evolving world of edge computing further through Data Center Frontier’s special report series and ongoing coverage.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
