The Year Ahead: Edge Computing Creates Opportunity in 2017

Jan. 6, 2017
Edge computing is evolving beyond caching, and will require more compute power and intelligence to manage the movement of data, say analysts, who predict this will create business opportunities for data center providers in 2017.

As tech pundits and executives look to the horizon, they see huge demand for edge computing. What they don’t yet see is the infrastructure to support it.

That may change in 2017, as the potential demand drivers for edge computing – which include the Internet of Things, virtual reality and connected cars – gain broader market acceptance. As these predictions materialize into demand for real-world storage and network capacity, the data center industry will reckon with the challenges and opportunities of deploying capacity at the edge of the network.

“You’re going to see some specific vendors getting involved in this, and cloud players getting involved with this,” said David Cappuccio, Distinguished Analyst and Chief of Research at Gartner. “Some companies will do it themselves.”

Analysts agree that emerging applications will generate significant growth in distributed data, and foresee several impacts in 2017:

  • Distributed sensors and applications will require more data storage and compute capacity in secondary markets.
  • As more distributed data is generated, constraints on network capacity may emerge.
  • To reduce impact on the network, some data crunching and analytics will move closer to the edge, with smaller data streams being forwarded to data centers in major regional hubs.
  • Companies will use artificial intelligence to automate the management of this data.

As background, we’ve previously discussed the broad outlines of edge computing and how it will impact data center geography and network infrastructure, and explored the trend in our executive roundtable. Today we dig into some specifics on how this trend may shape industry dynamics in 2017.

Storage and Compute

The trend driving the edge computing model is the increased use of consumer mobile devices – especially consumption of video and virtual reality content – and the growth of sensors as part of the Internet of Things.

“IT is now becoming a utility,” said Chris Crosby, the CEO of Compass Datacenters, which focuses on building data centers in secondary markets. “Its ubiquity is emerging before our eyes. ‘How does this magic happen?’ has become ‘why doesn’t this thing work?’ It’s a generational shift. Nobody has a problem watching video on their phone. You can watch it in HD.”

That creates a challenge for U.S. data center infrastructure, which is concentrated in six major business markets, with additional infrastructure in some smaller cities and some cloud hubs in rural areas.

“You’ll start seeing the edge moving out to smaller nodes in wired cities to hand off traffic from those smartphones and mobile devices,” said Phill Lawson-Shanks, the Chief Architect and Vice President of Innovation at EdgeConneX, one of the providers that has focused on the edge computing opportunity. The company has built a network of 25 data centers in secondary markets, including many that can be operated as unmanned “lights out” facilities.

“It’s really about reimagining and realigning public networks,” said Don MacNeil, the chief operating officer of EdgeConneX. “The Internet has grown in ways we never imagined. We need to bring content to the eyeball networks. We want to provide a platform to continue to build upon.”

A key concern is right-sizing data centers for the edge and matching capacity to demand. Mistakes can be expensive, as data center construction costs range from $5 million to $15 million per megawatt of capacity. As data moves to the edge of the network, data centers are being right-sized to fit the demands of these new markets, deploying space in digestible chunks. That’s why modular design and lean construction methods are playing a major role as the data center industry looks to deploy capacity in new places.

“Edge requirements could be encapsulated in an ‘intelligent rack’ with integrated power, located close to the user, outside a traditional data center,” said Enzo Greco, VP and General Manager of Data Center Solutions for Vertiv (formerly Emerson Network Power). “A rooftop could be ideal in many scenarios; others might require a rack for smaller spaces. For some customers, it’s a smaller data center. There will be different form factors for power and cooling.”

Edge infrastructure may also be deployed at the base of wireless towers operated by telecom companies and real estate investment trusts (REITs). Vapor IO has developed a solution that combines an IT enclosure and software to manage data movement, which can be used to deploy 150 kW of IT capacity at the base of a tower.

“Data is so prevalent that we have to move the compute to the data, rather than moving all the data to the compute,” said Cole Crawford, CEO and co-founder of Vapor IO.

Network Impact

Compass CEO Chris Crosby predicts that network infrastructure will struggle to keep pace with the data growth at the edge.

“We have some cataclysmic things happening with latency and bandwidth,” Crosby said in a presentation at the 7×24 Exchange Fall Conference in Phoenix in November. Crosby noted the differences between latency – the time for data to travel between two points – and bandwidth, which is a measure of network capacity.

“We’ve got a lot of pieces competing for both bandwidth and latency,” said Crosby. “It’s easy to solve bandwidth and easy to solve latency, and hard to solve both at the same time.

Chris Crosby, the CEO of Compass Datacenters, says that growing data traffic is placing pressure on network infrastructure. (Photo: Rich Miller)

“There hasn’t been investment in decades, from a network perspective,” Crosby continued. “Now we have a scenario where there are too many potential choke points. I don’t have a separate network for the (workloads) that need to have performance. We have this one network, and now there are 1,000 more access points.”
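
As a rough illustration of the distinction Crosby draws, the time to deliver a payload is the fixed latency plus the payload size divided by available bandwidth: a fatter pipe helps bulk transfers but does nothing for the latency floor, while proximity cuts latency without adding throughput. A minimal Python sketch, with all link figures hypothetical:

```python
def delivery_time_ms(payload_mb, latency_ms, bandwidth_mbps):
    """Rough delivery time: fixed latency plus payload transfer time over the link."""
    transfer_ms = (payload_mb * 8 / bandwidth_mbps) * 1000  # megabits / Mbps -> s -> ms
    return latency_ms + transfer_ms

# Hypothetical links: a distant, fat pipe vs. a nearby, thinner one.
far_fat = dict(latency_ms=60, bandwidth_mbps=1000)
near_thin = dict(latency_ms=5, bandwidth_mbps=200)

for payload_mb in (0.1, 50):
    print(payload_mb, "MB ->",
          round(delivery_time_ms(payload_mb, **far_fat)), "ms far/fat vs.",
          round(delivery_time_ms(payload_mb, **near_thin)), "ms near/thin")
```

With numbers like these, a small interactive request favors the nearby node while a bulk transfer favors the distant fat pipe – which is why, as Crosby notes, solving both at once is hard.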

Crosby sees the data center industry segmenting into three tiers: huge core data centers, smaller facilities in second-tier markets, and micro-data centers at the edge – which could be a remote tower site or a space inside an office building.

“Edge is basically taking all the server closets you guys worked so hard to get rid of, and putting them all back in again, only with intelligence,” he said.

Providers targeting this opportunity include DartPoints, which develops micro-data centers for office buildings (“data-enhanced properties”) and small colo facilities.

For most edge-watchers, the critical question isn’t whether edge demand will increase, but when. “I think the key question is ‘can you get the timing right?’” said Crosby.

Analytics at the Edge

As edge computing creates demand for different types of facilities in different places, the design and capacity of local infrastructure will be guided by the workloads. Phill Lawson-Shanks of EdgeConneX says the major priority is reducing latency for content delivery, but that requirements will change over time.

“The latency benefit is from the caching,” he said. “Caching can happen off-peak and out-of-band. Ultimately, you’ll see processing and storage.”

Gartner Distinguished Analyst David Cappuccio said a growing number of organizations are seeing opportunities in edge computing. (Photo: Rich Miller)

Gartner’s Cappuccio agrees that compute hardware will move closer to the edge. “The amount of data being generated is staggering,” Cappuccio said in a presentation at the Gartner Data Center conference in December. “In some cases, I may want to analyze it close to the source, so I’ve got to have that edge environment close to the source.”
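
A minimal sketch of the kind of edge-side reduction Cappuccio describes – raw readings are filtered and summarized near the source, and only a compact record travels on to a central data center. The sensor stream, threshold and summary format below are hypothetical:

```python
from statistics import mean

def summarize_at_edge(readings, alert_threshold=93.0):
    """Reduce a raw stream of sensor readings to a compact summary record.

    Only the summary (plus any out-of-range readings) is forwarded upstream,
    instead of every individual sample.
    """
    alerts = [r for r in readings if r > alert_threshold]
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
        "alerts": alerts,
    }

# Hypothetical example: 1,000 temperature samples collected locally at an edge node.
raw = [70 + (i % 25) for i in range(1000)]
print(summarize_at_edge(raw))  # a handful of fields instead of 1,000 raw values
```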

Smaller enclosures and micro-data centers will be a key part of the solution.

“Micro-data centers have been around for a long time,” said Cappuccio. “We just don’t call them that. Edges may emerge to serve specific apps or specific customers. It will be like managing a wide distributed network, like in the 1980s, but with way more intelligent devices.”

AI to Manage It All

One company that sees a huge opportunity in smarter devices on the edge of the network is NVIDIA, which has staked out a leadership position in distributed high performance computing (HPC), especially for the automotive market.

Jim McHugh, vice president and general manager at NVIDIA, believes the shift of analytics to the edge will bring an accompanying expansion for HPC hardware.

“You’re going to be putting a lot of computational power at the edge,” said McHugh. “The type of data we’re dealing with in the world of the Internet of Things can’t be processed manually. It has to be automated, and AI will play a large role in this.”

NVIDIA’s Jim McHugh sees many developing technologies that will boost demand for HPC in edge environments. (Photo: Rich Miller)

Artificial intelligence involves two types of computing workloads with different profiles, known as training and inference. Both involve neural networks – systems modeled loosely on the way neurons work together in the human brain.

  • In training, the network learns a new capability from existing data. Training is compute-intensive, requiring hardware that can process huge volumes of data.
  • In inference, the network applies its capabilities to new data, using its training to identify patterns and perform tasks, usually much more quickly than humans could.
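
A minimal NumPy sketch of the two phases on a toy problem – the model, data and learning rate are illustrative only, not any vendor’s workload. Training loops repeatedly over known examples and adjusts weights; inference is a single cheap forward pass over new data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the label is 1 when the two features sum to more than 1.0.
X = rng.random((200, 2))
y = (X.sum(axis=1) > 1.0).astype(float)

w, b = np.zeros(2), 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Training: a compute-intensive loop over known data, adjusting the weights.
for _ in range(2000):
    p = sigmoid(X @ w + b)           # predictions on the training set
    grad_w = X.T @ (p - y) / len(y)  # gradient of the logistic loss
    grad_b = (p - y).mean()
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

# Inference: apply the learned weights to unseen inputs in one forward pass.
new_points = np.array([[0.9, 0.8], [0.1, 0.2]])
print(sigmoid(new_points @ w + b))   # roughly [1, 0]
```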

“The world of the Internet of Things is going to be changing drastically with AI because it allows us to train applications at the edge, so they are intelligent,” said McHugh. “A big part of training is data locality.”

Vapor IO is developing software to manage workloads in distributed edge environments, automating real-time optimization decisions. That includes rescheduling workloads onto edge nodes or centralized data centers, based on policies, including latency and cost.
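
A hedged sketch of the kind of placement decision such software has to automate – pick the cheapest site that still meets a workload’s latency policy. The site names, latencies and prices below are hypothetical, not Vapor IO’s actual policy engine:

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    latency_ms: float     # expected latency to the workload's users
    cost_per_hour: float  # price of running the workload at this site

def place_workload(sites, max_latency_ms):
    """Choose the cheapest site that satisfies the workload's latency policy."""
    eligible = [s for s in sites if s.latency_ms <= max_latency_ms]
    return min(eligible, key=lambda s: s.cost_per_hour) if eligible else None

# Hypothetical sites: an edge node, a regional colo and a hyperscale core.
sites = [
    Site("edge-tower", latency_ms=5, cost_per_hour=1.20),
    Site("regional-colo", latency_ms=25, cost_per_hour=0.60),
    Site("hyperscale-core", latency_ms=70, cost_per_hour=0.25),
]

print(place_workload(sites, max_latency_ms=30).name)  # regional-colo
print(place_workload(sites, max_latency_ms=10).name)  # edge-tower
```

A real scheduler would fold in more policies – capacity, data locality, failover – but the latency-versus-cost tradeoff is the core of the decision.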

“We see the future as being mildly intelligent at the edge, increasingly intelligent at the colos, and using big hyperscale data centers for the machine learning and analytics,” said Crawford.

The Bottom Line: More Data Centers

Providers of edge-focused data centers are bullish on the opportunity. “This is just the beginning of a very large surge in the edge computing market,” said Phillip Marangella, vice president of business development at EdgeConneX.

As the volume of data grows and moves across the network, the growth of edge computing creates a ripple effect that could generate business hundreds of miles from the edge facilities.

“With the increased adoption of latency-sensitive, data-intensive technologies — running the gamut from mobility, the Internet of Things and content delivery to future technologies like self-driving cars and augmented/virtual reality applications — well-connected real estate near critical population centers is poised to enjoy above-average demand and pricing over the long term,” said Pat Lynch, managing director, Data Center Solutions, CBRE.

In his presentation about network challenges, Crosby succinctly summarized the bottom line for the edge computing trend.

“The only message you should be getting from me is that there will be more data centers,” said Crosby.

Explore the evolving world of edge computing further through Data Center Frontier’s special report series and ongoing coverage.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
