How Edge Compute is Shifting in the AI Era: A Vision of the Future

May 27, 2024
Kevin Imboden, Global Director, Market Research and Intelligence at EdgeConneX, explores what edge deployment architecture might look like when AI models are in widespread production.

As the world grappled with the shock of the COVID-19 pandemic in 2020, cloud adoption and the dispersion of compute to the Edge (i.e., closer to the end user) became key topics; how best to migrate to a major platform, and how and where any remaining workloads should be maintained, became the core focus of IT personnel around the globe. Our homes became the new Edge: where we worked and maintained business continuity via Teams, where we streamed Netflix or Hulu to stay sane and entertained, where we saw friends and family virtually via Zoom to stay connected, and where our kids continued to learn remotely with their classmates and teachers. Data centers and digital infrastructure were vital to giving us some sense of normalcy as we all sequestered at home.

The advent of a host of AI applications has shifted industry focus yet again; instead of smaller, more dispersed workloads, the need to develop large language models via deep machine learning has driven a shift toward extensive facilities in areas with low-cost power, with connectivity less of a focus until the models are deployed. Many operators have shifted along with this pattern, EdgeConneX among them: first deploying smaller workloads at the Edge over ten years ago, then moving with client needs to construct far larger campuses for these newest workloads. The question for the next wave is what happens when these models are in widespread production (or inference, to use the industry buzzword), and what that architecture looks like.

Past edge deployments tended to be smaller: only the compute that needed to be close to the user was performed there, with heavier work done elsewhere in one of several key hubs. Rack densities were similar across the entire operational portfolio, often between 6 and 15 kilowatts per rack, efficiently cooled with proper airflow and raised floors. Large language model training has led to a massive shift in density in order to develop these tools quickly and efficiently, with expectations of upwards of 120 kilowatts per rack over the next two years. While this has placed a strain on buildings with older designs (and forced the adoption of liquid cooling), very early returns indicate that once models are trained, inference requires roughly half the density of training, with speed to the user gaining importance.
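To put those density figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The 80% utilization factor, the 10-kilowatt legacy midpoint, and the racks_per_megawatt helper are illustrative assumptions, not EdgeConneX figures; only the 120-kilowatt training density and the roughly-half inference ratio come from the discussion above.

# Back-of-the-envelope rack counts per megawatt of critical IT load.
# Assumptions (illustrative only): 80% utilization of critical load,
# legacy racks at ~10 kW (midpoint of the 6-15 kW range), training
# racks at ~120 kW, and inference at roughly half of training density.

def racks_per_megawatt(kw_per_rack: float, utilization: float = 0.8) -> float:
    """Rough number of racks supportable by 1 MW of critical IT load."""
    return (1000 * utilization) / kw_per_rack

LEGACY_KW = 10.0                 # midpoint of the legacy 6-15 kW range
TRAINING_KW = 120.0              # projected LLM training density (kW/rack)
INFERENCE_KW = TRAINING_KW / 2   # inference assumed at ~half of training

for label, kw in [("legacy", LEGACY_KW),
                  ("training", TRAINING_KW),
                  ("inference", INFERENCE_KW)]:
    print(f"{label:>9}: {kw:5.0f} kW/rack -> ~{racks_per_megawatt(kw):.0f} racks per MW")

Under these assumptions, a megawatt of critical IT load supports roughly 80 legacy racks but only about 13 inference racks, which is why facility design and proximity to the user, rather than raw rack count, become the differentiators.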

Befitting the record leasing of the past four years, the edge buildout of the future will be much larger than before and will have more of a mixed-use component: dense AI workloads will run alongside less demanding ones, with software across the cloud platform and the data center driving operational efficiencies. Cooling requirements could vary by need, so the flexibility to shift between liquid and air cooling is paramount and requires intelligent design throughout the data hall. Geographic requirements will continue to move as well; instead of the dense urban locations previously associated with the edge, more suburban locations within a metropolitan area allow for a larger building footprint and easier scalability as needs arise.

Adopters of artificial intelligence applications will cause the edge to thrive, albeit as an evolution of the earlier vision of distributed computing. Larger, denser workloads in key urban areas, complete with intelligent, efficient design options and the prospect of AI improving the built environment, will lead to the next wave of the data center industry. EdgeConneX looks forward to assisting the industry in constructing this next wave of optimized facilities for distributed AI, whenever and wherever it is required, built to suit and built to density!

About the Author

Kevin Imboden

Kevin Imboden is Global Director, Market Research and Intelligence at EdgeConneX, following more than a decade studying the commercial real estate market in both the data provider and brokerage realms. He has produced a variety of research on primary and emerging markets, conducted both local and international market analyses, and appeared on many podcasts and at speaking engagements worldwide.

EdgeConneX is a global data center provider focused on driving innovation. Contact EdgeConneX to learn more about their 100% customer-defined data center and infrastructure solutions.
