“We were promised flying cars, and we got 140 characters.” By now, it’s an old aphorism, but the point is simple: In many ways, today’s reality does not match the future that people envisioned many years ago.
The fact is, progress is often iterative, surprising, and non-linear. Cars today are cleaner, more efficient, and far less prone to rapid obsolescence than they were in the past. They can’t fly, but they are safer and smarter, and they don’t spew nearly as much carbon into our air as they used to. In the US, greenhouse gas emissions from passenger cars dropped by 46% between 1990 and 2020, even as the number of registered motor vehicles rose by nearly 50% over that same period. Progress for cars has been slow and steady, but a driver time-traveling from 1990 might not believe what today’s cars routinely do.
For data centers, progress has been faster and more pronounced, and time travelers from 1990 might be astonished to see today’s global platform of Internet-enabled data centers.
Twenty years ago, data centers largely formed a patchwork of enterprise-owned facilities built close to company offices and colocation deployments placed at Internet peering points, with CDNs helping to push website content to end users in distant locations. In fact, in 2010, 79% of data center compute instances ran in small, traditional data centers; by 2018, that had reversed, with 89% of compute instances running in larger facilities, including hyperscale deployments.
Today, data center providers serve a vast array of industries, including Cloud, Content, Finance, Healthcare, Gaming, Retail, Transportation, and more, in deployments that sometimes reach hundreds of megawatts, in markets large and small around the globe.
Clean and Efficient Scale
In 2024, no one questions the growth curve of data center capacity, reach, or scale. But concurrent with this growth, the data center industry has seen phenomenal innovation that has allowed these new deployments to operate using less electricity while generating lower carbon emissions. What is most remarkable is that the share of electricity used by data centers has not significantly increased over the past decade, even as computing power, storage, and network traffic have multiplied, in some cases by as much as 500%.
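As a rough back-of-envelope illustration of what that means (the index values below are hypothetical, not measured data), holding total energy use roughly flat while compute output multiplies several-fold implies a steep drop in energy per unit of compute:

```python
# Back-of-envelope: roughly flat energy use plus multiplying compute output
# means energy per unit of compute falls sharply.
# All figures below are hypothetical, for illustration only.

energy_start = 100.0   # index of data center energy use at the start of the decade
energy_end = 105.0     # roughly flat a decade later (assumed ~5% rise)

compute_start = 100.0  # index of compute/storage/traffic delivered at the start
compute_end = 500.0    # one reading of "multiplied by as much as 500%" (~5x)

intensity_start = energy_start / compute_start  # energy per unit of compute
intensity_end = energy_end / compute_end

drop = 1 - intensity_end / intensity_start
print(f"Energy per unit of compute fell by roughly {drop:.0%}")
# -> roughly 79% under these assumptions
```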
In 2020, a team of scientists and engineers published a paper in the journal Science documenting the improvements in data center operations, including evidence that a large decrease in the energy used for cooling and power provisioning, coupled with more efficient chips and servers, was offsetting the increase in the sheer number of servers and other equipment inside the buildings.
Effectively, they concluded that newer, larger data centers were proving to be cleaner, more efficient, and more innovative as they replaced the many older, smaller, less efficient deployments that were, over time, becoming obsolete.
Pathways to Efficient Operation
Today’s large-scale data center deployments are driven primarily by Cloud and AI customers’ demand for high-performance computing solutions delivered to businesses and consumers in markets around the planet. But in many ways, these larger facilities are also the smartest, cleanest, most innovative way to provide data center services.
In February of this year, The Economist looked at the data center industry’s progress in operating efficiency and clean energy, paying special attention to the impacts of AI-related processing. Among the revealing statistics cited by The Economist:
- AI servers could use ~100 TWh of electricity annually by 2027, accounting for between 20% and 25% of total data center power consumption (see the back-of-envelope sketch after this list)
- Training newer generations of AI LLMs uses over 50 times more electricity than previous generations did
- The energy used in the inference phase of AI models is projected to be much higher than the energy used to train those models
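Taking the first of those cited figures at face value, the arithmetic also implies an overall scale for data center electricity demand. A quick sketch (the 100 TWh and 20–25% figures come from the article; everything else is derived):

```python
# Implied total data center electricity demand, derived from the cited
# figures: ~100 TWh of AI server use representing 20-25% of all data
# center power consumption by 2027.

ai_twh = 100.0                             # cited estimate for AI servers by 2027 (TWh/year)
ai_share_low, ai_share_high = 0.20, 0.25   # cited share of total data center power

total_high = ai_twh / ai_share_low   # smaller AI share => larger implied total
total_low = ai_twh / ai_share_high

print(f"Implied total data center demand: {total_low:.0f}-{total_high:.0f} TWh/year")
# -> roughly 400-500 TWh/year, if those projections hold
```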
Pathways to Efficient Innovation
The data center industry will need to continue to drive efficiencies that offset the growing energy needs of AI computing. It will also need to identify and implement innovative solutions for every aspect of data center deployments.
Some examples: Lower-carbon concrete and steel can reduce a building’s overall carbon impact. 24/7 Carbon-Free Energy provides near real-time confirmation of clean power supplies, ensuring greater use of wind, solar, hydro, geothermal, nuclear, and other carbon-free sources. Battery storage can help ensure high availability in the data center while also helping to decarbonize local grids. Newer cooling and HVAC systems operate more efficiently, liquid cooling can reduce total power consumption in a data center by 10%, and innovative water and waste solutions can significantly reduce the impacts a data center might have on its surrounding communities.
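To put that liquid cooling figure in context, here is a simple sketch of what a 10% reduction in total facility power could mean for a hypothetical large deployment (the facility size and grid carbon intensity below are assumptions, not figures from the article):

```python
# Hypothetical savings from a 10% reduction in total facility power.
# Facility size and grid carbon intensity are assumed values for
# illustration only; the 10% figure is the one cited above.

facility_mw = 100.0     # assumed facility power draw (MW)
reduction = 0.10        # cited potential reduction from liquid cooling
hours_per_year = 8760
grid_intensity = 0.4    # assumed grid emissions factor, tCO2 per MWh

mwh_saved = facility_mw * reduction * hours_per_year
tco2_avoided = mwh_saved * grid_intensity

print(f"Energy saved:      {mwh_saved:,.0f} MWh/year")
print(f"Emissions avoided: {tco2_avoided:,.0f} tCO2/year")
# -> ~87,600 MWh and ~35,000 tCO2 per year under these assumptions
```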
The key to moving much of this innovation into the mainstream lies in larger-scale deployments. It is no accident that the shift from small data centers to large-scale facilities between 2010 and 2018 coincided with a remarkable improvement in energy efficiency and reduced emissions.
Looking ahead, Cloud, AI, and other high-performance computing services will require larger-scale data centers that may face resistance in some markets. But those facilities will also offer more opportunities to implement superior solutions that may be too expensive to deploy in smaller data centers.
We still don’t have flying cars. And when you think about the consequences of a fender bender at 1,000 feet, maybe that’s not such a bad thing. But the improvements we have seen in our cars over the past few decades have been made practical and economical through the benefits of scale – being installed in millions of vehicles.
In data center development, larger scale is inevitable, even if the physical footprint doesn’t expand as rapidly as the capacity for power, storage, and computation. Scaling efficiently, sustainably, and economically depends on multiple factors, from construction materials and HVAC equipment to chips, servers, and power supplies. But we have seen over the past 15 years that these diverse elements can combine to create data centers that are cleaner, more efficient, and friendlier to their neighboring communities than anyone would have predicted.
The opportunity – and the obligation – going forward will be to extend these improvements, deliver better results, and convince even the skeptics that data centers are essential components in a global infrastructure that is accelerating the growth and innovation we are seeing in transportation, healthcare, education, and virtually every other industry that touches our daily lives.
Phillip Marangella
Phillip Marangella is Chief Marketing and Product Officer for EdgeConneX, a global data center provider focused on driving innovation. Contact EdgeConneX to learn more about their 100% customer-defined data center and infrastructure solutions.