Record-Breaking 7x24 Spring Event Tackles Data Center Industry Challenges, Trends in AI, Power, Sustainability
From June 9th through 12th at the JW Marriott Grande Lakes in Orlando, Florida, a sizeable chunk of the data center industry gathered to take in the high-level educational and networking opportunities on offer at the 7x24 Exchange 2024 Spring Conference (along with many chances for fun, sun, and copious food and beverage).
Anchored by its keynote and panel discussions, this year's Spring Conference, which marked 7x24 Exchange's 35th anniversary, "didn't just break, but shattered" the association's records for attendance and sponsorship growth across consecutive conferences, according to 7x24 Exchange Chairman and CEO Robert J. Cassiliano.
"We've had five consecutive conferences with record attendance and record sponsorships," said Cassiliano in opening remarks.
He added, "This spring, at 1,200-plus, has a higher number of attendees than the previous fall conference in 2023, which had over 800 -- that is only the second time in 35 years that has happened. The last time that happened was after 9/11, when a lot of people from the Northeast couldn't go to that fall conference. So this is truly a unique change in dynamic here - it's pretty phenomenal."
In his introduction, 7x24's chief also emphasized how the conference has always focused on principles of sharing, learning and networking. "The conference's objective is for you to gain a better understanding of data center topics, share experiences and knowledge, and learn best practices and ways for practical application," said Cassiliano, while noting that the annual event is also "an excellent networking opportunity - 7x24 Exchange is all about information sharing."
Industry's Rapid Growth Comes at a Cost
From the podium at the conference kick-off, 7x24's Cassiliano noted: "According to McKinsey analysis, data center growth projections, as measured by power consumption, are expected to reach 35 gigawatts by 2030, up from 17 gigawatts in 2022."
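Taken at face value, those two data points imply a steep compounding curve. A quick back-of-the-envelope calculation, using only the endpoints Cassiliano cited (17 GW in 2022, 35 GW in 2030), works out to an implied annual growth rate of roughly:

\[
\left(\frac{35\ \text{GW}}{17\ \text{GW}}\right)^{1/8} - 1 \approx 0.094 \quad \text{(about 9.4\% per year, compounded over eight years)}
\]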
While acknowledging that the data center industry has done much to address energy efficiency - pointing, for example, to the substantial gains in power usage effectiveness (PUE) achieved since roughly 2017 - Cassiliano warned, "But progress has flattened over the past decade, and on water conservation, the use of river projects and aquifers has met with resistance from local communities and government regulation."
He continued, "AI will require an increase in computer performance, resulting in greater chip density and additional heat generation. Chip density will challenge manufacturers to extend Moore's Law, and removing the additional heat production will require new and innovative designs in cooling technology."
For those professionals in the room as well as abroad in the industry, Cassiliano concluded, "The challenge with respect to data center growth will be the ability to have the required technical skills to design, build, operate, and maintain these mission-critical facilities" amid such substantial hurdles.
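For context on the efficiency metric Cassiliano referenced: PUE is conventionally defined as the ratio of a facility's total energy draw to the energy actually delivered to IT equipment, so values approaching 1.0 indicate less overhead lost to cooling and power distribution:

\[
\mathrm{PUE} = \frac{\text{Total Facility Energy}}{\text{IT Equipment Energy}}
\]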
In light of the complex trends facing the industry, Cassiliano also reminded attendees that "International Data Center Day powered by 7x24 Exchange is addressing the challenge through its mission to create awareness of the data center industry and inspire the next generation of talent."
Thirteen universities, along with international representatives, were in attendance at this year's Spring Conference. Many of the collegiate attendees took part in 7x24's Data Center 101 course, offered on the first day of proceedings.
Data Center Industry Challenges and Trends
The overarching theme of the 7x24 Exchange 2024 Spring Conference was data center "Industry Challenges and Trends," centrally regarding factors of artificial intelligence (AI), sustainability, and data center growth.
In his opening talk, Cassiliano noted, "AI technology is being adopted by many organizations around the world, and in particular for technology companies, there is continuing and more intense focus on energy efficiency, water conservation, and sustainability."
Such principles, especially as prioritized by hyperscalers, were reflected at the conference in sessions including a highly detailed technical talk entitled "Developing and Utilizing Physics-Based Models to Mitigate Ever Changing Environmental Risk," delivered by a trio of data center experts from Meta: Sr. Solutions Engineer James Monahan, Data Center Campus Facility Director Laura Nofzinger, and Research Scientist Lisa Rivalin.
Further illustrating the conference theme, Aligned Data Centers' Chief Innovation and Technology Officer Phill Lawson-Shanks took the stage for a far-ranging Tuesday keynote on "Going Clean: A Look at the Future of Data Center Power."
And in his closing keynote, JLL's Vice President of Data Center Strategy, Sean Farney, circled back to the event's AI focus with regard to the finer points of "Solving Data Center Sustainability and AI Density Challenges with Adaptive Reuse."
OpenAI Savant's Keynote Contextualizes AI for Data Center Stakeholders
The conference keynote speaker was Zack Kass, former head of go-to-market for OpenAI, the increasingly ubiquitous AI research organization founded in December 2015. Headquartered in San Francisco and led by CEO Sam Altman, OpenAI's stated mission is to develop "safe and beneficial" artificial general intelligence (AGI), which the company defines as "highly autonomous systems that outperform humans at most economically valuable work."
His time spent working in OpenAI's headquarters made Kass acutely aware of the impact AI technology will have on the data center industry. Now in his role as a futurist and visionary evangelizing the technology for the public as a keynote speaker, boardroom advisor and adjunct professor at the University of Virginia, Kass contends, "We are on the verge of the most profound industrial revolution in human history. I've had the privilege of seeing the future, and I'm here to tell you that it's not just going to be okay, it's going to be amazing."
Kass said that approximately five years ago, he became "obsessed" with the possibilities of AGI.
"This is the theory that says at some point, AI will have human intellectual equivalents. If AGI arrives, or when it does, in my opinion, it probably marks the last technology that humans will ever invent on our own. That's profound for all sorts of reasons, because it creates what we call a recursive learning loop. At some point, you get smart enough that you get smarter much faster. This is sort of how people like Isaac Newton start to come up with all sorts of incredible ideas, because one idea begets another. Except that Isaac Newton had a very clear intellectual ceiling, where AGI probably does not."
Kass continued:
"We evolved societally on step functions. We demarcate these step function improvements by the technology that usually marks them, the wheel, fire, aqueduct, printing press, electricity, Internet. My argument, quite simply, is that we will talk about AGI in 100,000 years the way we talk about fire today: that it is one of these ultimate demarcations in our existence."
On progress toward such a world, as Kass told 7x24 Exchange attendees, "We're much, much further than you realize - the technology on your phones today is by many accounts old already and sort of boring."
Kass emphasized that companies are increasingly getting much better at using AI technology. Citing a McKinsey study he took part in authoring, Kass said the consulting firm's conservative estimate is that by 2026, companies will see a 30% profit margin improvement from using AI. Kass asserted, "I actually think it's going to destroy that number. I think businesses are about to get much more efficient, much, much quicker, and as a result, there will be fundamental improvements in the market."
He went on, "The S&P 500 has gained $6 trillion in market cap since GPT-4 launched. That's about 30% ... It's a proxy to what will happen in the world when other industries start to adopt this technology. I guarantee you we are going to see an incredible improvement in market performance on a fundamental basis."
The Transformer
Later in his talk, Kass described the impact and implications of the renowned 2017 paper entitled "Attention Is All You Need," authored by eight Google researchers who have since become known as the "Transformer 8."
The paper is credited with originating the modern transformer, i.e., the neural network architecture that makes generative artificial intelligence possible. Kass continued:
"This is the thing that catapults us forward more than anything else. In that paper, the Transformer 8 make a simple argument, which is that we are building AI models all wrong. That instead of building statistical machine learning models that think in straight lines, we should be building deep learning models that think in parallel.
Deep learning, of course, has been around since the 1950s, but no one's ever given it a lot of credence because it took way too much compute and way too much data, and no one could figure out how to make it work, except for a small group of people in 2012 that started to see results. And then they put the project down because statistical machine learning was [focused on] linear performance.
Well, it turns out this is exactly how our brains think: The parallelization of data is the closest we've ever come to starting to build things that work like our brains. That's what the transformer is.
And the transformer came at just the right time, because we saw an explosion in the amount of available compute, driven largely by Nvidia's breakthrough GPU [the GH100], but also by the zero interest rate environment. So you have companies like OpenAI able to raise a sovereign-nation amount of money and then back the truck up into Nvidia's parking lot.
It comes at the same time that the Internet grew four orders of magnitude in ten years between 2010 and 2020. And it got a lot cleaner, too, got a lot richer, largely because Meta and Google wanted to serve us all better ads. But the byproduct of that is that now these researchers have this treasure trove, a thousand or even 10,000 times bigger than they previously expected.
Then if you stir that pot with this idea that all of the people that work at these companies are fundamentally interested in sharing their best practices with each other - this core belief in research and publication - you arrive at the period between 2015 and 2021, when these models got a lot bigger and a lot better.
We measure large language model performance by precision, and size by billions of parameters, and the models went from about 250 million parameters to about 100 billion parameters. In 2021, most of you had not heard of OpenAI, had not heard of GPT-3, and largely did not care about what was going on. But then something really exciting happened, which is that this trend just continued on the same super-linear path: large language model size grew quite a bit, and as a result, performance did too.
GPT-4, we now know, is about 1.8 trillion parameters, very big and very good. Because it's a lot better, the older models have gotten a lot cheaper and suddenly we have applications like ChatGPT that bring the world to the market. But more important than this, I think, is that between 2021 and 2024, these models are getting very cheap, and we see no signs of that not continuing.
In my mind, this is the most exciting trend in the industry, because this suggests not just ubiquity, but fair distribution of this technology in a world that is not owned by a technological hegemony.
As a result of this technology getting so good so quickly, I think we are headed towards an abundant future. I believe in my bones that we are headed toward a world where humans no longer have to compete economically for basic goods and services.
I think we achieve AGI by 2030. This is my bold date and prediction, and honestly, I think it's a little conservative at this point. I've heard as early as 2028 from some very, very smart people in the industry."
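As an editorial aside for readers curious what the "parallel" processing Kass described looks like in practice, below is a minimal, illustrative sketch of scaled dot-product attention, the core operation introduced in "Attention Is All You Need." The token count, embedding size, and NumPy implementation are purely illustrative choices, not drawn from Kass's talk or from any production model.

```python
# Minimal sketch of scaled dot-product attention, the core transformer
# operation. Sizes here are arbitrary placeholders for illustration.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Every token attends to every other token in one matrix multiply --
    # the "parallel" processing contrasted with sequential models.
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)   # (tokens, tokens)
    weights = softmax(scores, axis=-1)               # attention weights
    return weights @ V                               # weighted mix of values

# Toy example: a "sentence" of 4 tokens, each embedded in 8 dimensions.
rng = np.random.default_rng(0)
tokens = rng.standard_normal((4, 8))
output = scaled_dot_product_attention(tokens, tokens, tokens)
print(output.shape)  # (4, 8)
```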
Footnote on Nuclear Energy for AI Data Centers
Recalling the enthusiasm surrounding renewed nuclear energy prospects for the industry, as overheard in both hallway and dais conversations at Data Center World earlier this spring, DCF's correspondent approached the microphone during the Q&A portion of the talk to ask Kass for his views on the emerging "new nuclear" horizon in support of the industry's AI ambitions.
In response, Kass offered:
"I'm not a nuclear physicist, but we are pretty sure at this point that we're probably five to ten years away from a major fusion breakthrough. We had a big one two weeks ago; we suspended a particle for a brief amount of time - but long enough to consider it a major breakthrough.
Eventually, obviously, we're going to be running data centers that are so much more compute intensive than we can imagine today. And the likelihood is that the way we will figure out how to distribute them more effectively is by figuring out how to distribute the energy more effectively.
It's also obvious that GPT-6 and 7 are going to require so much energy that it would probably take a solar field the size of Arizona to power them. That presents not just a technological problem, it presents a national security issue. So distributing power sources is pretty critical."
Matt Vincent
A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.