Quantum Computing Advancements Leap Forward In Evolving Data Center and AI Landscape

Feb. 20, 2025
Barely two months into 2025, the quantum computing sphere is abuzz with activity. Here we examine some of the latest advancements and their implications for data center and AI stakeholders, from major quantum players such as Microsoft and SoftBank to the State of Maryland.

In DCF's annual 8 key data center trends forecast for 2025, we predicted that the drive toward quantum computing would be a defining data center trend this year. NVIDIA CEO Jensen Huang, in contrast, began the year by downplaying the near-term significance of quantum computing, emphasizing its immaturity relative to classical computing and casting doubt on its readiness for practical applications.

Speaking in January during a keynote at the 2025 Consumer Electronics Show (CES) in Las Vegas, Huang said that quantum computing was still in its infancy and "not close" to being useful for real-world problems. He argued that classical computing, particularly with advancements in AI and GPU-accelerated systems, would remain the dominant force in solving complex computational challenges for the foreseeable future, and estimated that useful quantum computers are 15 to 30 years away. The comments triggered a significant drop in the stock prices of several quantum computing companies.

Many saw Huang's comments as a pragmatic assessment of the current state of quantum computing, which, despite significant theoretical promise, has struggled with error correction, scalability, and stability. His skepticism is rooted in practical limitations: quantum systems rely on qubits that are highly sensitive to environmental interference, and most platforms require extremely low temperatures to operate. These challenges have made it difficult to build reliable and scalable quantum computers.

Huang pointed out that classical computing, powered by his company's GPUs and AI-driven innovations, continues to deliver exponential improvements in performance, making it a more viable solution for most industries in the near term. However, recent developments have sparked renewed interest and debate about quantum computing's trajectory. In the weeks since Huang's comments, several breakthroughs have been reported by companies and research institutions.

Microsoft Unveils Majorana 1: A Leap Toward Practical Quantum Computing

Microsoft has announced a significant breakthrough in quantum computing with the unveiling of Majorana 1, the world's first quantum processor powered by topological qubits. The milestone marks a shift from theoretical exploration toward tangible progress on scalable, fault-tolerant quantum computing.

The Power of Topological Qubits

At the heart of this advancement is the topoconductor, a novel class of materials engineered to enable topological superconductivity—a state of matter that previously existed only in theory. Microsoft’s approach leverages Majorana Zero Modes (MZMs), exotic quasiparticles that store quantum information in a way that protects it from environmental noise, enhancing stability and reliability.

Traditional quantum computers struggle with error rates due to their reliance on fragile qubits that require complex error correction. Microsoft’s topological qubits offer intrinsic error protection, significantly simplifying error correction and making large-scale quantum computing more feasible.
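
To appreciate what intrinsic error protection buys, it helps to quantify the overhead of conventional error correction. The short Python sketch below applies the widely used surface-code scaling relation; the roughly 1% threshold, the 0.1·(p/p_th)^((d+1)/2) logical-error formula, and the ~2d² physical-qubit count are standard rule-of-thumb assumptions from the error-correction literature, not figures from Microsoft's announcement.

```python
# Back-of-envelope surface-code overhead: how many physical qubits does one
# logical qubit cost? Assumptions (rule-of-thumb values, not Microsoft's):
# threshold p_th ~ 1%, logical error ~ 0.1 * (p/p_th)**((d+1)/2),
# and ~2*d**2 physical qubits per logical qubit at code distance d.

P_TH = 1e-2                    # assumed surface-code error threshold
TARGET_LOGICAL_ERROR = 1e-12   # roughly what long algorithms require

def distance_needed(p_physical: float) -> int:
    """Smallest odd code distance d that meets the target logical error rate."""
    d = 3
    while 0.1 * (p_physical / P_TH) ** ((d + 1) / 2) > TARGET_LOGICAL_ERROR:
        d += 2
    return d

for p in (1e-3, 1e-4):  # physical error rates: today's qubits vs. better ones
    d = distance_needed(p)
    print(f"p={p:.0e}: distance {d}, ~{2 * d * d} physical qubits per logical qubit")
```

Under these assumptions, a 0.1% physical error rate implies nearly 900 physical qubits for every logical qubit; hardware whose qubits are intrinsically better protected shrinks that multiplier dramatically, which is the core appeal of the topological approach.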

A Roadmap to Scalable Quantum Computing

Microsoft’s newly published research in Nature and its presentations at the Station Q meeting outline a structured roadmap to reliable quantum computation. The company’s fault-tolerant prototype (FTP), developed as part of the Defense Advanced Research Projects Agency (DARPA) Underexplored Systems for Utility-Scale Quantum Computing (US2QC) program, is expected within years, not decades.

The roadmap includes:

  • A single-qubit tetron device: The foundational building block of the system.
  • A two-qubit system: A demonstration of measurement-based operations.
  • A scalable 4×2 tetron array: The next phase of development, enabling quantum error detection.
  • A 27×13 tetron array for quantum error correction: The foundation of a utility-scale quantum computer.
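
A quick arithmetic pass over the array sizes above shows how the hardware requirement grows between stages (treating one tetron as one qubit, consistent with the eight-qubit 4×2 chip Microsoft describes):

```python
# Qubit counts implied by the roadmap's tetron arrays (simple arithmetic;
# one tetron is counted as one qubit, matching the 4x2 = 8-qubit chip).
stages = {
    "single-qubit device": (1, 1),
    "two-qubit system": (2, 1),
    "4x2 error-detection array": (4, 2),
    "27x13 error-correction array": (27, 13),
}
for name, (rows, cols) in stages.items():
    print(f"{name}: {rows * cols} tetrons")
```

The jump from 8 tetrons for error detection to 351 for error correction illustrates how quickly fault tolerance multiplies hardware requirements on the way to a utility-scale machine.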

Revolutionizing Quantum Control

Microsoft's unique measurement-based quantum computing approach simplifies quantum error correction (QEC) by replacing complex analog control signals with precise digital pulses. This allows for a more practical and scalable system, essential for real-world applications.
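
The following toy Python sketch illustrates the general measurement-based idea rather than Microsoft's actual control stack: a three-qubit repetition code whose parity measurements return discrete 0/1 outcomes, which a lookup-table decoder maps straight to a correction. No analog pulse shaping appears anywhere in the loop.

```python
import random

# Toy digital error correction: the syndrome is a pair of discrete parity
# bits, and the correction is a table lookup -- no analog control needed.

def measure_syndrome(bits):
    """Parity checks between neighboring qubits; each outcome is a digital 0/1."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(syndrome):
    """Lookup-table decoder: syndrome -> index of the qubit to flip (or None)."""
    return {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome]

logical = [0, 0, 0]                 # a logical 0 encoded across three qubits
logical[random.randrange(3)] ^= 1   # inject a single random bit-flip error

fix = decode(measure_syndrome(logical))
if fix is not None:
    logical[fix] ^= 1               # the correction is a purely digital operation
assert logical == [0, 0, 0]         # the logical state is always recovered
```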

The technology’s potential is vast—once scaled, a million-qubit quantum computer could unlock advancements in materials science, chemistry, and sustainable agriculture by accurately simulating quantum processes that classical supercomputers cannot model.
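
That claim about classical supercomputers follows from simple arithmetic: brute-force simulation must track 2^n complex amplitudes for an n-qubit system, as the sketch below shows.

```python
# Memory required to store an n-qubit state vector at double precision
# (16 bytes per complex amplitude) -- the wall that brute-force classical
# simulation hits long before a million qubits.
for n in (30, 40, 50):
    gib = (2 ** n) * 16 / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```

Fifty qubits already demand about 16 million GiB (16 PiB) of memory, and every additional qubit doubles it, so state-vector methods cannot come anywhere near a million qubits.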

DARPA’s Endorsement and Future Implications

DARPA’s selection of Microsoft for the final phase of its US2QC program underscores the viability of this approach. The agency’s rigorous evaluation process has validated Microsoft’s engineering plan and fault-tolerant quantum computing architecture.

Microsoft’s commitment to accelerating quantum computing development aligns with its vision of a utility-scale quantum supercomputer, a machine capable of addressing some of the world’s most complex scientific and industrial challenges.

The Path Forward

Eighteen months after announcing its roadmap, Microsoft has now demonstrated the first topological qubit, with a system already containing eight qubits on a chip designed to scale to one million. With a clear technological path forward and backing from DARPA, Microsoft is well-positioned to drive quantum computing from scientific theory to transformative real-world applications.

Maryland and UMD Unveil $1 Billion "Capital of Quantum" Initiative to Lead Global Quantum Innovation

Maryland is making a bold move to establish itself as a global hub for quantum computing and innovation. Governor Wes Moore, alongside University of Maryland (UMD) President Darryll J. Pines and IonQ CEO Peter Chapman, last month announced the "Capital of Quantum" initiative—a $1 billion public-private investment designed to cement Maryland and the Greater D.C. region as a leader in quantum information science and technology.

A Strategic Investment in Quantum Science

With quantum computing poised to revolutionize industries from national security to healthcare, Maryland is seizing the opportunity to lead. The "Capital of Quantum" initiative will direct funding from state investments, federal grants, private sector contributions, and philanthropic donations toward advancing quantum research, workforce development, and commercialization.

"Quantum has the potential to transform every part of our economy and society," said Governor Moore. "With extraordinary assets and partnerships, Maryland can—and should—lead in this emerging sector. Together, we will make Maryland the quantum capital of the world."

Key funding components include an initial $27.5 million state investment in FY 2026, anticipated to catalyze over $200 million in matching funds from UMD and its partners. Additionally, $244 million has been allocated to the construction of Zupnik Hall, a state-of-the-art research facility with dedicated quantum labs.

Driving Innovation Through Research and Workforce Development

UMD is home to one of the largest concentrations of quantum researchers globally, with over 200 faculty members specializing in the field. The "Capital of Quantum" initiative aims to:

  • Recruit top quantum scientists and engineers from around the world.
  • Expand access to the National Quantum Laboratory (QLab), a collaboration between UMD and IonQ that provides hands-on experience with quantum computing.
  • Establish new testing and evaluation facilities for quantum research and national security applications.
  • Scale up the Quantum Startup Foundry, supporting entrepreneurs and startups bringing quantum technologies to market.
  • Launch educational and workforce training initiatives, including quantum curriculum for high schools, new master’s and certificate programs, and workforce retraining.

"We are deeply grateful to Gov. Moore for his visionary investment in building a brighter future for Maryland's economy," said UMD President Darryll J. Pines. "He recognizes the immense potential of quantum technology and the possibilities we can explore if we work together."

IonQ Expands Quantum Leadership in Maryland

As a key partner in the initiative, IonQ will expand its corporate headquarters in UMD’s Discovery District, developing a 100,000-square-foot facility that includes a data center, laboratories, and office space. IonQ also plans to double its workforce to at least 250 employees within five years.

"Investing in quantum computing is investing in Maryland’s future," said IonQ CEO Peter Chapman. "This initiative supports cutting-edge research and innovation while fostering economic growth and job creation in the state."

Quantum’s Economic and Technological Impact

Quantum technologies harness the principles of quantum mechanics to enable breakthroughs in computing power, secure communications, precision sensing, and materials science. A 2024 McKinsey & Company report estimates that quantum technology could generate up to $2 trillion in economic value by 2035.

Maryland's initiative positions the state as a premier destination for quantum research, talent, and commercialization, ensuring that the region remains at the forefront of this transformative field. As the quantum era unfolds, the "Capital of Quantum" initiative represents a strategic commitment to innovation, economic development, and technological leadership. For more information, visit quantum.umd.edu.

SoftBank Expands Data Center Vision with Quantum Computing Partnership

SoftBank Corp. has taken another step toward reshaping the data center landscape, announcing a strategic partnership with quantum computing leader Quantinuum. The collaboration aims to accelerate commercial adoption of quantum computing, with an eye toward integrating quantum processors into next-generation data centers.

This move aligns with SoftBank’s broader ambitions in advanced computing infrastructure, including its recent investments in AI data centers and homegrown large language models (LLMs) optimized for the Japanese market. As AI workloads grow increasingly complex, SoftBank sees quantum computing as a key enabler for future computing architectures, helping overcome the limitations of classical AI processing.

The Quantum Data Center Vision

At the heart of the partnership is a joint initiative to develop a business model for quantum-enabled data centers. The companies will conduct global market research, beginning in Japan and expanding across the Asia-Pacific region, to evaluate demand and commercialization pathways for hybrid computing environments combining CPUs, GPUs, and Quantum Processing Units (QPUs).
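
Most near-term quantum workloads are expected to follow a hybrid loop: a classical optimizer on CPUs or GPUs iteratively tunes parameters for circuits that execute on a QPU. The Python sketch below shows that structure with a stand-in function where the QPU call would go; the interfaces are hypothetical illustrations, not SoftBank's or Quantinuum's actual APIs.

```python
import math
from typing import Callable

def hybrid_optimize(energy_on_qpu: Callable[[float], float],
                    theta: float = 0.0, lr: float = 0.1, steps: int = 50) -> float:
    """Finite-difference gradient descent; each energy query would be a QPU job."""
    eps = 1e-3
    for _ in range(steps):
        grad = (energy_on_qpu(theta + eps) - energy_on_qpu(theta - eps)) / (2 * eps)
        theta -= lr * grad  # the classical update runs on CPUs/GPUs
    return theta

# Stand-in for the QPU: a one-parameter "energy landscape" with its minimum at 0.7.
best = hybrid_optimize(lambda t: 1 - math.cos(t - 0.7))
print(f"optimal parameter ~= {best:.3f}")
```

This division of labor is why QPUs are envisioned as accelerators sitting alongside CPUs and GPUs in the same facility, rather than as standalone replacements for classical infrastructure.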

Quantum computing is still in its early stages, with significant technical and business challenges ahead. However, SoftBank and Quantinuum are looking beyond theoretical applications, focusing on practical use cases that can deliver commercial value. Among these are quantum chemistry for material discovery—particularly for optical switch materials used in telecommunications—and network analysis applications such as fraud detection and anomaly detection in SoftBank’s communication network.

Overcoming the Barriers to Quantum Adoption

Despite the promise of quantum computing, widespread deployment faces multiple hurdles:

  • High Capital Costs: Quantum computing infrastructure requires substantial investment, with uncertain return-on-investment models. The partnership will explore cost-sharing strategies to mitigate risk.
  • Undefined Revenue Models: Business frameworks for quantum services, including pricing structures and access models, remain in development.
  • Hardware Limitations: Current quantum processors still struggle with error rates and scalability, requiring advancements in error correction and hybrid computing approaches.
  • Software Maturity: Effective algorithms for leveraging quantum computing’s advantages remain an active area of research, particularly in real-world AI and optimization problems.

SoftBank’s strategy includes leveraging its extensive telecom infrastructure and AI expertise to create real-world testing environments for quantum applications. By integrating quantum into existing data center operations, SoftBank aims to position itself at the forefront of the quantum-AI revolution.

A Broader Play in Advanced Computing

SoftBank’s quantum initiative follows a series of high-profile moves into the next generation of computing infrastructure. The company has been investing heavily in AI data centers, aligning with its "Beyond Carrier" strategy that expands its focus beyond telecommunications. Recent efforts include the development of large-scale AI models tailored to Japan and the enhancement of radio access networks (AI-RAN) through AI-driven optimizations.

Internationally, SoftBank has explored data center expansion opportunities beyond Japan, as part of its efforts to support AI, cloud computing, and now quantum applications. The company’s long-term vision suggests that quantum data centers could eventually play a role in supporting AI-driven workloads at scale, offering performance benefits that classical supercomputers cannot achieve.

The Road Ahead

SoftBank and Quantinuum’s collaboration signals growing momentum for quantum computing in enterprise settings. While quantum remains a long-term bet, integrating QPUs into data center infrastructure represents a forward-looking approach that could redefine high-performance computing in the years to come.

With the global demand for AI and high-performance computing on the rise, SoftBank’s commitment to quantum technology underscores its ambition to shape the future of computing. The partnership with Quantinuum could be "another first step" in making quantum data centers a reality, positioning SoftBank as a leader in the next phase of data center evolution.

Xanadu Unveils Aurora: A Scalable, Networked Approach to Quantum Data Centers

In a milestone moment for quantum computing, Xanadu has introduced Aurora, the first modular and networked quantum computer designed for large-scale deployment. This breakthrough advances the vision of quantum data centers, leveraging photonic technology to overcome one of the industry's key challenges—scalability.

Aurora is built on four independent quantum server racks, interconnected through 13 kilometers of fiber optics and utilizing 35 photonic chips. Operating at room temperature, the system eliminates the extreme cooling demands of many quantum platforms, a factor that could significantly streamline future data center integration. More importantly, Aurora’s architecture allows for near-unlimited scaling, potentially expanding to thousands of racks and millions of qubits—an unprecedented leap toward utility-scale quantum computing.

A New Model for Quantum Scalability

Historically, quantum computing has been constrained by both qubit fidelity and the challenge of scaling beyond laboratory prototypes. Xanadu’s photonic approach fundamentally changes the equation by employing a modular networked system that can be expanded using commercially available fabrication techniques.

“Aurora demonstrates that scalability—the biggest challenge in quantum computing—is now within reach,” said Christian Weedbrook, CEO of Xanadu. “With this architecture, we could, in principle, scale up to millions of qubits. Now, our focus turns to performance improvements, particularly in error correction and fault tolerance.”

The system builds upon Xanadu’s previous work with Borealis and X8, integrating error-corrected quantum logic gates and real-time error mitigation strategies. By enabling quantum computations through interconnected modules, Aurora represents a viable blueprint for the first true quantum data centers, a shift that could redefine enterprise computing in the years ahead.

The Path to Utility-Scale Quantum Computing

While Aurora's architecture provides a roadmap to large-scale deployment, further refinements are needed. Optical loss remains a key hurdle, with Xanadu now focusing on optimizing chip design and improving fabrication techniques in partnership with foundries. These improvements will be critical in ensuring fault tolerance, a necessary step for practical quantum applications.
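
The scale of the loss problem is easy to see with the standard fiber attenuation model. The sketch below assumes roughly 0.2 dB/km, typical of telecom fiber at 1550 nm (an illustrative figure, not one published by Xanadu):

```python
# Fraction of photons surviving a fiber run of a given length, using the
# standard attenuation model 10**(-alpha*L/10). The 0.2 dB/km figure is a
# typical telecom value, assumed here for illustration.

def transmission(length_km: float, loss_db_per_km: float = 0.2) -> float:
    return 10 ** (-loss_db_per_km * length_km / 10)

for km in (1, 13):
    print(f"{km} km: {transmission(km):.1%} of photons survive")
```

At these numbers, roughly 45% of photons would be lost across a 13 km run like Aurora's, and in a photonic processor every lost photon is lost quantum information, which is why chip design and fabrication are the focus of the loss-reduction effort.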

For data center operators and cloud providers exploring quantum computing’s role in future infrastructure, Xanadu’s approach signals a potential turning point. The combination of room-temperature operation, scalable modularity, and networked computing could position photonics as a leading architecture for quantum-enabled data centers.

As the race toward fault-tolerant quantum computing accelerates, Aurora marks a significant step forward in bringing quantum systems out of the lab and into real-world enterprise environments.

At Data Center Frontier, we not only talk the industry talk, we walk the industry walk. In that spirit, DCF Staff members may occasionally employ AI tools to assist with content. This article was created with help from DeepSeek as well as OpenAI's GPT-4.

Keep pace with the fast-moving world of data centers and cloud computing by connecting with Data Center Frontier on LinkedIn, following us on X/Twitter and Facebook, as well as on Bluesky, and signing up for our weekly newsletters using the form below.

About the Author

Matt Vincent

A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.

About the Author

DCF Staff

Data Center Frontier charts the future of data centers and cloud computing. We write about what’s next for the Internet, and the innovations that will take us there.
