Microsoft CEO Nadella: Underwater Data Centers Are the Future

Nov. 2, 2018

Microsoft CEO Satya Nadella says the company’s underwater data center, known as Project Natick, could play a key role in Microsoft’s future infrastructure. Speaking at the Microsoft Future Decoded event in London, Nadella said that underwater infrastructure can bring Microsoft’s Azure cloud and edge technology to large numbers of future users.

“Since 50 percent of the world’s population lives close to bodies of water, we think this is the way we want to think about the future of data center regions and expansion,” said Nadella.

Project Natick represents a radical new approach to deploying data center capacity, which could enable Microsoft to shift its factory-built modular designs from earth to sea. When Microsoft deployed its initial Project Natick module in 2015 off the coast of California, many in the data center industry dismissed it as a science experiment – an impressive accomplishment, but not a candidate for production workloads at scale.

But Microsoft’s second-phase deployment in Scotland, which was disclosed in June, demonstrates that the company is serious about deploying underwater data centers as part of its strategy for edge computing, which brings data and applications closer to end users.

Microsoft’s 40-foot container houses 864 servers in 12 racks, resting on a seabed 117 feet below the ocean’s surface off the coast of the Orkney Islands, a chain of islands in northernmost Scotland. The data center is unmanned and powered by renewable energy through a cable that runs several hundred yards to land; the cable also provides network connectivity.

“It’s unique,” Nadella said of Project Natick. “It’s underwater, it’s self-contained, and it’s sustainable. It can take wind power, and it is very fast to build – from start to finish it was 90 days.”

The vote of confidence from Nadella suggests Microsoft is ready to invest in the concept. The project in Scotland is a larger implementation than the initial undersea deployment, and the next step in a development process the company outlined last year, which could culminate in pods of modules being aggregated to create a server farm of up to 20 megawatts or more.

Why the Edge Matters So Much to Microsoft

Microsoft’s experiment seeks to extend the frontiers of edge computing. In his presentation at Future Decoded, Nadella was clear that edge computing is a huge component of Microsoft’s infrastructure strategy.

“The most interesting thing that’s happening in the cloud is on the edge,” said Nadella. “As you put more computing power in the world, you need it to go where data is being generated.”

Microsoft envisions a future in which technology infuses almost every area of modern life, a trend the CEO called “tech intensity.”

“Every part of our economy is being digitized,” Nadella said, noting that computing is “deeply embedded in our life.” There are 9 billion microcontrollers sold every year, he said, each of which constitutes a compute node in a device.

Nadella said Microsoft is building its Azure cloud platform as a “world computer,” with cloud applications and software extensions to allow customers to build distributed Internet of Things applications that can easily connect with their cloud workloads and corporate IT infrastructure.

As it competes with Amazon Web Services and Google Cloud Platform in the cloud computing arena, Microsoft clearly sees its geographic diversity as a competitive differentiator. In his comments at Future Decoded, Nadella emphasized that Microsoft’s 54 regions around the globe were the most of any cloud platform, making its services available in 140 countries.

The Project Natick design could, in theory, enable Microsoft to quickly deploy edge computing capacity near major population centers, which often developed along transportation routes, including shorelines and maritime ports.

Thinking Differently About Cooling

Microsoft’s experiment continues the data center industry’s decade-long effort to harness the power of the sea to create sustainable data centers, tapping the waves and water to power and cool armadas of cloud servers. It ties together three data center trends we’ve been tracking here at Data Center Frontier – ocean-based facilities, the emergence of edge computing and unmanned data centers.

Engineers slide racks of Microsoft servers and associated cooling system infrastructure into Project Natick’s Northern Isles data center at a Naval Group facility in Brest, France. The data center has about the same dimensions as a 40-foot long ISO shipping container seen on ships, trains and trucks. (Photo: Frank Betermin for Microsoft)

On the Scotland deployment, Microsoft worked with French company Naval Group, a specialist in submarine and maritime engineering, to adapt a heat-exchange process commonly used for cooling submarines, piping seawater directly through the radiators on the back of each of the 12 server racks and back out into the ocean. The data center was assembled and tested in France and shipped on a flatbed truck to Scotland, where it was attached to a ballast-filled triangular base for deployment on the seabed.

The new Project Natick module uses about 250 kW of power, slightly more than terrestrial edge computing modules rolled out by several edge computing startups. Microsoft says the Orkney module will run for a year, and may handle customer workloads.

Brilliant or Crazy?

What does the rest of the data center industry think of underwater data centers and their viability? We got a sampling of opinions in our recent Executive Roundtable discussion, when we asked panelists whether this type of design innovation was brilliant or crazy. Here are some of their responses:

Samir Shah, BaseLayer: “By deploying a modular and scalable data center underwater, Microsoft is not only utilizing deep-water cooling, but also solving the problem of finding space close to the end users. It’s a brilliant idea that solves space, cooling, latency, and sustainability problems of present and future. … From our view, Microsoft’s project pushes the limits of how we build, deliver, and maintain our critical data center infrastructure. In this regard, Microsoft should be applauded for their willingness to ‘push the envelope’ as it relates to enterprise class data centers.”

Erich Sanchack, Digital Realty: “Microsoft is an extremely innovative company moving the data center market’s knowledge forward rapidly with prototypes like this. While we do see that there are risks associated with underwater data centers from a widespread, commercially-scalable perspective, we do believe there will be more shaping of the definition of the operating requirements as new cooling options are presented to the market. In terms of what they are able to learn, and share, regarding advanced power, cooling, structural considerations and other key data center issues, it’s a very bold and laudable endeavor and something we are eager to learn more about as it progresses.”

Jack Pouchet, Vertiv: “It’s brilliant for sure, albeit less than practical for most businesses. As a big fan of innovation, this does help push the design envelope for data centers. I do wonder if the next step is to just bury the data center in the ground. In most of the world, once you are 6 to 10 feet below the surface, the ground temperature is a stable 50 to 55°F. At least with an underground facility you can easily provide access shafts and tunnels. I’m not talking a cave here. There are plenty of those. I am suggesting you bury the entire container, leaving just an access shaft for IT / Facilities personnel.”

For the full discussion, see Underwater Data Centers: Pushing the Limits of Design Innovation?

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
