Microsoft’s Undersea Data Center: Brilliant or Crazy?

Feb. 1, 2016
Are you ready for the cloud beneath the sea? Microsoft has built and deployed a submarine data center, running servers on the ocean floor for three months in a submersible container. The research prototype, dubbed Project Natick, is part of Microsoft’s ongoing quest to find affordable ways to deploy sustainable cloud data centers.

Project Natick represents a radical new approach to deploying data center capacity, one that could enable Microsoft to shift its factory-built modular designs from land to sea. In an era of exciting advances in data center design, Microsoft’s experiment seeks to extend the frontiers of edge computing, bringing cloud capacity closer to the population centers concentrated along coastlines around the world.

“Moving data centers to the ocean made a great amount of sense to be able to make the cable to our customers as short as possible,” said Microsoft Research Engineer Jeff Kramer. “Natick could have a lot of impact, both currently and into the future.”

Microsoft’s experiment continues the data center industry’s decade-long effort to harness the power of the sea to create sustainable data centers, tapping the waves and water to power and cool armadas of cloud servers. It ties together three data center trends we’ve been tracking here at Data Center Frontier – ocean-based facilities, the emergence of edge computing and unmanned data centers.

Significantly, Microsoft kept Project Natick under wraps until it had been successfully deployed, placing it in a different league from previous visions of seagoing data centers from Google and others. This shifts the project beyond the hypothetical. The question is no longer whether it can be done, but the feasibility of the design’s economics, scale and use cases.

“The overall goal here is to deploy data centers at scale, anywhere in the world, from decision to power-on within 90 days,” said Ben Cutler, Microsoft’s Project Manager with the Natick team.

An Undersea Moonshot

Project Natick’s origin story dates to 2013 and a research paper from Microsoft’s Sean James proposing an underwater data center powered by renewable ocean energy. James had served in the Navy for three years.

“What helped me bridge the gap between datacenters and underwater is that I’d seen how you can put sophisticated electronics under water, and keep it shielded from salt water,” said James. “It goes through a very rigorous testing and design process. So I knew there was a way to do that.”

In 2014 Microsoft Research created a team to explore the feasibility of the concept, which led to the creation of a submersible vessel containing a single server rack, dubbed the Leona Philpot (named for a character in the popular Xbox game Halo who broke her neck diving into a pool but became homecoming queen).

The 38,000-pound, 10-foot by 7-foot container was deployed last August 15 in about 30 feet of water off the coast of California. The Microsoft team sealed the servers inside the container and monitored them as the data center operated for 105 days on the ocean floor. It was retrieved in November and trucked back to Microsoft headquarters in Redmond, Washington, for further analysis.

Microsoft describes the initial voyage as “very successful,” with no leaks or hardware failures, allowing the researchers to extend the project and even run live workloads from its Azure cloud.

The Project Natick website didn’t include any details on the power or cooling setup, but the New York Times noted that the Leona Philpot featured “a single data center computing rack that was bathed in pressurized nitrogen to efficiently remove heat from computing chips.”

The Power of the Ocean

Microsoft’s effort is a new take on an old idea: using the sea to power and cool a data center, transforming both the economics and sustainability of cloud computing platforms. The concept dates to 2007, when Google gained a patent for a water-based data center, stirring visions of a fleet of futuristic offshore data havens powered and cooled by the waves. The company has never built the sea-going “Google Navy” described in its patents, but other companies have pursued the idea.

The most recent is Nautilus Data Centers, which has created a floating colocation facility on a barge moored in San Francisco Bay. Nautilus says it has successfully tested a design for a floating data center that can dramatically slash the cost of running IT operations.

The primary advantage of a maritime data center is the ability to slash costs by using water to power or cool the data center, and avoiding the expense of real estate and property taxes. These ideas build on previous seagoing IT operations – both the U.S. Navy and major cruise lines have maintained sophisticated telecom and IT infrastructure for decades – and add power and cooling technologies that can slash costs.

Harnessing Wind and Waves

Microsoft’s Project Natick used grid power for its prototype, but the company is working on next-generation designs featuring larger containers that could be powered by turbines driven by waves or tides.

The Project Natick enclosure is lowered over a rack of Microsoft Azure cloud servers. (Photo: Microsoft Corp.)

The company also sees the submarine data center project as an opportunity to rethink many of the form factors traditionally used for servers and storage, which must account for the need for humans to access the equipment to replace components or refresh servers. Operating in fully unmanned “lights out” mode allows new approaches that need only account for heat removal, rather than human access.

Microsoft says the Natick containers are designed to operate as unmanned units, submerged for up to five years at a time. An interesting wrinkle: the company believes it may be able to go those five years in production without refreshing its servers, while most hyperscale providers refresh their servers and processors every three years (as noted recently by research from Coolan).

“With the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly,” the Project Natick team said in its FAQ. “We see this as an opportunity to field long-lived, resilient datacenters that operate ‘lights out’ – nobody on site – with very high reliability for the entire life of the deployment, possibly as long as 10 years.”

The Road Ahead

It’s easy to be skeptical about the potential for underwater data centers. There are many potential obstacles, including the long-term corrosive impact of sea water on the components.

But Microsoft now has data and a working prototype, as well as a history of converting these “moonshot” ideas into megascale data center operations. One of Sean James’ previous brainstorms plays a central role in this history.

In 2008, James and Microsoft colleague Christian Belady ran a single rack of servers in a tent in the fuel yard of one of the company’s data centers for seven months, with no equipment failures. The experiment proved that servers were hardier than believed, clearing the way for Microsoft to envision running containerized servers outdoors.

Microsoft has since deployed tens of thousands of servers – and perhaps hundreds of thousands – in modular units known as IT PACs, which sit outdoors on concrete pads at Microsoft data center campuses in Washington state, Iowa and southern Virginia.

Belady now heads Microsoft’s data center operations, and has advanced a number of innovative projects combining portable computing and renewable energy, like data plants and landfill-powered containers.

“The reality is that we always need to be pushing limits and try things out,” said Belady. “The learnings we get from this are invaluable and will in some way manifest into future designs.”

Microsoft says it’s still “early days” in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.

“This is speculative technology, in the sense that if it turns out to be a good idea, it will instantly change the economics of this business,” says Norm Whitaker, who heads special projects for Microsoft Research NExT. “There are lots of moving parts, lots of planning that goes into this. This is more a tool that we can make available to data center partners. In a difficult situation, they could turn to this and use it.”

“It’s not a moonshot in the sense that it’s just this outlandish thing,” said Spencer Fowers, a researcher on the Natick team. “It’s actually a viable product that we could make.”

The Microsoft Project Natick team with the Leona Philpot underwater data center prototype. (Photo: Microsoft)

Is this genius or madness? What’s your opinion? Share your thoughts in the comment section.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
