
Microsoft’s Undersea Data Center: Brilliant or Crazy?

By Rich Miller - February 1, 2016

Microsoft's experimental underwater data center, the Leona Philpot, is lowered into the ocean in August 2015. (Photo: Microsoft Corp.)


Are you ready for the cloud beneath the sea? Microsoft has built and deployed a submarine data center, running servers on the ocean floor for three months in a submersible container. The research prototype, dubbed Project Natick, is part of Microsoft’s ongoing quest to find affordable ways to deploy sustainable cloud data centers.

Project Natick represents a radical new approach to deploying data center capacity, which could enable Microsoft to shift its factory-built modular designs from earth to sea. In an era of exciting advances in data center design, Microsoft’s experiment seeks to extend the frontiers of edge computing, bringing cloud capacity closer to population centers concentrated along coastlines around the world.

“Moving data centers to the ocean made a great amount of sense to be able to make the cable to our customers as short as possible,” said Microsoft Research Engineer Jeff Kramer. “Natick could have a lot of impact, both currently and into the future.”

Microsoft’s experiment continues the data center industry’s decade-long effort to harness the power of the sea to create sustainable data centers, tapping the waves and water to power and cool armadas of cloud servers. It ties together three data center trends we’ve been tracking here at Data Center Frontier – ocean-based facilities, the emergence of edge computing and unmanned data centers.

Significantly, Microsoft kept Project Natick under wraps until it had been successfully deployed, placing it in a different league from previous visions of seagoing data centers from Google and others. This shifts the project beyond the hypothetical: the question is no longer whether it can be done, but whether the design’s economics, scale and use cases make sense.

“The overall goal here is to deploy data centers at scale, anywhere in the world, from decision to power-on within 90 days,” said Ben Cutler, Microsoft’s Project Manager with the Natick team.

An Undersea Moonshot

Project Natick’s origin story dates to 2013, when Microsoft’s Sean James, who had served in the Navy for three years, wrote a research paper proposing an underwater data center powered by renewable ocean energy.

“What helped me bridge the gap between datacenters and underwater is that I’d seen how you can put sophisticated electronics under water, and keep it shielded from salt water,” said James. “It goes through a very rigorous testing and design process. So I knew there was a way to do that.”

In 2014, Microsoft Research created a team to explore the feasibility of the concept, which led to the creation of a submersible container housing a single rack, dubbed the Leona Philpot (named for a character in the popular Xbox game Halo, who broke her neck diving into a pool but went on to become homecoming queen).

The 38,000-pound, 10-foot by 7-foot container was deployed on August 15 of last year in about 30 feet of water off the coast of California. The Microsoft team sealed the servers inside the container and monitored them as the data center operated for 105 days on the ocean floor. It was retrieved in November and trucked back to Microsoft headquarters in Redmond, Washington for further analysis.

Microsoft describes the initial voyage as “very successful,” with no leaks or hardware failures, allowing the researchers to extend the project and even run live workloads from its Azure cloud.

The Project Natick website didn’t include any details on the power or cooling setup, but the New York Times noted that the Leona Philpot featured “a single data center computing rack that was bathed in pressurized nitrogen to efficiently remove heat from computing chips.”

The Power of the Ocean

Microsoft’s effort is a new take on an old idea: using the sea to power and cool a data center, transforming both the economics and sustainability of cloud computing platforms. The concept dates to 2007, when Google filed for a patent on a water-based data center, stirring visions of a fleet of futuristic offshore data havens powered and cooled by the waves. The company has never built the seagoing “Google Navy” described in its patents, but other companies have pursued the idea.

The most recent is Nautilus Data Centers, which has created a floating colocation facility on a barge moored in San Francisco Bay. Nautilus says it has successfully tested a design for a floating data center that can dramatically slash the cost of running IT operations.

The primary advantage of a maritime data center is the ability to slash costs by using water to power or cool the facility, while avoiding the expense of real estate and property taxes. These ideas build on previous seagoing IT operations – both the U.S. Navy and major cruise lines have maintained sophisticated telecom and IT infrastructure for decades – and add power and cooling technologies that sharply reduce operating expenses.
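
To see why water is such an attractive coolant, consider a rough back-of-the-envelope sketch (the load and temperature figures below are illustrative assumptions, not Project Natick specifications). The heat balance Q = ṁ · c_p · ΔT implies that a modest flow of seawater can absorb an entire container’s heat output:

```python
# Back-of-the-envelope: seawater flow needed to absorb an IT load.
# All figures are illustrative assumptions, not Project Natick specs.

CP_SEAWATER = 3990.0   # specific heat of seawater, J/(kg*K) (approx.)
RHO_SEAWATER = 1025.0  # seawater density, kg/m^3 (approx.)

def coolant_flow_kg_s(it_load_w: float, delta_t_k: float) -> float:
    """Mass flow (kg/s) that absorbs it_load_w watts with a
    delta_t_k rise in water temperature: Q = m_dot * c_p * dT."""
    return it_load_w / (CP_SEAWATER * delta_t_k)

# Hypothetical 100 kW container with a 5 K allowable temperature rise:
m_dot = coolant_flow_kg_s(100_000.0, 5.0)    # ~5.0 kg/s
flow_l_s = m_dot / RHO_SEAWATER * 1000.0     # ~4.9 liters per second
print(f"Required flow: {m_dot:.1f} kg/s ({flow_l_s:.1f} L/s)")
```

Unlike a chiller plant on land, the cold source here is effectively unlimited and free – which is the economic bet behind both Nautilus and Natick.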

Harnessing Wind and Waves

Microsoft’s Project Natick used grid power for its prototype, but the company is working on next-generation designs featuring larger containers that could be powered by turbines driven by waves or tides.

The Project Natick enclosure is lowered over a rack of Microsoft Azure cloud servers. (Photo: Microsoft Corp.)

The company also sees the submarine data center project as an opportunity to rethink many of the form factors traditionally used for servers and storage, which must account for the need for humans to access the equipment to replace components or refresh servers. Operating in fully unmanned “lights out” mode allows new approaches that need to account only for heat removal, rather than access.

Microsoft says the Natick containers are designed to operate as unmanned units, submerged for up to five years at a time. An interesting wrinkle is that the company believes it may be able to go five years in production without refreshing its servers. Most hyperscale providers refresh their servers and processors every three years (as noted recently in research from Coolan).

“With the end of Moore’s Law, the cadence at which servers are refreshed with new and improved hardware in the datacenter is likely to slow significantly,” the Project Natick team said in its FAQ. “We see this as an opportunity to field long-lived, resilient datacenters that operate “lights out” – nobody on site – with very high reliability for the entire life of the deployment, possibly as long as 10 years.”
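
The refresh arithmetic is simple enough to sketch. In the minimal example below (the three- and five-year cadences and the ten-year lifetime come from the article; everything else is illustrative), stretching the refresh cadence cuts the number of hardware swaps from three to one over a ten-year deployment:

```python
import math

# Sketch: hardware refresh events over a deployment's lifetime.
# The 3- vs. 5-year cadences and the 10-year life come from the
# article; the function itself is just calendar arithmetic.

def refresh_count(lifetime_years: int, cadence_years: int) -> int:
    """Number of refreshes needed after the initial install."""
    return math.ceil(lifetime_years / cadence_years) - 1

LIFETIME_YEARS = 10  # "possibly as long as 10 years", per the Natick FAQ

for cadence in (3, 5):
    n = refresh_count(LIFETIME_YEARS, cadence)
    print(f"{cadence}-year cadence: {n} refresh(es) over {LIFETIME_YEARS} years")
# -> 3-year cadence: 3 refresh(es); 5-year cadence: 1 refresh(es).
```

Halving the refresh count matters doubly for a sealed, submerged container, since each refresh means hauling the unit back to the surface.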

The Road Ahead

It’s easy to be skeptical about the potential for underwater data centers. There are many potential obstacles, including the long-term corrosive impact of sea water on the components.

But Microsoft now has data and a working prototype, as well as a history of converting these “moonshot” ideas into megascale data center operations. One of Sean James’ previous brainstorms plays a central role in this history.

In 2008, James and Microsoft colleague Christian Belady ran a single rack of servers in a tent in the fuel yard of one of the company’s data centers for seven months, with no equipment failures. The experiment proved that servers were hardier than believed, clearing the way for Microsoft to envision running containerized servers outdoors.

Microsoft has since deployed tens of thousands of servers – and perhaps hundreds of thousands – in modular units known as IT-PACs that sit outdoors on cement pads at Microsoft data center campuses in Washington state, Iowa and southern Virginia.

Belady now heads Microsoft’s data center operations, and has advanced a number of innovative projects combining portable computing and renewable energy, like data plants and landfill-powered containers.

“The reality is that we always need to be pushing limits and try things out,” said Belady. “The learnings we get from this are invaluable and will in some way manifest into future designs.”

Microsoft says it’s still “early days” in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.

“This is speculative technology, in the sense that if it turns out to be a good idea, it will instantly change the economics of this business,” said Norm Whitaker, who heads special projects for Microsoft Research NExT. “There are lots of moving parts, lots of planning that goes into this. This is more a tool that we can make available to data center partners. In a difficult situation, they could turn to this and use it.”

“It’s not a moonshot in the sense that it’s just this outlandish thing,” said Spencer Fowers, a researcher on the Natick team. “It’s actually a viable product that we could make.”

The Microsoft Project Natick team with the Leona Philpot underwater data center prototype. (Photo: Microsoft)

Is this genius or madness? What’s your opinion? Share your thoughts in the comment section.

About Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.

Comments

  1. Jonathan Evans says

    February 2, 2016 at 2:35 am

    I was trying to think of the reasons for this development… it wouldn’t flood (in the traditional way!) and it might survive an earthquake or a nuclear war!

  2. Radek Sobczynski says

    February 3, 2016 at 12:06 pm

    FYI:

    Underwater data centers for Gigabit/s networks

    Submitted to Mozilla Ignite on August 12, 2012 – six months earlier than Microsoft’s claim.

    The problem

    New Gb/s infrastructure will be power hungry, and renewable sources of electricity often lack connections to the grid.

    The solution

    The value of the information delivered by fiber optics is higher than that of the electrical current itself. Electrical energy from a renewable source of any size, instead of being transported over high-voltage lines, can be used in situ by data centers powering Gb/s networks. This concept applies to existing land-based installations (solar parks) as well as offshore windmills or wave generators in coastal waters. Underwater data centers powered by maritime energy sources will be less susceptible to extreme weather conditions, and underwater placement solves the problem of cooling the server blades.

    How will your idea make people’s lives better?

    Gigawatts of energy will be saved by eliminating losses in the electrical grid. The idea would help reduce a meaningful fraction of the roughly one cubic mile of oil burned per year for energy generation (plus a few more cubic-mile-of-oil equivalents from other fossil fuels such as coal and natural gas). “Data centers have developed into major energy hogs. The world’s data centers are estimated to consume power equivalent to about seventeen 1,000 MW power plants, equaling more than 1% of total world electricity consumption, and to emit as much carbon dioxide as all of Argentina.” (Ref: J. M. Kaplan, W. Forrest, and N. Kindler, “Revolutionizing Data Center Energy Efficiency,” technical report, McKinsey & Company, July 2008.)

    How does your idea take advantage of next-generation networks?

    Since about 44 percent of the world’s population lives within 100 miles of the sea (source: http://coastalchallenges.com), Gb/s networks connected to underwater data centers will benefit from the electrical energy available from maritime renewable sources.
