The world’s boldest edge computing project has taken a major step forward. Microsoft revealed today that it has deployed the second phase of Project Natick, its undersea data center prototype, in the waters off Scotland.
Microsoft’s 40-foot container houses 864 servers in 12 racks, resting on the seabed 117 feet below the ocean’s surface off the coast of the Orkney Islands, a chain of islands in northernmost Scotland. The data center is unmanned and powered by renewable energy through a cable that runs several hundred yards to land, which also provides network connectivity.
Project Natick represents a radical new approach to deploying data center capacity, one that could enable Microsoft to shift its factory-built modular designs from land to sea. In an era of rapid advances in data center design, Microsoft’s experiment seeks to extend the frontiers of edge computing, bringing cloud capacity closer to the population centers concentrated along coastlines around the world.
“Our vision is to be able to deploy compute rapidly anywhere on the planet as needed by our customers,” said Christian Belady, general manager of cloud infrastructure for Microsoft.
It’s also an indicator that the future of cloud computing could look a lot different from its present, with an array of designs and deployment models from the core of the network to the edge.
The deployment in Scotland demonstrates that Microsoft is serious about deploying underwater data centers as part of its strategy for edge computing, which brings data and applications closer to end users. When Microsoft deployed its initial Project Natick module in 2015 off the coast of California, many in the data center industry dismissed it as a science experiment – an impressive accomplishment, but not a candidate for production workloads at scale.
A New Wave of Data Centers?
The Scotland deployment indicates Microsoft is ready to invest in the concept. It’s a larger implementation than the initial undersea deployment, and the next step in a development process the company outlined last year, which could culminate in pods of modules being aggregated to create a server farm of 20 megawatts or more.
The new Project Natick module draws about 250 kW of power, slightly more than the terrestrial modules rolled out by several edge computing startups. Microsoft says the Orkney module will run for a year and may handle customer workloads.
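For context, a quick back-of-envelope calculation from the figures Microsoft has shared (250 kW, 12 racks, 864 servers) gives a sense of the module’s power density. This is our own rough estimate, assuming the load is spread evenly across racks and servers, which is a simplification:

```python
# Rough power-density sketch using the published Project Natick figures.
# Assumes an even split of power across racks and servers; actual
# per-rack loads will vary.
total_power_kw = 250   # reported module power draw
racks = 12             # reported rack count
servers = 864          # reported server count

kw_per_rack = total_power_kw / racks                  # ~20.8 kW per rack
watts_per_server = total_power_kw * 1000 / servers    # ~289 W per server

print(f"{kw_per_rack:.1f} kW per rack, {watts_per_server:.0f} W per server")
```

That works out to roughly 21 kW per rack, a density comparable to the enclosed, high-density modules used in terrestrial edge deployments.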
“Like any new car, we will kick the tires and run the engine at different speeds to make sure everything works well,” said Spencer Fowers, a senior member of technical staff for Microsoft’s special projects research group. “Then, once we are completely ready to go, we will grab one or two of our clients and hand them over the keys and let them start deploying jobs onto our system.”
Microsoft says the concept “could herald a new wave of data centers that can be deployed rapidly and inexpensively while increasing data speeds along coastal regions.” The company notes that more than half of the world’s population lives within about 120 miles of the coast.
“For true delivery of AI, we are really cloud dependent today,” said Peter Lee, corporate vice president of Microsoft AI and Research. “If we can be within one internet hop of everyone, then it not only benefits our products, but also the products our customers serve.”
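To illustrate why coastal proximity matters for latency, here is a rough estimate of our own (not a Microsoft figure), using the standard rule of thumb that light travels through optical fiber at about 200 km per millisecond:

```python
# Back-of-envelope latency estimate for a user ~120 miles from a coastal
# data center. Real-world latency would add routing hops and processing
# time on top of this propagation delay.
distance_km = 120 * 1.609        # ~193 km
fiber_speed_km_per_ms = 200      # approx. speed of light in fiber

one_way_ms = distance_km / fiber_speed_km_per_ms
round_trip_ms = 2 * one_way_ms

print(f"one-way: {one_way_ms:.2f} ms, round trip: {round_trip_ms:.2f} ms")
```

On those assumptions, a user 120 miles from shore sees on the order of 2 milliseconds of round-trip propagation delay, which is why a data center parked just offshore can plausibly sit “one internet hop” from much of the world’s population.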
Microsoft’s experiment continues the data center industry’s decade-long effort to harness the power of the sea to create sustainable data centers, tapping the waves and water to power and cool armadas of cloud servers. It ties together three data center trends we’ve been tracking here at Data Center Frontier – ocean-based facilities, the emergence of edge computing and unmanned data centers.
Submarine Expertise On Board
For the second phase, Microsoft worked with the French company Naval Group, a specialist in submarine and maritime engineering. A key change from the prototype was the cooling system: Naval Group adapted a heat-exchange process commonly used for cooling submarines, piping seawater directly through radiators on the back of each of the 12 server racks and back out into the ocean.
One design specification was that the vessel should have the same dimensions as a standard cargo container used to move supplies on ships, trains and trucks. The data center was assembled and tested in France, then shipped on a flatbed truck to Scotland, where it was attached to a ballast-filled triangular base for deployment on the seabed.
The Natick module was towed out to sea partially submerged and cradled by winches and cranes between the pontoons of an industrial catamaran-like gantry barge. At the deployment site, a remotely operated vehicle retrieved a cable containing the fiber optic and power wiring from the seafloor and brought it to the surface where it was checked, attached to the data center, and powered on.
The site was selected for its ample supply of renewable energy, including solar, wind and wave power. Orkney is home to the European Marine Energy Centre, a test site for tidal turbines and wave energy converters that generate electricity from the movement of seawater. Tidal currents there travel up to nine miles per hour at peak intensity, and the sea surface regularly roils with 10-foot waves that whip up to more than 60 feet in stormy conditions.
While the sustainability profile is excellent, Microsoft is perhaps more focused on the potential business benefits of Project Natick. These include:
- Improved speed and agility, with the ability to deploy within 90 days.
- A lower total cost of ownership (TCO) than land-based data centers.
- Low-latency delivery of cloud services to the large populations along coastlines.
- Better physical security than land-based data centers.
Another benefit is the ability to operate unmanned in a “lights out” environment, as Microsoft already does with some of its IT-PAC data center modules.
Project Natick’s Unlikely Journey
Project Natick’s origin story dates to 2013 and a research paper from Microsoft’s Sean James proposing an underwater data center powered by renewable ocean energy. James had served in the Navy for three years.
In 2014, Microsoft Research created a team to explore the feasibility of the concept, which led to the creation of a submersible vessel housing a single server rack. Microsoft’s effort is a new take on an old idea: using the sea to power and cool a data center, transforming both the economics and sustainability of cloud computing platforms. The concept dates to 2007, when Google filed for a patent on a water-based data center, stirring visions of a fleet of futuristic offshore data havens powered and cooled by the waves. The company has never built the sea-going “Google Navy” described in its patents, but other companies have pursued the idea.
The most recent is Nautilus Data Centers, which has created a floating colocation facility on a barge moored at Vallejo, California. Nautilus says it has successfully tested a design for a floating data center that can dramatically slash the cost of running IT operations.
The primary advantage of a maritime data center is the ability to slash costs by using water to power or cool the facility while avoiding the expense of real estate and property taxes. These ideas build on previous seagoing IT operations – both the U.S. Navy and major cruise lines have maintained sophisticated telecom and IT infrastructure for decades – and add power and cooling technologies that can cut costs further.
Microsoft says the Natick containers are designed to operate as unmanned units submerged for up to five years at a time. An interesting wrinkle is that the company believes it may be able to go five years in production without refreshing its servers. Most hyperscale providers refresh their servers and processors every three years.
Explore the evolving world of edge computing further through Data Center Frontier’s special report series and ongoing coverage.