
Data Center Frontier

Charting the future of data centers and cloud computing.


Nautilus Data Barges Ahead With Floating Data Center

By Rich Miller - April 11, 2017


An illustration of the Nautilus Data Technologies data barge. (Image: Nautilus)


VALLEJO, Calif. – As he surveys the open area before him, more than 230 feet long and 55 feet wide, Arnold Magcale sees the future home of an innovative data center. For the moment, it’s the deck of a barge.

Magcale is the founder and Chief Technology Officer of Nautilus Data Technologies, which aims to tap rivers, lakes and oceans to slash the cost of cooling servers. Here at Mare Island, a former U.S. Navy shipyard, is where the transformation from barge to data center will begin.

Nautilus says it has lined up customers and intends to bring its initial facility online late this year. The company says its concept for a 6 megawatt floating data center will be cheaper and more efficient than traditional land-based facilities.

Nautilus is the latest developer to pursue a water-based data center, a novel concept that has made headlines since Google first floated the idea in a 2007 patent filing, and was spotlighted by Microsoft’s test of an undersea data center last year.

Industry experts have debated whether a water-based data center is sheer brilliance or total madness. The primary benefit is the ability to slash costs by using water to cool the IT equipment. But there are drawbacks, and many customers may be reluctant to place expensive IT equipment on the water.

Water Cooling Drives Design

The Nautilus team says it is more convinced than ever that this approach can lead to cheaper, more efficient data centers. But the company’s path to market has been proceeding at a barge-like pace.

It’s been nearly two years since our last update on Nautilus Data and its efforts to bring its concept to market. At the time, the company was positioning its technology as an ideal solution for California’s water crisis, and predicted it would bring a data center online by the end of 2015.

“Because this is new, we wanted to take our time,” said Magcale, who said the company has required additional time to complete maritime engineering.

California’s drought has now officially ended, and the new projected delivery date is December 2017. Much has happened in that time, including Microsoft’s revelation of a working prototype of an underwater data center, running servers 30 feet beneath the ocean’s surface.

The Nautilus team says Microsoft’s Project Natick validated its core thesis of tapping a body of water for data center cooling. By using a rear door cooling unit, the company says it can support unusual density, allowing customers to get more data center per square foot of real estate.


The Nautilus Design

The focal point of the Nautilus development effort is Mare Island, a peninsula between the Napa River and San Pablo Bay. The U.S. Navy once built and repaired nuclear submarines at this huge base, which employed 40,000 workers during World War II. The Navy shipyard was shuttered in 1996, but the area is being redeveloped as a business park.

Today Mare Island is home to the Eli M, the barge that will serve as the first Nautilus data center. It resides at a shipyard operated by Lind Marine, a barge specialist that operates a full dry dock where the data barge will be assembled. Not far away is a facility that manufactures pre-fabricated structures, which will build the Nautilus data center modules. Large cranes at the Lind dry dock will hoist them onto the vessel to create the “data deck.”

A row of cabinets using the ColdLogik rear door heat exchanger that Nautilus Data Technologies will use on its barge-based floating data center. (Image: Nautilus Data)

Mare Island was also home to the company’s proof-of-concept, which was built in 2015 on a 100-foot ship. The five-rack project featured equipment from the U.S. Navy, Applied Materials and Veolia, and was validated by Jacobs Engineering and Critical Engineering Group. Racks of IT equipment were cooled by rear-door chilled water units from U.K. vendor ColdLogik, which can support workloads of up to 36 kilowatts (kW) per rack.

In the Nautilus multi-tenant design, modular data halls on the deck will house cabinets of servers. Mechanical and electrical equipment, including UPS units and cooling distribution units, are located below deck in a water-tight hold.

The cooling system features two separate piping loops and a heat exchanger. Cool water from the river will enter through an intake several feet below the barge. The water will be filtered to remove fish and any other contaminants, and then move to the heat exchanger. A fresh water cooling loop on the other side of the heat exchanger feeds the rear-door cooling systems on the racks.

“We’re a heat sink,” said Magcale. “We’re sitting on top of that body of water. We evaporate no water, and emit no greenhouse gases.”

Targeting High-Density Workloads

Magcale believes the ability to cool high-density racks will be a strong selling point for the Nautilus solution, as well as the ability to place a high-density 30kW rack next to a traditional enterprise 5kW rack. Having proven the solution at 32kW a rack, Magcale says his team is now testing it with workloads up to 75kW a rack, and perhaps higher.
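The space argument behind those density numbers is simple arithmetic. As a back-of-the-envelope sketch (not a Nautilus figure), dividing the 6 megawatt facility capacity cited above by per-rack density shows how sharply the required rack count falls:

```python
# Back-of-the-envelope: how many racks it takes to exhaust a 6 MW
# IT load at the power densities discussed in the article
# (5 kW enterprise, 30 kW high-density, 75 kW under test).
FACILITY_IT_LOAD_KW = 6000  # 6 MW facility capacity from the article

for density_kw in (5, 30, 75):
    racks = FACILITY_IT_LOAD_KW / density_kw
    print(f"{density_kw:>2} kW/rack -> {racks:,.0f} racks")
```

At 30 kW per rack, the same IT load fits in one-sixth the number of racks a 5 kW enterprise design would need, which is the "more data center per square foot" claim in concrete terms.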

“The buzzword now is Internet of Things and the Internet of Everything,” he said. “People’s behaviors are driving this trend. That will require a lot of compute, data crunching and storage. That’s when you see the density kick in.”

By eliminating chillers and cooling towers, Nautilus can reduce its capital expenditures, which translates into customer savings of up to 30 percent versus a traditional data center, Nautilus says. In testing, the system operated with a Power Usage Effectiveness (PUE) of about 1.15.
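PUE is total facility power divided by IT power, so a PUE of 1.15 implies only 15 percent overhead for cooling and power distribution. A quick illustrative comparison (the 1.5 baseline is an assumed typical value for a conventional facility, not a figure from the article):

```python
def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Non-IT load (cooling, power distribution) implied by a PUE.
    PUE = total power / IT power, so overhead = IT * (PUE - 1)."""
    return it_load_kw * (pue - 1)

IT_LOAD_KW = 6000  # facility capacity from the article

print(overhead_kw(IT_LOAD_KW, 1.15))  # ~900 kW at the tested PUE
print(overhead_kw(IT_LOAD_KW, 1.50))  # ~3,000 kW at an assumed typical PUE
```

Under those assumptions, the water-cooled design sheds roughly two megawatts of overhead at full load, which is where the claimed cost advantage comes from.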

The cooling loops interact with a heat exchanger from Sondex Holdings, and use pumps and the Venturi effect to boost flow and create a negative pressure within the piping, which will reduce the risk of damage to IT equipment from leaks in overhead pipes or hoses. Nautilus has worked with a fabricator to create skid-mounted cooling units in 500kW, one megawatt and two megawatt capacities, according to Byron Taylor, Director of Mission Critical Operations at Nautilus.

An illustration of the two-deck configuration of the Nautilus Data Technologies design, which places modular data centers on a floating barge. (Image: Nautilus)

The water intake system uses copper plating and titanium piping to limit the impact of salt water on equipment. Nautilus says it has worked closely with the Navy to address the humidity and condensation issues that arise on a floating vessel. The company has also customized its chemical treatment of water. “You don’t want barnacles growing inside your heat exchanger,” said Taylor.

The water intake is located at one end of the boat, with the discharge at the other end to avoid thermal mixing. State regulations require that the discharge temperature be less than four degrees warmer than the intake, and Taylor said Nautilus has met that standard with room to spare. Even so, Taylor said that working with the state environmental agency to ensure water quality “has occupied a lot of time and focus.”
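The four-degree discharge limit also sets a floor on how much river water the system must move. A simplified heat-balance sketch, assuming the full 6 MW IT load is rejected to the water at exactly the four-degree limit and ignoring pump heat:

```python
# Simplified heat balance: Q = m_dot * cp * dT, solved for flow.
# Assumes all 6 MW of IT load ends up in the river water and the
# discharge runs right at the four-degree regulatory limit.
HEAT_REJECTED_W = 6_000_000   # 6 MW facility capacity from the article
CP_WATER = 4186               # J/(kg*K), specific heat of water
MAX_DELTA_T = 4.0             # K, the state discharge limit

flow_kg_per_s = HEAT_REJECTED_W / (CP_WATER * MAX_DELTA_T)
print(f"minimum water flow: {flow_kg_per_s:.0f} kg/s (~{flow_kg_per_s:.0f} L/s)")
```

Under those assumptions the intake must move on the order of 360 liters per second at full load; running a smaller temperature rise, as Taylor describes, requires proportionally more flow.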

Magcale says meeting the most stringent requirements up front will be a long-term benefit. “From an environmental perspective, if we can do it here in California, we can do it anywhere in the world,” said Magcale.

Shifting Cargo From Gravel to Data

The new Nautilus data center has seen the world. The barge, named the Eli M, previously hauled sand and gravel from the U.S. mainland to Hawaii and back. To prepare for its new mission, a new deck will be created from four inches of poured concrete. The vessel can haul 3,500 tons of cargo, but Nautilus estimates that a fully-loaded data center would be less than 3,200 tons.

“We have both enterprise and government customers committed,” said Chad Romine, the VP of Business Development for Nautilus.

A key question in the data center business is the ability to scale and deliver new inventory as customers require. The Nautilus concept has been under development throughout a period of rapid growth for data center service providers. If Nautilus succeeds, is a barge-driven design easy to repeat and scale?

Magcale said there is significant availability of barges that can be converted to data center use. “There’s a big surplus because there were a lot of barges that hauled steel from China for the Bay Bridge,” said Magcale. “They don’t want to haul the barges back to China. You have to tug it, with a full crew, all the way across the Pacific. So they sell them here.”

Romine said there are scenarios where barges could offer some speed-to-market advantages over land-based data centers. “On a barge, the permitting is a fraction of what it is on land,” said Romine. “We can mass produce these now. Our ship builder can build three of these at a time.”

But first, there’s the task of getting the first data center built and launched. Then the Nautilus team can look at the deck of their barge as a platform to bigger things.


Tagged With: Floating Data Center, Nautilus Data


About Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
