Facebook Plans Colossal 11-Story Data Center in Singapore

Sept. 6, 2018
Facebook is expanding its data center network to Asia, and doing it in dramatic fashion, with a colossal 11-story, 1.8 million square foot facility that will be one of the largest data center structures ever built.

The new $1 billion project was disclosed Thursday morning by Facebook engineering executive Jay Parikh, and reinforces several trends we’ve been tracking here at Data Center Frontier, including the super-sizing of hyperscale data center requirements and the trend toward multi-story facilities to enable new capacity in constricted real estate markets. It also continues a massive expansion of Facebook’s network as it seeks to keep pace with the growing use of video, artificial intelligence and other data-intensive technologies.

Facebook’s design represents a startling new approach to deploying data center capacity vertically. It’s the tallest facility yet to use a true hyperscale design, which relies on larger facilities to capture economies of scale.

“Singapore is one of the most vibrant and modern technology hubs in Asia,” Parikh wrote. “However, it presents a new set of efficiency challenges due to its high temperatures and humidity. To address these and other unique operational requirements, including building in a dense, urban environment, we came up with a new design and way to build this facility.”

Singapore: Home of High-Rise Data Centers

The key challenge in Singapore is land. The large parcels Facebook typically seeks for its multi-building cloud campuses are largely unavailable in Singapore, an island nation where land is extremely expensive. The government of Singapore has been encouraging data center operators to build taller facilities, which is why other data center companies in Singapore have adopted multi-story designs. Google operates a five-story data center in Singapore, its tallest data center anywhere in the world. Digital Realty also operates several multi-story data centers in Singapore, while SingTel has a seven-story facility.

But none approach the 11-story design disclosed today by Facebook. Images of the building show cooling infrastructure housed on the roof, an approach supported by the new cooling system. The main data center appears to have an attached support building that is about five stories tall, which could house backup generators.

There have been taller buildings that house data centers, including major carrier hotels like 60 Hudson Street in New York and One Wilshire in Los Angeles, which serve the concentration of business customers in those cities’ central business districts. But these carrier hotels were originally office buildings, with limited floor plates and restrictions on mechanical and electrical equipment.

Among purpose-built data centers, Google began building taller data centers in 2016, enabling it to pack more servers into the same real estate footprint, providing more bang for its buck on each of its huge cloud campuses. Colocation provider Equinix recently opened a dedicated eight-story data center facility in Amsterdam, one of the most active data center markets in Europe. There is also a strong trend toward multi-story data centers in two of the leading U.S. Internet hubs, Northern Virginia and Silicon Valley.

Facebook has traditionally built data centers that house servers on a single story, with a second-floor “penthouse” dedicated to cooling systems. The Singapore facility adapts that large-footprint hyperscale design and adds multiple stories, an approach that creates challenges in routing power and fiber cabling, which is typically handled with dedicated vertical risers.

New Cooling Design for Warm Weather

A key component of Facebook’s design is a new cooling system that the company unveiled in June, which allows it to cool its servers and storage equipment in warmer climates. The new system, known as the StatePoint Liquid Cooling (SPLC) system, was developed in partnership with cooling specialist Nortek Inc. and uses an approach to evaporative cooling that is new to the data center industry.

As DCF noted at the time, this type of system could be useful in Asia and other humid climates, where data centers often operate slightly less efficiently due to the use of more power-intensive cooling and dehumidification systems.

In most of its data centers, Facebook uses direct cooling, bringing filtered outside air into the data hall and circulating it through racks to remove the heat generated by servers and storage units. SPLC takes a different approach, using outside air to produce cool water, which can then be used in cooling systems. Facebook currently runs that water through a cooling coil, which chills the air flowing through the racks to cool the servers.

SPLC is an evaporative cooling system that uses a liquid-to-air energy exchanger, in which water is cooled as it evaporates through a membrane separation layer. That cold water is then used to cool the air inside the data center.
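
To make the idea concrete, here is a minimal sketch of the rule of thumb behind evaporative cooling: the water it produces can approach, but not reach, the ambient wet-bulb temperature. The function name and the 4°C “approach” margin are illustrative assumptions, not figures from Facebook’s or Nortek’s published design.

# Illustrative sketch only; not Facebook's or Nortek's published SPLC model.
# Evaporative coolers can chill water toward the ambient wet-bulb temperature;
# the "approach" is the margin by which the supply water misses that limit.

def estimated_supply_water_temp_c(wet_bulb_c: float, approach_c: float = 4.0) -> float:
    """Approximate cold-water supply temperature from an evaporative cooler."""
    return wet_bulb_c + approach_c

# Singapore's wet-bulb temperatures often sit in the mid-to-high 20s Celsius,
# so the resulting water is warmer than conventional chilled water, but it can
# still absorb heat from hot server exhaust air via a cooling coil.
print(estimated_supply_water_temp_c(27.0))  # ~31.0 C with a 4 C approach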

The new system will use far less water than other indirect cooling technologies, addressing growing concerns about water use by hyperscale data centers and its potential impact on local water resources and the environment.

“This technology minimizes water and power consumption and can maintain required temperatures without supplemental cooling,” Parikh wrote. “It can reduce the amount of water used by 20% in hot and humid climates like Singapore when compared to other indirect cooling systems. With an expected PUE of 1.19, we expect this facility to be one of the most efficient data centers in the region.”
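
For context on that last figure, PUE (Power Usage Effectiveness) is total facility energy divided by the energy delivered to IT equipment, so a PUE of 1.19 means roughly 0.19 watts of overhead for every watt of compute. The snippet below is a simple illustration of that arithmetic; the 1 MW IT load is an assumed example, not a disclosed figure for the Singapore site.

# PUE = total facility energy / IT equipment energy.
# Simple arithmetic illustration; the IT load below is an assumed example.

def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power required to support a given IT load at a given PUE."""
    return it_load_kw * pue

it_load_kw = 1000.0                             # assume 1 MW of IT equipment
total_kw = facility_power_kw(it_load_kw, 1.19)  # 1,190 kW of total facility draw
overhead_kw = total_kw - it_load_kw             # ~190 kW for cooling, power losses, etc.
print(total_kw, overhead_kw)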

Now Building Bigger, and Bigger…

The Singapore project also continues a trend in which Facebook is super-sizing the scale of its Internet infrastructure, building bigger data centers and larger cloud campuses. We first reported on this trend in February 2017, and the company’s construction program has been accelerating relentlessly ever since.

What’s driving this growth? Facebook’s growing focus on video shifts the math on file storage and data center requirements, as HD video files are substantially larger than photos. Facebook has already been scaling up its infrastructure to handle massive growth in user photo uploads, including custom cold storage facilities and the use of Blu-ray discs to save energy on long-term storage. Video storage can be an even larger and more expensive challenge. Google, which operates YouTube as well as a cloud platform, spends more than $10 billion a year on data center infrastructure.
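
A rough back-of-the-envelope comparison shows why video changes the storage math. The figures below are generic assumptions (a few megabytes for a typical photo, roughly an 8 Mbps bitrate for 1080p video), not numbers Facebook has disclosed.

# Back-of-the-envelope comparison; all figures are generic assumptions,
# not Facebook-disclosed numbers.

PHOTO_MB = 3.0          # assumed size of a typical uploaded photo
HD_BITRATE_MBPS = 8.0   # assumed bitrate for 1080p video

def video_size_mb(minutes: float, bitrate_mbps: float = HD_BITRATE_MBPS) -> float:
    """Approximate stored size of a video clip, in megabytes."""
    return minutes * 60 * bitrate_mbps / 8  # megabits to megabytes

one_minute_mb = video_size_mb(1)   # ~60 MB
print(one_minute_mb / PHOTO_MB)    # one minute of HD video ~ 20 typical photos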

It also was inevitable that Facebook would require data center infrastructure in Asia, which has been one of the fastest-growing regions for Internet use, and thus a hotbed of data center activity.

“We’re very excited about expanding our data center footprint into Asia, and it becoming part of our highly advanced infrastructure that helps bring Facebook apps and services to you every day,” said Parikh.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
