PRINEVILLE, Ore. – When Facebook unveiled its first company-built data center on the high plains of Central Oregon in 2011, it was a first step into the larger world of hyperscale cloud campuses. A little more than five years later, the company’s infrastructure has spread far and wide, housing millions of servers across the globe.
As it approaches 2 billion users, Facebook is hitting a new phase of growth. The company just announced its eighth data center location in Odense, Denmark, giving it five campuses in the United States and three in Europe. Each campus can support at least three data center buildings, each more than 1,000 feet long.
That growth is clearly visible in Oregon, where Facebook is working to complete construction on the third and largest data center building at its Prineville campus. The first data center in Prineville was 330,000 square feet, while the second building was slightly larger. The newest structure spans 450,000 square feet, about 35 percent larger than the original building.
The third Prineville building is “about 100 feet longer than the U.S.S. Abraham Lincoln,” said Ken Patchett, Facebook’s Director of Data Center Operations (West), as he surveyed work on the $250 million construction project during a tour last summer.
More Buildings, More Data Halls
Once the new structure is complete, Facebook will have effectively run out of real estate. “We’re out of space here in Prineville,” said Patchett. “If we want to expand, we would need to get more land.”
That’s why, in addition to building in new places, Facebook is super-sizing its campuses. The buildings on its Altoona, Iowa campus are all larger than the biggest structure in Prineville. The company’s newest campus in Fort Worth, Texas will have room for five massive data center buildings, instead of three.
There are hints that the company may be thinking even bigger with its future campuses. Facebook has been identified as the company behind a project in Nebraska that has submitted plans to build four 610,000 square foot data centers. That suggests about 2.4 million square feet of space on one campus, more than twice the size of the 1.1 million square foot Prineville campus.
That growth can also be seen in Facebook’s capital spending, most of which goes toward building data centers and filling them with servers. Facebook said this week that it expects its capital expenditures to reach $7.5 billion in 2017, up from $4.5 billion last year.
Video Alters the Storage Math
What’s driving this huge jump in infrastructure spending? Facebook’s continuing growth is part of the story. But so is the evolution of its business, which is increasingly focused on video.
“I see video as a megatrend on the same order as mobile,” Facebook founder and CEO Mark Zuckerberg said on a Feb. 1 earnings call. “That’s why we’re going to keep putting video first across our family of apps and making it easier for people to capture and share video in new ways.”
“We’re seeing consumer video exploding on our platform, and that really creates the opportunity for video ads in the feed,” added Sheryl Sandberg, the Chief Operating Officer of Facebook.
The emphasis on video shifts the math on file storage and data center requirements, as HD video files are substantially larger than photos. Facebook has been scaling up its infrastructure to handle massive growth in user photo uploads, including custom cold storage facilities and the use of Blu-ray discs to save energy on long-term storage. Video storage can be an even larger and more expensive challenge. Google, which operates YouTube as well as a cloud platform, spends more than $10 billion a year on data center infrastructure.
Slightly further out on the horizon is virtual reality (VR). Facebook’s Oculus VR technology plays a key role in Zuckerberg’s vision for the future of Facebook and social media.
This has infrastructure implications. VR and 360-degree video applications require enormous amounts of data, and delivering these experiences across the Internet presents a major challenge. Virtual reality content could be 5 to 20 times the size of today’s full HD video.
All that data will have to move across the network, and in some cases be cached locally to ensure low latency.
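To see why video shifts the storage math, here is a minimal back-of-envelope sketch in Python. The average photo size and HD bitrate below are illustrative assumptions, not Facebook figures; only the 5x-to-20x range for VR content comes from the estimates cited above.

# Back-of-envelope storage math. All inputs are illustrative assumptions
# except the VR multipliers, which reflect the 5x-20x range cited in the article.
MB = 1
GB = 1024 * MB

photo_size = 3 * MB                          # assumed average uploaded photo
hd_bitrate_mbps = 5                          # assumed 1080p streaming bitrate
hd_minute = hd_bitrate_mbps * 60 / 8         # megabytes per minute of HD video (~37.5 MB)
vr_multipliers = (5, 20)                     # VR/360 video vs. full HD, per the article

print(f"One minute of HD video is roughly {hd_minute:.0f} MB, "
      f"about {hd_minute / photo_size:.0f}x an average photo upload")

for m in vr_multipliers:
    vr_minute = hd_minute * m
    print(f"At {m}x HD, a minute of VR/360 video is roughly {vr_minute / GB:.2f} GB "
          f"({vr_minute / photo_size:.0f}x a photo)")

Under these assumptions, a single minute of VR content lands somewhere between a few hundred megabytes and most of a gigabyte, which illustrates why the shift from photos to video and VR multiplies both storage and network demands.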
User Growth Drives Site Selection
That’s why Facebook is building bigger cloud campuses, and building them in new places. Facebook’s expansion program is driven by the shifting geography of its user demand, according to Dan Madrigal, director of real estate acquisitions for Facebook.
“We have to anticipate years in advance where our platforms will grow,” said Madrigal. “That’s a really hard thing to solve for.”
Here’s a look at Facebook’s data center campuses around the world:
Clonee, Ireland: In this western suburb of Dublin, Facebook has begun construction on its first two buildings. The Clonee design is distinctive, with the two data center buildings connected by a central building that will house the admin staff, presumably freeing up additional square footage for data halls in the main buildings. Between them, the structures will span 621,000 square feet. Brookfield Renewable Ireland will supply the Clonee data center with renewable energy, primarily from wind power.
Where will Facebook build next? There are clues emerging that it may be Sarpy County, Nebraska, where local officials have been negotiating with a large Internet company seeking to build a data center project on a 146-acre property. Some sleuthing by DataCenterDynamics found that the LLC pursuing the project is registered at the same address as Facebook’s headquarters in Menlo Park, California.
The site is about 160 miles from Facebook’s Altoona campus, which would reinforce a trend toward huge cloud campuses in the center of the country. Google’s largest data center campus is in Council Bluffs, Iowa, while Microsoft has three data center campuses clustered around West Des Moines, Iowa. Amazon Web Services is building three data center campuses in Ohio.
After that? Facebook has no data centers in South America, Asia or Africa, leaving many new frontiers where infrastructure may soon be needed to support global user growth.