Facebook has become a major investor in high-capacity fiber optic routes to move data traffic between its data centers, including both undersea cables and terrestrial routes. The company will now begin selling unused capacity to other companies, effectively entering the wholesale fiber business.
Facebook has created a subsidiary called Middle Mile Infrastructure that will sell excess capacity on two new fiber routes the company is building to provide direct high-speed connectivity between its data center campuses in Virginia, Ohio and North Carolina.
“We intend to allow third parties — including local and regional providers — to purchase excess capacity on our fiber,” said Kevin Salvadori, Director of Network Investments at Facebook. “This capacity could provide additional network infrastructure to existing and emerging providers, helping them extend service to many parts of the country, and particularly in underserved rural areas near our long-haul fiber builds.”
The company said it will not be providing services directly to consumers, but wants to support carriers and operators. Facebook will reserve a portion of the capacity for its own use and make the excess available to others.
Welcomed in West Virginia
One of the new fiber routes will connect the Facebook data center cluster in Ashburn, Virginia with its new campus in New Albany, Ohio. Along the way, Facebook hopes to sell capacity in West Virginia, where 275 miles of the fiber route will run through the state. The construction, which is planned to begin this year and last for roughly 18 to 24 months, will allow broadband providers to expand middle-mile networks into communities along the route.
“Access to broadband internet drives economic growth and opportunity, but there are still too many unserved communities, including here in West Virginia,” Salvadori said yesterday at a press conference in Charleston, W.Va. “We see the need for long haul fiber as an opportunity to provide critical infrastructure where it did not previously exist. To that end, we’ve designed our project to attract potential local and regional providers to expand broadband internet access for the communities surrounding our builds.”
“Broadband development is absolutely critical to moving West Virginia forward,” said West Virginia Gov. Jim Justice. “An investment of this magnitude in our state is really big news and will help us continue to show the world how great West Virginia truly is.”
Boosting connectivity and fiber access in rural areas is difficult, and comes with challenging economics. Acquiring rights-of-way and digging trenches for fiber is expensive, and predicting the timing of new business and revenue growth is difficult, as providers that have pursued this strategy have learned. One example is Allied Fiber, which sought to create Internet on-ramps and off-ramps everywhere, connecting small data centers to dark fiber to bring the core of the Internet closer to the edge.
Facebook is also taking a different approach to community fiber development than Google, which focused its Google Fiber business on providing broadband access in second-tier cities. Google Fiber offers service in 19 cities, but has scaled back its expansion plans, and had some infrastructure snafus, including an attempt to use shallow trenching in its Louisville project. The company recently abandoned the project after the new trench design didn’t meet expectations.
Hyperscalers Investing in the Network
Facebook’s entry into the wholesale fiber market is part of a larger trend in which Facebook and other hyperscale computing players are investing directly in fiber infrastructure, seeking to better manage the immense flow of traffic between their data center campuses.
As hyperscale companies like Facebook continue to grow, the huge volumes of user data prompt them to add data centers. This leads to larger cloud campuses, with truly massive volumes of data moving between them. The volume of this “East-West” traffic between data centers far surpasses the volume of data traveling between the data centers and end users across the Internet – known as “North-South” traffic.
The trend began about a decade ago when Google began buying dark fiber routes between its facilities, but the recent growth of cloud computing and social media has boosted these companies’ appetite for bandwidth. As a result, the “big four” hyperscale players – Google, Facebook, Microsoft and Amazon – have become the largest investors in new subsea cable routes and are seeking more dark fiber to create redundant connectivity between their data centers.
Facebook began building its own fiber routes in 2017, when it announced plans to build a 200-mile cable to connect a new campus in Los Lunas, New Mexico to another campus in Fort Worth, Texas. The company says this underground cable is now one of the highest-capacity systems in the United States, with state-of-the-art optical fiber that handles data more efficiently than other backbone networks.
Efficiently enough that Facebook now believes it has capacity to spare, and can monetize the excess capacity.
Multiple Infrastructure Initiatives
In a blog post announcing its wholesale fiber subsidiary, Salvadori noted that the initiative is part of Facebook’s long-term effort to invest in hardware, software and infrastructure to support its operations. The company has also sought to share the benefits with the broader industry through open hardware initiatives like the Open Compute Project and Telecom Infra Project.
These hardware initiatives include last year’s introduction of the Fabric Aggregator, custom hardware created because the massive data traffic coursing through Facebook’s cloud campuses has outstripped the capabilities of commercial networking hardware.
Since the beginning of 2017, Facebook has accelerated its data center expansion, announcing five new cloud campuses. The social network now has 12 data center campuses around the globe, including nine in the U.S. and three in international markets. These campuses have been getting larger, as Facebook now builds up to eight data centers and 3 million square feet of data center space per campus.
Facebook has been scaling up its infrastructure to handle massive growth in user photo uploads, including custom cold storage facilities and the use of Blu-ray discs to save energy on long-term storage. Video can be an even larger and more expensive storage challenge, as HD video files are substantially larger than photos. This has infrastructure implications. VR and 360-degree video applications require enormous amounts of data, and delivering these experiences across the Internet presents a major challenge. Virtual reality content could be 5 to 20 times the size of today’s full HD video.
On the network side, Facebook created a new architecture in 2017, featuring a dedicated Express Backbone (EBB) network to manage the huge flows of machine-to-machine (M2M) traffic between its facilities. The company continues to use its Classic Backbone (CBB) to deliver status updates and photos to its users.