FOREST CITY, N.C. – The temperature remains constant as you walk through Facebook’s custom data storage facility. But as you approach the back of the room, you transition from cold storage to even colder storage.
In a row of 14 racks housing square enclosures, Facebook is test-driving the future of its long-term data storage. The racks are packed with thousands upon thousands of Blu-Ray disks.
That’s right: the optical media that plays your movies can now back up all your status updates and Facebook photos.
The Blu-Ray storage system is part of the company’s evolving effort to manage a flood of incoming data, with users uploading more than 900 million photos every day. To keep pace with all those uploads, Facebook must constantly seek new ways to add capacity.
“It’s amazing how much storage you can do with Blu-Ray,” said Keven McCammon, Datacenter Manager for Facebook’s facility in western North Carolina. “There are times when you can look back to look forward.”
The racks of Blu-Ray storage are still in the testing phase. But Facebook has high hopes for Blu-Ray as a tool to optimize its infrastructure. McCammon and his team are putting the system through its paces at the company’s massive East Coast data center campus.
“We’re doing some pretty extensive testing right now,” said McCammon. “We want to really make sure it can function in a production environment and can scale.”
With 1.44 billion users, Facebook’s storage needs may seem otherworldly. But many companies will soon face similar challenges managing the explosive growth of data storage. The hyperscale data center players pioneer the design strategies and best practices that will soon filter into many other data centers. Facebook’s cold storage journey offers insights for managing the coming data tsunami.
Retooling Tiered Storage for the Hyperscale Age
Facebook’s cold storage system is a web-scale implementation of tiered storage, a strategy that organizes stored data into categories based on priority and then assigns each category to a different type of storage media to reduce costs. The goal is to create a top tier consisting of high-performance enterprise hardware and networking, while lower tiers can use commodity hardware or, for rarely-used assets, backup tape libraries.
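The core idea of tiered storage can be sketched in a few lines. This is a minimal illustration, not Facebook's actual policy: the tier names and age thresholds below are hypothetical, chosen only to show how access recency maps an object to the cheapest tier that still fits its access pattern.

```python
from datetime import datetime, timedelta

# Hypothetical tier thresholds based on how recently an object was accessed.
# The names and cutoffs are illustrative, not Facebook's real policy.
TIERS = [
    (timedelta(days=30), "hot"),    # e.g., flash/SSD-backed, serves live traffic
    (timedelta(days=365), "warm"),  # e.g., commodity HDD arrays
]
COLD_TIER = "cold"                  # e.g., spun-down disks or optical archive

def assign_tier(last_accessed: datetime, now: datetime) -> str:
    """Return the cheapest tier whose age threshold covers this object."""
    age = now - last_accessed
    for threshold, tier in TIERS:
        if age <= threshold:
            return tier
    return COLD_TIER

now = datetime(2015, 6, 1)
print(assign_tier(datetime(2015, 5, 20), now))  # hot: accessed 12 days ago
print(assign_tier(datetime(2013, 1, 1), now))   # cold: untouched for years
```

The key design point is that the policy is evaluated continuously, so objects migrate downward as they age out of active use.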
The storage world has changed since tiered storage made its debut in 1990, but Facebook is applying many of the principles in its infrastructure, albeit with different technologies.
Facebook was an early adopter of solid state drives (SSDs), storage devices that use integrated circuits as memory. SSDs have no moving parts, unlike traditional hard disk drives (HDDs), which contain spinning disks and moveable read/write heads. Most importantly, SSDs are faster than hard disks, and can accelerate key parts of Facebook’s infrastructure.
While focusing on SSD and Flash in the high-performance portions of its storage infrastructure, Facebook continues to use plenty of hard disk drives to store photos. In 2012 it created a custom storage tray through the Open Compute Project, known as Knox.
By 2013, Facebook was storing more than an exabyte of images that were rarely accessed, with 82 percent of traffic focused on just 8 percent of photos. So the company created a “cold storage” design to shift these rarely-viewed photos from its expensive high-performance server farms to simpler data centers with no generators or UPS (uninterruptible power supply). Emergency backup power isn’t needed because the facility does not serve live production data.
“Reducing operating power was a goal from the beginning,” wrote Facebook’s Krish Bandaru and Kestutis Patiejunas in a blog post outlining the cold storage system. “So, we built a new facility that used a relatively low amount of power but had lots of floor space. The data centers are equipped with less than one-sixth of the power available to our traditional data centers, and, when fully loaded, can support up to one exabyte (1,000 PB) per data hall.”
Facebook operates cold storage facilities at its Prineville, Oregon and North Carolina data center campuses. Custom software optimizes the energy use of trays filled with commodity HDDs. By reducing the disk activity and power draw, the design has slashed the amount of airflow needed to cool the racks.
Facebook wasn’t done yet. At the 2014 Open Compute Summit, it unveiled a prototype for an “ultra-cold” storage system using Blu-Ray disks as the storage medium and a robotic arm to retrieve data. The robotic arm is similar in concept to systems used to retrieve tape cartridges in tape storage libraries. (See the prototype in action in this video from Facebook).
Blu-Ray is an optical data storage format that uses a blue-violet laser to read and write the disk. It’s not suited for primary storage, as data can’t be retrieved instantly. But using Blu-Ray disks offers savings of up to 50 percent compared with the first generation of Facebook’s cold storage design, since the Blu-Ray cabinet only uses energy when it is writing data during the initial data “burn,” and doesn’t use energy when it is idle.
How Design Tweaks Drive Big Savings
Facebook has built three cold storage data centers on the Forest City campus. Only one is in use, with the other two reserved for future capacity. The single-story buildings are a departure from the company’s primary server farms, whose upper floor functions as a huge cooling plenum, where fresh air is treated before being pushed down to cool servers in the data halls on the first floor.
In the cold storage design, air enters the facility through louvers in the side of the building. The cooling is handled by a series of air handlers along the exterior wall, which consolidate the multi-step cooling and filtering into a single piece of equipment.
The cold storage racks are more densely packed than standard Open Compute Knox storage units. A cold storage rack holds 32 trays, each containing 15 standard 4-terabyte drives, allowing Facebook to store nearly 2 petabytes of data in every rack.
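The rack-capacity figure follows directly from the tray counts quoted above:

```python
# Reproducing the "nearly 2 petabytes per rack" figure from the tray counts.
trays_per_rack = 32
drives_per_tray = 15
drive_tb = 4  # terabytes per drive

rack_tb = trays_per_rack * drives_per_tray * drive_tb
rack_pb = rack_tb / 1000  # using decimal units, 1 PB = 1,000 TB
print(rack_tb, "TB per rack =", rack_pb, "PB")  # 1920 TB per rack = 1.92 PB
```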
“We spin them up, fill them with data and then spin them down. Every now and again we spin them up again to make sure they’re working fine,” said McCammon, who noted the design “doesn’t draw as much power, and you don’t need as much air. In the other data hall, the drives are always going.”
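The cycle McCammon describes, keeping disks powered down and briefly spinning up a sample to verify it still reads cleanly, resembles a MAID (massive array of idle disks) approach. A hypothetical sketch of such a verification pass, with invented class and function names for illustration:

```python
import random

class ColdDisk:
    """Hypothetical model of a cold storage drive that is normally powered down."""
    def __init__(self, disk_id):
        self.disk_id = disk_id
        self.spinning = False

    def spin_up(self):
        self.spinning = True

    def spin_down(self):
        self.spinning = False

    def verify(self):
        # Placeholder for a real read/checksum pass over the disk's contents.
        assert self.spinning, "cannot read a spun-down disk"
        return True

def verification_pass(disks, sample_size):
    """Spin up a random sample of disks, check them, and power everything back down."""
    healthy = 0
    for disk in random.sample(disks, sample_size):
        disk.spin_up()
        if disk.verify():
            healthy += 1
        disk.spin_down()
    return healthy

disks = [ColdDisk(i) for i in range(480)]  # one rack: 32 trays x 15 drives
print(verification_pass(disks, 10))  # all 10 sampled disks report healthy
```

The point of the pattern is that power and airflow scale with the handful of disks spinning at any moment, not with total capacity.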
The Blu-Ray racks are even more efficient. They are different in appearance from the Knox storage trays, with five units per Open Rack, each housing carousels filled with Blu-Ray disks. When data must be accessed, the action happens in the rear of the unit. That’s where the robotic retrieval system has been condensed into a space in the bottom of the rack. When it springs into action, the arm travels along tracks on either side of the rack, fetches a Blu-Ray disk, pulls data off the disk and writes it to a live server.
“In a Blu-Ray scenario, you don’t even need to spin them up,” said McCammon. “It also takes less power to perform a write. It also reduces our network traffic, and so we need fewer switches.”
“It’s incredibly efficient,” he added. “It’s truly lights out.”
Facebook isn’t alone in tapping Blu-Ray as a medium for data storage archives. It’s a concept that dates to optical jukebox systems, and companies like Hie Electronics and DISC Group offer smaller scale data archives using Blu-Ray and robot retrieval.
Through the Open Compute Project, the Facebook Blu-Ray initiative has created a commercial spinoff of its own. Last year Facebook hardware guru Frank Frankovsky left to found a startup focused on adapting the Blu-Ray optical storage system for the enterprise. Last month that startup, Optical Archive, was acquired by Sony.
Is the enterprise ready for movie disks to power its long-term storage? For now, Facebook will continue testing the Blu-Ray cold storage racks. Soon, other hyperscale players and enterprises will have more information about the performance of Blu-Ray storage at scale, and its potential for use beyond Facebook and OCP. In the meantime, the data will keep coming.