Data Center Frontier

Charting the future of data centers and cloud computing.

Inside Facebook’s Blu-Ray Cold Storage Data Center

By Rich Miller - June 30, 2015

A row of storage units housing Blu-Ray disks inside Facebook's North Carolina data center. (Photo: Rich Miller)

FOREST CITY, N.C. – The temperature remains constant as you walk through Facebook’s custom data storage facility. But as you approach the back of the room, you transition from cold storage to even colder storage.

In a row of 14 racks housing square enclosures, Facebook is test-driving the future of its long-term data storage. The racks are packed with thousands upon thousands of Blu-Ray disks.

That’s right: the optical media that plays your movies can now back up all your status updates and Facebook photos.

The Blu-Ray storage system is part of the company’s evolving effort to manage a flood of incoming data, with users uploading more than 900 million photos every day. To keep pace with all those uploads, Facebook must constantly seek new ways to add capacity.

“It’s amazing how much storage you can do with Blu-Ray,” said Keven McCammon, Datacenter Manager for Facebook’s facility in western North Carolina. “There are times when you can look back to look forward.”

These storage units inside Facebook’s data center in North Carolina are filled with Blu-Ray optical disks. (Photo: Rich Miller)

The racks of Blu-Ray storage are still in the testing phase. But Facebook has high hopes for Blu-Ray as a tool to optimize its infrastructure. McCammon and his team are putting the system through its paces at the company’s massive East Coast data center campus.

“We’re doing some pretty extensive testing right now,” said McCammon. “We want to really make sure it can function in a production environment and can scale.”

With 1.44 billion users, Facebook’s storage needs may seem otherworldly to most companies. But many of them will soon face similar challenges managing the explosive growth of data storage. The hyperscale data center players pioneer the design strategies and best practices that eventually filter into many other data centers, and Facebook’s cold storage journey offers insights for managing the coming data tsunami.

Retooling Tiered Storage for the Hyperscale Age

Facebook’s cold storage system is a web-scale implementation of tiered storage, a strategy that organizes stored data into categories based on priority and then assigns each category to a different type of storage media to reduce costs. The goal is a top tier consisting of high-performance enterprise hardware and networking, while lower tiers use commodity hardware or, for rarely accessed assets, backup tape libraries.
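As a rough sketch (illustrative only, not Facebook’s actual implementation; the thresholds and tier names are hypothetical), a tiering policy reduces to a rule that maps an object’s access pattern to a storage tier:

```python
# Illustrative tiered-storage placement rule. Thresholds and tier
# names are hypothetical, not Facebook's actual values.

def choose_tier(reads_last_30_days: int) -> str:
    """Map an object's recent read count to a storage tier."""
    if reads_last_30_days >= 100:
        return "hot"    # high-performance SSD/flash
    if reads_last_30_days >= 1:
        return "warm"   # commodity hard disk drives
    return "cold"       # spun-down disks or optical archive

print(choose_tier(500))  # hot
print(choose_tier(3))    # warm
print(choose_tier(0))    # cold
```

Real systems add hysteresis and migration costs to a rule like this, but the core idea is the same: colder data earns cheaper, slower media.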

The storage world has changed since tiered storage made its debut in 1990, but Facebook is applying many of the principles in its infrastructure, albeit with different technologies.

Facebook was an early adopter of solid state drives (SSDs), storage devices that use integrated circuits as memory. SSDs have no moving parts, unlike traditional hard disk drives (HDDs), which contain spinning disks and moveable read/write heads. Most importantly, SSDs are faster than hard disks, and can accelerate key parts of Facebook’s infrastructure.

While focusing on SSD and Flash in the high-performance portions of its storage infrastructure, Facebook continues to use plenty of hard disk drives to store photos. In 2012 it created a custom storage tray through the Open Compute Project, known as Knox.

One of the Facebook cold storage data centers in Forest City, N.C. The louvers lining the walls of the building bring fresh air into the facility to cool the storage racks. (Photo: Rich Miller)

By 2013, Facebook was storing more than an exabyte of images that were rarely accessed, with 82 percent of traffic focused on just 8 percent of photos. So the company created a “cold storage” design to shift these rarely-viewed photos from its expensive high-performance server farms to simpler data centers with no generators or UPS (uninterruptible power supply). Emergency backup power isn’t needed because the facility does not serve live production data.
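The skew in those numbers is what makes a separate cold tier pay off: most of the corpus serves very little traffic. A back-of-envelope sketch using the article’s figures (treating “more than an exabyte” as exactly 1 EB for illustration):

```python
# 82% of read traffic hits just 8% of photos, so the remaining 92%
# of the corpus serves only 18% of reads and is a candidate for
# cheaper cold storage. The 1 EB total is illustrative.

total_bytes = 1e18          # "more than an exabyte" of images
cold_fraction = 0.92        # photos outside the hot 8%
cold_bytes = total_bytes * cold_fraction
print(f"Cold-eligible data: {cold_bytes / 1e15:.0f} PB")  # 920 PB
```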

“Reducing operating power was a goal from the beginning,” wrote Facebook’s Krish Bandaru and Kestutis Patiejunas in a blog post outlining the cold storage system. “So, we built a new facility that used a relatively low amount of power but had lots of floor space. The data centers are equipped with less than one-sixth of the power available to our traditional data centers, and, when fully loaded, can support up to one exabyte (1,000 PB) per data hall.”

Facebook operates cold storage facilities at its Prineville, Oregon and North Carolina data center campuses. Custom software optimizes the energy use of trays filled with commodity HDDs. By reducing the disk activity and power draw, the design has slashed the amount of airflow needed to cool the racks.

Facebook wasn’t done yet. At the 2014 Open Compute Summit, it unveiled a prototype for an “ultra-cold” storage system using Blu-Ray disks as the storage medium and a robotic arm to retrieve data. The robotic arm is similar in concept to systems used to retrieve tape cartridges in tape storage libraries. (See the prototype in action in this video from Facebook).

Blu-Ray is an optical data storage format that uses blue lasers to read the disk. It’s not for primary storage, as data can’t be retrieved instantly. But using Blu-Ray disks offers savings of up to 50 percent compared with the first generation of Facebook’s cold storage design, since the Blu-Ray cabinet only uses energy when it is writing data during the initial data “burn,” and doesn’t use energy when it is idle.
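The savings come down to idle power: a spun-down disk tier still draws some standby power, while an idle Blu-Ray cabinet draws essentially none. A toy annual-energy comparison (all wattages and duty cycles here are hypothetical, chosen only to illustrate the effect):

```python
# Toy comparison between a mostly-idle HDD cold tier and a Blu-Ray
# cabinet that draws power only while writing. Wattages and active
# hours are hypothetical, not measured Facebook figures.

HOURS_PER_YEAR = 8760

def annual_kwh(idle_watts: float, active_watts: float,
               active_hours: float) -> float:
    """Energy for one year, split between idle and active operation."""
    idle_hours = HOURS_PER_YEAR - active_hours
    return (idle_watts * idle_hours + active_watts * active_hours) / 1000

hdd_tier = annual_kwh(idle_watts=50, active_watts=500, active_hours=100)
optical  = annual_kwh(idle_watts=0,  active_watts=500, active_hours=100)
print(f"HDD tier: {hdd_tier:.0f} kWh/yr, Blu-Ray: {optical:.0f} kWh/yr")
```

With zero idle draw, the archive’s energy bill scales with how often data is written, not with how long it is retained.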

How Design Tweaks Drive Big Savings

Facebook has built three cold storage data centers on the Forest City campus. Only one is in use, with the other two reserved for future capacity. The single-story buildings are a departure from the company’s primary server farms, which have an upper floor that functions as a huge cooling plenum, treating fresh air to be used to cool servers in the data halls on the first floor.

In the cold storage design, air enters the facility through louvers in the side of the building. The cooling is handled by a series of air handlers along the exterior wall, which consolidate the multi-step cooling and filtering into a single piece of equipment.

When fresh air enters the building, these air handlers filter and cool it for use in cooling the many racks of storage gear inside the cold storage facility. (Photo: Rich Miller)

The cold storage racks are more densely packed than standard Open Compute Knox storage units. A cold storage rack holds 32 trays, each containing 15 standard 4-terabyte drives, allowing Facebook to store nearly 2 petabytes of data in every rack.
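That “nearly 2 petabytes” figure follows directly from the tray arithmetic in the article:

```python
# Per-rack capacity from the figures quoted in the article.
trays_per_rack = 32
drives_per_tray = 15
tb_per_drive = 4            # standard 4 TB drives

tb_per_rack = trays_per_rack * drives_per_tray * tb_per_drive
print(f"{tb_per_rack} TB ≈ {tb_per_rack / 1000:.2f} PB per rack")  # 1920 TB ≈ 1.92 PB
```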

“We spin them up, fill them with data and then spin them down. Every now and again we spin them up again to make sure they’re working fine,” said McCammon, who noted the design “doesn’t draw as much power, and you don’t need as much air. In the other data hall, the drives are always going.”
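The lifecycle McCammon describes can be sketched as a simple state machine: spin up, fill, spin down, and periodically spin up again to verify the data is still readable. This is an illustrative sketch only; the class name and the verification interval are hypothetical.

```python
# Illustrative sketch of the cold-storage drive lifecycle described
# in the article. The 30-day scrub cadence is hypothetical.

from datetime import datetime, timedelta

VERIFY_INTERVAL = timedelta(days=30)  # hypothetical scrub cadence

class ColdDrive:
    def __init__(self):
        self.spinning = False
        self.full = False
        self.last_verified = datetime.min

    def fill(self, now: datetime):
        self.spinning = True      # spin up to write
        self.full = True
        self.last_verified = now
        self.spinning = False     # spin down once written

    def needs_verification(self, now: datetime) -> bool:
        """True when it is time to spin up and re-check the data."""
        return self.full and now - self.last_verified >= VERIFY_INTERVAL

drive = ColdDrive()
drive.fill(datetime(2015, 6, 1))
print(drive.needs_verification(datetime(2015, 6, 15)))  # False
print(drive.needs_verification(datetime(2015, 7, 15)))  # True
```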

The Blu-Ray racks are even more efficient. They are different in appearance from the Knox storage trays, with five units per Open Rack, each housing carousels filled with Blu-Ray disks. When data must be accessed, the action happens in the rear of the unit. That’s where the robotic retrieval system has been condensed into a space in the bottom of the rack. When it springs into action, the arm travels along tracks on either side of the rack, fetches a Blu-Ray disk, pulls data off the disc and writes it to a live server.

“In a Blu-Ray scenario, you don’t even need to spin them up,” said McCammon. “It also takes less power to perform a write. It also reduces our network traffic, and so we need fewer switches.”

“It’s incredibly efficient,” he added. “It’s truly lights out.”

Facebook isn’t alone in tapping Blu-Ray as a medium for data storage archives. It’s a concept that dates back to optical jukebox systems, and companies like Hie Electronics and DISC Group offer smaller-scale data archives using Blu-Ray and robotic retrieval.

Through the Open Compute Project, the Facebook Blu-Ray initiative has created a commercial spinoff of its own. Last year Facebook hardware guru Frank Frankovsky left to found a startup focused on adapting the Blu-Ray optical storage system for the enterprise. Last month that startup, Optical Archive, was acquired by Sony.

Is the enterprise ready for movie disks powering its long-term storage? For now, Facebook will continue testing the Blu-Ray cold storage racks. Soon other hyperscale players and enterprises will have more information about the performance of Blu-Ray storage at scale, and its potential for use beyond Facebook and OCP. In the meantime, the data will keep coming.

Tagged With: Blu-Ray, Facebook, North Carolina, Storage

About Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.

Comments

  1. andrew says

    July 1, 2015 at 11:51 am

    I figured 800 million * 2mb, divided by 1000 twice equals 1800TB per day, or 1.8 PB a day. That’s 657PB a year.

    I can see why they have 1,000 PB per hall. If a TB were to cost $40, the cost per hall is at least $40,000,000 in storage.

    • Adam says

      July 1, 2015 at 4:17 pm

      However, Facebook reduces all its images down to a max of about 1024px wide or high, whichever is greater. I just checked a photo I posted on FB over the weekend. My original is 3120×4160 and 2.5MB. The FB one is 720×960 and 112KB. So much smaller.

      • Rich Miller says

        July 1, 2015 at 4:27 pm

        Re Andrew’s math: Facebook also creates multiple copies of each image, including sizes for thumbnails, images in the feed and individual photo pages. So even at a smaller size, the math may be different but it still requires plenty of storage.

        • n0zz says

          July 1, 2015 at 6:31 pm

          Isn’t it the case that you can download the full-resolution, full-size photo? So they have to store those as well, which makes the total even bigger than the original size once you add the smaller versions, thumbnails and the rest.

          • Gurjinder Singh says

            July 7, 2015 at 8:49 am

            even multiple different versions for mobile too

  2. Don says

    July 1, 2015 at 12:11 pm

    Have no idea why this is even news, or groundbreaking. This type of storage has been around since the 1980s; Optical Jukeboxes are nothing NEW. The only real difference is they are using Blu-Ray. WORM and COLD is nothing new. I guess FB is becoming Apple, taking a technology that has been around for years, claiming it as their own and then saying they are on the CUTTING EDGE of everything NEW. GOT iPod? Oh ya, AT&T got it, Creative got it, RIO got it…

    • Rich Miller says

      July 1, 2015 at 2:05 pm

      Hi Don. Thanks for your comment. The technologies themselves are not new, as we noted in the article. At DCF we’re all about the data centers, so the most interesting piece for me is the use of cold storage and Blu-Ray to enable a new design that slashes energy costs. To a large degree, this approach is enabled by the scale at which Facebook operates. For many users with lesser storage requirements, the returns on a custom storage facility would be less attractive. But this approach may be interesting to other hyperscale players, and those about to be hyperscale. And since the design is available through the Open Compute Project, others can leverage Facebook’s work (just as Facebook surely learned from earlier implementations of optical jukeboxes).

      • Steve Dutch says

        July 1, 2015 at 9:49 pm

        For that matter, there’s nothing new about the Burj Dubai, either. Frank Lloyd Wright designed a mile-high skyscraper 60 years ago. What is new is that people actually need things on that scale now. Burj Dubai is more showing off than real need, but we really do need exabyte storage. The real question is, when will we hit the practical limits of storage capacity?

  3. Peter says

    July 1, 2015 at 1:57 pm

    Interesting article. Would have been nice to hear how the ultra-cold storage translates to user experience. If I click a year five years back on my timeline, then click the first photo thumbnail that appears, am I going to be waiting while a robotic arm loads a disc with the image? What sort of delay would that incur?

    • Chuck Goolsbee says

      July 6, 2015 at 4:50 pm

      To answer your question: None. By design the photo will always remain on a spinning disk in the storage tier of the live datacenter. The copies in Cold Storage are only accessed to recover data lost from that tier through hardware failure or human error. Cold Storage is a near-line backup system.

  4. J says

    July 1, 2015 at 4:07 pm

    I’d like to see the statistics on data loss per user. It is not uncommon for someone’s older photos to be nonexistent, and other images/events removed from the timeline.

  5. Per says

    July 1, 2015 at 7:40 pm

    Being the architect of this and other solutions, I would like to address some of the concerns.
    1) Access time: this system is a tertiary system, meaning that a primary copy is in ‘hot’ storage with replicas as needed globally. The ‘main’ copy is kept erasure-coded across multiple racks to survive failures, and just in case, a cold copy is created. Contact me if you want to know more.
    2) No one is claiming the technology is new, only that the cost points for long-term durability have made an old technology viable. Tape has a much higher loss rate, and a higher cost per EB stored over time, especially when you start to look at a time scale of 50+ years.
    Finally, I would encourage the rest of you to learn from what happened to me. I entered FB as a ‘grey hair’ and was relegated to a separate building, yet every day I was being taught that these new-generation ‘kids’ COULD solve problems we had already ‘proved’ unsolvable.

  6. a d00d says

    July 1, 2015 at 9:17 pm

    Don, the base tech isn’t new (outside of BD itself, of course), but rather the tiered storage is new. Optical Jukeboxes are “old” tech, but whole optical juke-racks are definitely new, along with controlling all that storage.

    Optical has also gotten a bad rap for reliability, along with tape. Of course, it turns out when you crunch the numbers that there is no such thing as reliable storage over the long term. The closest thing is the M-Disc, but you’ll need to ask your grand-kids or one of your local time-traveling doctors to find out if they last even 50 years, much less a millennium as they claim.

    I agree with you about Apple, but I have to defend FB in this case in that this is a new and improved (really!) use for old technology.

    Oh, and one last thing: don’t forget that they’re working on the next optical standard that’s supposed to be at least 10x higher storage density. This research was faltering thanks to flash keys, but if this kind of giant juke-center storage works for the big boys, it may just resurrect the medium–at least, as I said, for the big boys.

  7. Stephen says

    June 7, 2017 at 5:00 pm

    I recall IBM and others showing off 3D storage, i.e. a glass-like block that could hold multiple TB via blue lasers writing to or reading from a 3D location within the block. We’ve also seen some mention of organic storage at a DNA-like level. I presume some of this may already be in place for military demands; otherwise, what happened to this research? While BR-D might seem like a great old idea that avoids tape degradation issues, we all know optical disks were not the scratch-resistant medium we were all promised, and Blu-Ray compresses the data into even smaller channels on the disk, so the room for error recovery is even smaller. At some stage you will either have to pay for this long-term storage or pay to hold those memories at home.

  8. Bob Newman says

    February 3, 2019 at 6:18 pm

    Anyone heard of HSM? Hierarchical Storage Management… Hasn’t this been the basis of corporate backup for decades? I remember designing a solution for NASCAR Images in 2006 that used three tiers of varying performance and relative cost to meet retrieval requirements and still hit cost goals. It utilized SANs using the highest-performance spinning drives as tier one, lower-performance drives as tier two, and Spectra Logic LTO libraries as tier three.

    • a d00d says

      October 14, 2019 at 9:19 am

      This is the FAR back end of HSM. This is roughly equivalent to some old tapes or photos you dig up while clearing out the 30+ year old storage unit left by someone who kicked the bucket. It’s intended to be the absolute last resort to hold data when all else fails, or for archival data that you want to preserve but that rarely (if ever) gets accessed.



Copyright Data Center Frontier LLC © 2019