Inside LinkedIn’s Cutting-Edge Portland Data Center

By Rich Miller - December 1, 2016

The LinkedIn data center in Portland, Oregon, is an example of a design tailored to the needs of the company's application, combining extreme density and extreme efficiency. (Photo: LinkedIn)


The new LinkedIn data center in Hillsboro, Oregon breaks new ground in Internet-scale data centers, combining high density with exceptional efficiency and sustainability. The facility, housed in a new phase of the Infomart campus outside Portland, is the culmination of an 18-month journey in which LinkedIn has doubled its infrastructure and overhauled its design.

The new design includes one of the largest deployments yet of rear-door chilling units that support extreme power density. The facility also makes use of free cooling via water-side economization, and runs on 100 percent renewable power. It is connected by a new network fabric, known as Project Altair, designed to optimize latency.

To learn more about this unique facility, Data Center Frontier had a question-and-answer session with the team behind the data center:

  • Mike Yamaguchi, LinkedIn’s Director of Data Center Engineering
  • Shawn Zandi, LinkedIn Principal Network Architect
  • John Sheputis, President of Infomart Data Centers
  • Julian Kudritzki, Chief Operating Officer of the Uptime Institute, which awarded the LinkedIn project its Efficient IT Stamp of Approval.

Data Center Frontier: One of the most interesting elements of the LinkedIn Portland project is the cooling design. What led you to use this approach (rear-door chilling unit from MotivAir) and what were the pros and cons of this design?

Mike Yamaguchi (LinkedIn): There were two main factors that led us to deploy a rear door heat exchanger technology in our Oregon facility. First, we were challenged with supporting cabinet designs up to 24kW. Second, we wanted to achieve the highest level of efficiency. By transferring the heat generated by the high density compute nodes to water closer to the source, we are able to leverage the naturally cool ambient Oregon air nearly 220 days a year to reject that heat, as opposed to using energy to create cool water.
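
As background on transferring heat to water at the rack, the water flow a rear door heat exchanger needs scales with the heat load and the allowed water temperature rise. The sketch below applies the basic sensible-heat relation to the 24 kW cabinet figure cited above, with an assumed 6 °C water-side rise; the flow rate and the function itself are illustrative, not LinkedIn's design values.

```python
# Rough sizing sketch for a rear door heat exchanger. Only the 24 kW cabinet
# load comes from the article; the water temperature rise is an assumed value.
CP_WATER = 4.186    # kJ/(kg*K), specific heat of water
RHO_WATER = 0.998   # kg/L, density of water near room temperature

def water_flow_lpm(heat_kw: float, delta_t_c: float) -> float:
    """Liters per minute of water needed to absorb heat_kw with a delta_t_c temperature rise."""
    kg_per_s = heat_kw / (CP_WATER * delta_t_c)   # from Q = m_dot * cp * dT
    return kg_per_s / RHO_WATER * 60.0

# A 24 kW cabinet with an assumed 6 C rise needs on the order of 57 L/min of water.
print(f"{water_flow_lpm(24.0, 6.0):.1f} L/min per cabinet")
```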

We collaborated with Infomart, DPR, and MotivAir to build customized automated logic to maximize the operating efficiencies of the doors by monitoring several input factors such as air pressure, water pressure, and water temperature. One major benefit was our ability to hit a PUE of 1.06, but operationally speaking, this design specifically allows us the flexibility to have a wide variety of cabinet densities, while maximizing our space, power, and cooling. The trade-off of course was adding an additional design element to every rack.
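
For readers unfamiliar with the metric, PUE (Power Usage Effectiveness) is total facility power divided by IT load, so the 1.06 figure means roughly 6 percent overhead beyond the IT equipment itself. A minimal illustration with assumed load numbers:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT equipment power."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Hypothetical figures chosen to illustrate a 1.06 PUE; not LinkedIn's measured loads.
it_load_kw = 1000.0   # power drawn by servers, storage, and network gear
overhead_kw = 60.0    # power for cooling, power conversion, lighting, etc.
print(f"PUE = {pue(it_load_kw + overhead_kw, it_load_kw):.2f}")  # PUE = 1.06
```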

Here’s a closer look at the MotivAir rear-door chilling units installed in the “hot aisle” in the LinkedIn data center in Portland, Oregon. (Photo: LinkedIn)

Data Center Frontier: What prompted your choice of Portland and Infomart for this project?

Shawn Zandi (LinkedIn): Several key considerations led us to select Hillsboro and Infomart as the next location for our flagship data center. For the locale, we are able to leverage the Trans Pacific cable system to connect to our global backbone and use free cooling for the majority of the year. As for Infomart, their building offers direct access to renewable energy, expansion opportunities, and a collaborative team that was as excited to work with us on our design as we were.

We also worked toward a solution that maximizes rack power density against the design capacity. Because this was a dense-compute project, we were able to fit more power per square foot, which translates into a gain in efficiency.

Data Center Frontier: The Project Altair facility supports density that is significantly higher than that of many other hyperscale data centers. What led you to this approach, and what are its benefits?

Mike Yamaguchi (LinkedIn): In addition to dense compute, the move to pizza-box (1 rack unit) building blocks instead of big chassis also enabled us to save power and space while maintaining baseline temperature.

One thing we were able to do was standardize on components that were not only more economical, being commoditized, but also offered more overall throughput. Once we began to realize those efficiencies, we wanted to push the envelope as much as was prudent. This design is one we want to eventually replicate in our other data centers.

A row of servers inside the LinkedIn Portland data center. The Project Altair design supports densities of up to 24 kW per rack. (Photo: LinkedIn)

Shawn Zandi (LinkedIn): The other consideration is that for our private cloud architecture, LinkedIn Platform-as-a-Service (LPS), we wanted to create a low-latency data center fabric that allows service instances in any part of the data center to communicate with each other seamlessly. This is in contrast to the higher-latency architectures used by most public cloud providers, which are built on multi-purpose, multi-tenant foundations.

Data Center Frontier: LinkedIn Portland is perhaps the most efficient space deployed by a multi-tenant provider on behalf of a client. What has been different about this project that created this level of efficiency?

John Sheputis (Infomart Data Centers): While the project PUE is excellent, the more innovative operational efficiencies are gained in how cooling and power resources are controlled and delivered. We can discuss this in terms of the external plant, cooling in the data hall, and conditioned power delivery.

The plant design started with fairly efficient cooling towers and chillers, and took advantage of Portland’s climate to leverage water-side economization. While there are occasional hot days in Portland, summers tend to be drier and milder than in most of the US, meaning the plant operates in ‘free cooling’ conditions the vast majority of the year.

An aerial view of the Infomart Data Centers campus in Hillsboro, Oregon, which houses the new LinkedIn facility. (Photo: LinkedIn)

The bigger breakthrough was inside the data hall. Every data center designer knows that containment (hot or cold aisle) improves efficiency. But rather than contain a row, as we would in most cases, the rear door heat exchangers let us drive containment to the rack level. This approach gave us an unprecedented level of control, letting us deliver cooling exactly where and when it is needed. That improved control allowed us to raise the delivered water temperature: with a cold air objective of 75 degrees, the chilled water supply can be as warm as 67 degrees. Producing warmer chilled water saves energy year-round and expands the number of ‘free cooling’ days. There are additional benefits to warmer water, namely eliminating condensation risk. For any acceptable range of relative humidity, the pipes stay above the dew point, and less insulation on the pipes lowers construction and maintenance costs as well. We plan to use warmer water in future projects, with or without rear door heat exchangers.
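
Sheputis's condensation point can be checked with a standard dew-point calculation: at typical data hall conditions, the dew point sits well below a 67-degree chilled water supply. The sketch below uses the Magnus approximation with assumed room conditions (75 °F supply air at 60 percent relative humidity); the humidity value is an assumption for illustration, not a stated design figure.

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Dew point via the Magnus approximation (good to roughly 0.4 C in this range)."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

# Assumed data hall conditions: 75 F (23.9 C) supply air at 60% relative humidity.
dp_f = dew_point_c(23.9, 60.0) * 9 / 5 + 32
print(f"Dew point: {dp_f:.1f} F")  # roughly 60 F, comfortably below a 67 F water supply
```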

Last, our production and delivery of conditioned power is state of the art. We start with highly efficient UPS systems and operate them in eco-mode, which saves power through less dual conversion. The conditioned power is then delivered to the floor at 415 volts, which lowers line loss. Power is delivered to a 400A Starline bus above the racks, which allows more flexibility in rack densities and results in less stranded capacity. We plan to use higher delivery voltage in future projects as well.
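
The line-loss benefit of 415-volt delivery follows from basic resistive losses: for a fixed load, current falls as voltage rises, and loss scales with the square of the current. The sketch below compares 415 V against a conventional 208 V feed using assumed load and conductor values; none of these figures are Infomart's actual numbers.

```python
import math

def line_loss_kw(load_kw: float, line_to_line_v: float, r_per_phase_ohm: float) -> float:
    """Approximate three-phase resistive loss for a given load, voltage, and per-phase resistance."""
    # Assumes a balanced load at unity power factor.
    current_a = load_kw * 1000.0 / (math.sqrt(3) * line_to_line_v)
    return 3 * current_a**2 * r_per_phase_ohm / 1000.0

busway_load_kw = 240.0   # assumed load on one busway
r_per_phase = 0.01       # assumed conductor resistance in ohms
for volts in (208, 415):
    print(f"{volts} V: {line_loss_kw(busway_load_kw, volts, r_per_phase):.2f} kW lost")
# 415 V delivery loses roughly (208/415)^2, or about one quarter, of what 208 V delivery loses.
```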

Data Center Frontier: The Uptime Institute has seen a lot of data center deployments. What aspects of LinkedIn Portland stand out and differentiate it? Are these innovations specific to this project, or do they have benefits for the broader industry?

Julian Kudritzki (Uptime Institute): LinkedIn Infomart-Portland is the poster child for Efficient IT: it established core directives at the top that resonated through the organization and aligned a variety of expertise and staff levels. The goals of IT optimization and reduced waste were reflected in the organizational structure itself, with a sustainability stakeholder embedded within the Data Center Engineering and IT Infrastructure group. Given the rate of growth and the immediacy of their business requirements, LinkedIn’s governance-level discipline is all the more impressive. While not every organization is at the scale of LinkedIn, the tenets of organizational commitment and a comprehensive approach can be mirrored in enterprise IT organizations of any size.

For more information:

  • LinkedIn Engineering Blog: Introducing LinkedIn’s West Coast Data Center by Michael Yamaguchi.
  • Infomart Blog: Enabling the Most Efficient IT Operation in the World by John Sheputis

LinkedIn will discuss its Portland project this Tuesday at the Gartner Data Center, Infrastructure & Operations Management Conference in Las Vegas. The keynote presentation will feature Sonu Nayyar, Vice President of Global Operations at LinkedIn.


Tagged With: Infomart Data Centers, LinkedIn, Portland

About Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.

Comments

  1. Kris Ritton says

    December 8, 2016 at 1:48 pm

    It looks very nice… But sadly one component that gets missed over and over in these new PNW data centers is protection from earthquakes. I see all the cabinets bolted down to the floor, but we all know that rigid securing is ONLY a life safety level of protection. When that building shakes in a large seismic event (and it will), all the equipment in it will shake as well, with the energy getting amplified through rigid braced components.
    In Japan, all companies are either using building level base isolation, or room/component level isolation, such as WorkSafe Technologies ISO-Base Platform. Many old companies in the PNW are using these systems, but few of the new large co-lo/ managed service facilities popping up here are implementing this protection.

