Oracle’s Data Center Journey

Feb. 9, 2016
The evolution of Oracle’s Utah Compute Facility (UCF) illustrates ongoing changes in technology and best practices, as well as Oracle’s emphasis on continuous refinement of data center design.

A trip through the Oracle Utah Compute Facility provides an unusual window into the data center industry’s progress on design. The 200,000 square foot data center in South Jordan, Utah, delivers the cloud infrastructure to support Oracle’s on-demand computing business.

The first phase of the facility features a raised-floor data hall, battery-based UPS systems, a direct evaporative cooling system, and chillers for backup cooling.

The second phase is a different story. Rather than a single data hall, it’s divided into six modules that will be built in phases. There’s a slab floor and a ductless cooling delivery system, which is supported by a new indirect evaporative cooling system and tight airflow containment for the racks. The power infrastructure also has a different look, as flywheel UPS systems have replaced the batteries.

The evolution of the Utah Compute Facility (UCF) illustrates ongoing changes in technology and best practices. It also reflects Oracle’s emphasis on continuous refinement of data center design to strike a balance between sustainability, cost and operational efficiency.

“One of the ways we work at Oracle is to use each project as a way to improve this growth journey,” said Michael Thrift, the Director of Data Center Facilities at Oracle. “Innovation happens when we look at problems holistically, and eliminate organizational boundaries to find creative solutions.”

An aerial view of Oracle’s Utah Compute Facility in South Jordan, Utah. (Image: Oracle Corp.)

The Utah Compute Facility provides a case study of this process, which features three steps – evaluate, pivot and adapt. Thrift and Oracle’s team of design and construction vendors shared the details of that process recently at the 7×24 Exchange Fall Conference in San Antonio.

The Backstory: Austin and the Dawn of Containment

Oracle has been a key contributor to advances in data center design, dating to 2004, when the company created one of the first airflow containment systems while expanding its colocation facility in Austin, Texas.

The racktop chimney containment system in Oracle’s Austin data center, circa 2004. (Photo: Oracle)

The Oracle team improvised a rack-based containment system that vented hot server exhaust air from the top of the rack into a ceiling plenum and back to the CRAC (computer room air conditioner). Oracle then used variable speed fans to adjust the airflow speed, reducing the power required to cool the system.

“At the time it was all hot aisle/cold aisle,” Oracle’s Mukesh Khattar told me in 2010. “This project debunked a lot of common myths associated with variable airflow in data centers and clearly demonstrated its cost effectiveness.”
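
That efficiency gain follows from the fan affinity laws: airflow scales roughly linearly with fan speed, while fan power scales roughly with the cube of speed, so even a modest slowdown cuts energy sharply. The sketch below is a simplified illustration of the effect, not Oracle’s control logic; the 15 kW fan rating is a hypothetical figure rather than a value from the Austin facility.

```python
# A minimal sketch of the fan affinity (cube) law behind variable-speed fan
# savings. The 15 kW rated fan power is hypothetical, used only to show scale.

def fan_power_kw(rated_power_kw: float, speed_fraction: float) -> float:
    """Estimate fan power at a reduced speed using the cube-law approximation."""
    return rated_power_kw * speed_fraction ** 3

if __name__ == "__main__":
    rated_kw = 15.0  # hypothetical CRAC fan rating, kW
    for speed in (1.0, 0.8, 0.6):
        print(f"{speed:.0%} speed -> {fan_power_kw(rated_kw, speed):.1f} kW")
    # Slowing fans to 60% of full speed drops power to roughly 22% of rated.
```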

As on-demand cloud technologies emerged as a business driver, Oracle began planning its next phase of growth with a multi-state search for a site for a $200 million data center project. Utah won out over Idaho based on the cost of energy, the local workforce and a pro-business climate. The new facility in South Jordan, Utah, was dubbed “Project Sequoia” to reflect Oracle’s ambition for a sustainable, energy-efficient “evergreen” data center.

2011: Phase I

Oracle announced Project Sequoia in 2008, but the project was soon put on hold amid the financial crisis and Oracle’s acquisition of Sun Microsystems, which also had a sizable data center portfolio. Construction on the 25,000 square foot first phase resumed in 2010, with several changes from the Austin design. Chief among these was a shift to direct evaporative cooling, using fresh air instead of chilled water to cool the servers. At the time, direct air “free cooling” was in its early phases of adoption.

Oracle’s design team believed it could run with fresh air up to temperatures of 85 degrees due to Utah’s low humidity. Four 1,000 ton chillers were installed to provide cooling on hot days, consistent with Oracle’s focus on efficiency in energy and cost while taking no chances on uptime.
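
Utah’s dry climate is what makes that 85-degree threshold workable: a direct evaporative cooler can only pull supply air down toward the wet-bulb temperature, so low humidity leaves plenty of cooling headroom. The sketch below uses a standard pad-effectiveness approximation with illustrative temperatures, not Oracle’s actual design conditions.

```python
# A simplified model of direct evaporative cooling: supply air approaches the
# wet-bulb temperature, limited by pad effectiveness. Values are illustrative.

def evap_supply_temp_f(dry_bulb_f: float, wet_bulb_f: float,
                       effectiveness: float = 0.85) -> float:
    """Approximate supply air temperature from a direct evaporative cooler."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

if __name__ == "__main__":
    # A hot, dry afternoon (illustrative): wide evaporative cooling range.
    print(f"Dry climate:   {evap_supply_temp_f(95.0, 62.0):.1f} F supply air")
    # The same dry-bulb temperature in a humid climate yields far less cooling.
    print(f"Humid climate: {evap_supply_temp_f(95.0, 85.0):.1f} F supply air")
```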

“We’re not ever going to be on the bleeding edge of technology,” said Thrift. “But we’re going to continue to drive smart, sustainable innovation into the industry. We’re not making this commitment to sustainability because it’s a fad. This sustainability has to pencil out for the bottom line – sustainable sustainability.”

Tom Dobson, a vice president at Holder Construction, outlines some of the innovations at Oracle’s Utah Compute Facility during a November 2015 presentation at the 7×24 Exchange Conference in San Antonio. At right is Michael Thrift, Oracle’s Director of Data Center Facilities. (Photo: Rich Miller)

The new cooling system was indeed efficient, helping Oracle reduce its PUE (Power Usage Effectiveness) to 1.3, an improvement on the 1.46 PUE in Austin. But those gains came with overhead.
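
PUE is total facility power divided by the power delivered to IT equipment, so a drop from 1.46 to 1.3 directly reduces the overhead carried by every kilowatt of compute. The sketch below shows the arithmetic; the 5 MW IT load is a hypothetical figure used only for illustration.

```python
# PUE = total facility power / IT power, so (PUE - 1) * IT load is the
# cooling, power-distribution and lighting overhead. The 5 MW IT load is
# hypothetical, chosen only to show the scale of the improvement.

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Non-IT facility load implied by a given PUE."""
    return it_load_kw * (pue - 1.0)

if __name__ == "__main__":
    it_kw = 5_000.0  # hypothetical IT load
    austin = overhead_kw(it_kw, 1.46)   # Austin facility
    utah_p1 = overhead_kw(it_kw, 1.30)  # Utah Phase I
    print(f"Austin overhead:  {austin:,.0f} kW")
    print(f"Utah Phase I:     {utah_p1:,.0f} kW")
    print(f"Savings:          {austin - utah_p1:,.0f} kW per 5 MW of IT load")
```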

“The direct evaporative unit is difficult to control,” said Tom Dobson, Vice President of Holder Construction. “All of the changes you need to make on the control side turned out to be pretty tricky. It took us three years to optimize.”

There was good news as well. “It turns out the chiller plant was completely unrequired,” said Dobson.

2013: Phase II

As the first phase filled up, Oracle applied its “evaluate, pivot and adapt” process – and found plenty of opportunities to refine the design.

“I was committed to making things different for phase 2,” said Thrift. “We decided we wanted to put all the ideas on the table.”

Oracle organized a design charrette among its vendors, including Holder Construction and Glumac Mission Critical. The challenge was to build in flexibility as they evolved from the original design. The team decided on a number of changes:

  • The adoption of a “build as you grow” strategy, dividing the 30,000 square foot data hall into six modular phases to avoid stranded capacity. This approach also allows Oracle to adapt the design as it goes, adjusting for changes in server technology.
  • Shifting to an indirect evaporative cooling system, which Thrift noted was not as efficient as the direct cooling but offered other benefits, including a simpler control system requiring less fine-tuning, and no worries about the adverse effects of outside air. The system works well with Oracle’s supply air temperature (72 degrees) and supports a wide range of humidity.
  • The new cooling system also uses less water, which was an important sustainability priority. “Water is a huge resource issue in the western United States,” said Samuel Graves, Associate Principal, Glumac Mission Critical. “We had to help Oracle figure out how we can still use these great direct and indirect evaporative systems but use them smarter.”
  • The updated design shifts the air handlers to the roof and adopts a ductless cooling system featuring multiple “decks” of plenums that separate and transport hot and cold air. Supply air is dropped into the data hall, and from there is distributed to provide cooling for electrical rooms and telecom space. Racks are housed in a containment system, with a chimney system venting exhaust air from the hot aisle into the upper return plenum.
  • A flywheel UPS system replaces the batteries, conserving space and providing a “greener” solution. A flywheel is a spinning cylinder that generates power from kinetic energy and continues to spin when grid power is interrupted (a rough ride-through estimate appears in the sketch after this list). Although a flywheel offers a shorter ride-through time (about 20 seconds), it eliminates the need to replace batteries, as well as some challenges with “eco-mode” energy efficiency settings on the UPS system in Phase I.
  • In the power system, Oracle ran medium voltage to the building, with each modular pod/block having its own generator.
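
For a rough sense of where the roughly 20-second ride-through figure comes from, a flywheel’s usable energy is the kinetic energy released as the rotor slows from full speed to its minimum usable speed. The sketch below applies that relationship with illustrative rotor values; they are not the specifications of the UPS units installed in Utah.

```python
# A back-of-the-envelope estimate of flywheel UPS ride-through time. Usable
# energy is E = 1/2 * I * (w_max^2 - w_min^2); dividing by the load gives
# seconds of support, ignoring losses. All numbers below are illustrative.
import math

def ride_through_s(inertia_kg_m2: float, rpm_max: float, rpm_min: float,
                   load_kw: float) -> float:
    """Seconds of ride-through for a given load, ignoring conversion losses."""
    w_max = rpm_max * 2 * math.pi / 60  # rad/s
    w_min = rpm_min * 2 * math.pi / 60
    usable_j = 0.5 * inertia_kg_m2 * (w_max ** 2 - w_min ** 2)
    return usable_j / (load_kw * 1000)

if __name__ == "__main__":
    # Illustrative rotor: 32 kg*m^2, spinning down from 7,700 to 5,000 rpm
    # while carrying a 300 kW load, giving about 20 seconds of ride-through.
    print(f"{ride_through_s(32, 7700, 5000, 300):.0f} s at 300 kW")
```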

The end result: A PUE of 1.18, with capacity closely matched to demand and simpler operational controls. Here’s a cut-away overview of the facility and cooling airflow.

The Road Ahead

The Oracle team says the revisions to Phase II provide improved efficiency and sustainability, and the flexibility to adapt the design in future modules. Holder’s Dobson attributed the gains to Oracle’s process and the collaboration between vendors and service providers.

“This type of innovation doesn’t happen just by buying it through a contract,” said Dobson. “When we encountered questions, we accepted that the answer might be ‘I don’t know.’ But we decided that we were going to take the time, do the engineering and figure it out. As a result, we’ve got a pretty incredible project.”

Thrift said Oracle was pleased with the outcome, and believes the experience positions it for future design refinements in a changing data center landscape.

“These projects demonstrate the ability to learn and stay ahead of the industry,” said Thrift. “The data center of 10 years from now will be different in ways we can’t possibly imagine.”

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
