What if you could review detailed ratings on every aspect of a data center’s performance, searching for colocation space the same way you might evaluate cars, homes or restaurants? The Infrastructure Masons are creating a data center ratings system to serve as a guide.
The industry group has developed a scorecard for data center users, similar to the buyer guides provided by Consumer Reports, CarFax or a local department of health. Data center operators would provide letter grades based on their performance against key customer criteria. The goal is to create a standard methodology to streamline the colocation site selection process, while allowing data center operators to showcase their track records.
“What’s really important to our members is the performance of data centers,” said Dean Nelson, the founder of Infrastructure Masons. “There really is no standard way it’s measured today. There’s a whole lot of marketing out there, but there’s no real enforcement and no regulations. There are SLAs (service-level agreements) and audits of data centers, but the majority of the time you’re taking what the provider gives you because it’s all you’re going to get.”
This would raise the bar for providers, who will be asked to include self-reported rating data in their proposals and guarantee the accuracy of those claims by signing representation and warranty letters.
Current Tools Don’t Go Far Enough
Nelson says existing tools like the Uptime Tier System and the PUE efficiency standard are useful in evaluating providers and facilities, but only provide part of the picture.
“We want an actual measure of data center performance over time,” said Nelson, who heads the infrastructure team at Uber. “We don’t have that today.”
Nelson emphasizes that the development of the Data Center Performance Index (DCPI) is member-driven and in its early stages. Infrastructure Masons has conducted five meetings with its end user members to gather feedback on what the ratings should measure, and how. Colocation providers, consultants, engineering firms, and end users have all contributed feedback.
Starting A Conversation
“This is a work in progress,” said Nelson. “We want to start a conversation for the industry, which will affect both leased and owned data centers. This is about identifying the problem.”
Infrastructure Masons was founded last year by Nelson to chart a course for the fast-growing cloud economy. The group’s members have built more than $100 billion worth of data centers, and its ranks include leaders of the infrastructure teams at Facebook, Microsoft, eBay, Switch and Google.
The DCPI paper reviews the definitions and criteria used in the ratings. Data centers will be graded on a letter system, much like school report cards.
“We liken it to the health department scores for a restaurant,” said Mark Monroe, the Executive Director of Infrastructure Masons.
The Masons’ interest in a performance rating initially focused on availability, but has expanded to cover efficiency (based on PUE and WUE) and environmental attributes (greenhouse gas emissions).
The leading existing measure of data center reliability is the Tier system developed by the Uptime Institute (now part of 451 Group), which assesses data centers and assigns them one of four tiers based on the redundancy and configuration of mechanical and electrical infrastructure. The tier system has value, but doesn’t go far enough, according to Monroe.
“It’s a look backwards at the design and operations at a point in time, and doesn’t really reflect outages,” said Monroe, a veteran of Sun Microsystems and DLB Associates. “In the last 12 months, what’s the availability? We don’t know yet what the end point will look like, but the universal feedback is that we need something like this.”
Early discussions have sought to strike a balance between the metrics requested by users and feedback from colocation providers about creating fair evaluation criteria. Nelson expected that users would be enthusiastic about a rating system, but says colo companies also see it as an opportunity.
“That has been a pleasant surprise,” said Nelson, who noted that IM members are representing themselves rather than their companies, but said members from the provider sector have been actively engaged in the discussion. “We had people from five or six of the largest colo operators. They do want to show how they perform, and how they’ve done it. The feedback hasn’t been about having a rating, it’s been about how they are rated.”
Debate About Metrics, Renewables
One concern for colocation providers was the use of PUE (Power Usage Effectiveness) and WUE (Water Usage Effectiveness), which measure the efficient use of electricity and water, respectively. Both metrics are location-sensitive, creating a potential advantage for some facilities. The IM ratings will include climate zones to acknowledge this.
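To make the climate-zone adjustment concrete, here is a minimal sketch of how PUE and WUE are computed and how zone-adjusted letter grades might work. The DCPI’s actual ranges were still being finalized when this was written, so every threshold, zone name, and cutoff below is an invented placeholder, not a real DCPI value.

```python
# Illustrative only: PUE/WUE formulas are standard, but the grade cutoffs
# and climate zone names below are hypothetical placeholders, not DCPI values.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

def wue(annual_water_liters: float, it_equipment_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water used per kWh of IT energy."""
    return annual_water_liters / it_equipment_kwh

# Hypothetical per-climate-zone PUE cutoffs for grades A through D.
# A hot, humid zone gets more headroom than a cool, dry one, so a facility
# is not penalized simply for its location.
PUE_CUTOFFS = {
    "cool-dry":  [1.2, 1.4, 1.7, 2.0],
    "hot-humid": [1.4, 1.6, 1.9, 2.2],
}

def pue_grade(pue_value: float, climate_zone: str) -> str:
    """Map a measured PUE to a letter grade, adjusted for climate zone."""
    for grade, cutoff in zip("ABCD", PUE_CUTOFFS[climate_zone]):
        if pue_value <= cutoff:
            return grade
    return "F"

# The same measured PUE of 1.5 earns a B in a hot-humid zone
# but only a C in a cool-dry one.
print(pue_grade(1.5, "hot-humid"))
print(pue_grade(1.5, "cool-dry"))
```

The point of the zone table is exactly the providers’ concern: without it, operators in favorable climates would dominate the grade distribution regardless of how well they run their facilities.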
The environmental ratings were initially focused on the amount of renewable energy sourced by a data center. Service providers said this approach would penalize facilities in areas where the local utility doesn’t offer renewable options. Instead, the rating will track greenhouse gas emissions.
Perhaps the most detailed discussion was about measuring outages. Is an outage declared when a single data hall experiences downtime? Or only when an entire data center building loses power?
The current proposal uses availability zones – defined as “dedicated power, cooling and network connectivity shared by IT gear in a physically defined area of the data center.” Some providers may be graded on an entire building, others could have grades for multiple zones within a data center building, depending upon what infrastructure is shared or isolated.
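The arithmetic behind a per-zone availability grade is simple: uptime over a trailing 12-month window, tracked separately for each availability zone. The sketch below assumes that shape; the zone names, outage records, and the idea of printing a percentage are invented for illustration and are not part of the DCPI proposal.

```python
# Hypothetical sketch: trailing-12-month availability per availability zone.
# Zone names and outage data below are made up for illustration.

HOURS_PER_YEAR = 365 * 24  # 8760 hours in a trailing 12-month window

def zone_availability(outage_hours: list[float]) -> float:
    """Fraction of the trailing 12 months the zone was up."""
    return 1.0 - sum(outage_hours) / HOURS_PER_YEAR

# Each key is one availability zone: a physically defined area with its own
# dedicated power, cooling, and network connectivity. A building where all
# infrastructure is shared would report as a single zone.
outages = {
    "hall-1": [0.5],         # one 30-minute outage
    "hall-2": [],            # no outages in the last 12 months
    "hall-3": [4.0, 8.76],   # two outages totaling 12.76 hours
}

for zone, hours in outages.items():
    print(f"{zone}: {zone_availability(hours):.5%} available")
```

Scoping the calculation to zones rather than whole buildings answers the outage-definition question above: a data hall with isolated power and cooling carries its own record, so one hall’s downtime does not drag down the grade of its neighbors.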
Nelson said the paper outlines a framework, and Infrastructure Masons will continue to meet and hold conversations about details of the DCPI implementation, seeking to finalize the efficiency PUE and WUE ranges by climate zone, as well as environmental GHG emission ranges.
Designed for Colo Providers
The DCPI is being developed with colocation providers in mind, but “the belief is that this system could also be utilized by cloud and enterprise data centers,” especially for tracking and comparing performance of individual data centers across a fleet of facilities.
An issue is that the ratings data will be self-reported, with each provider developing and disclosing its grades. Nelson says the way to put teeth into the ratings is to include them in formal requests for proposal (RFPs).
“From an end user standpoint, we would put this in our RFPs,” he said. “A provider can do audits to verify their data.”
Providers would be required to include their DCPI evaluations in the representation and warranty letters included in response to RFPs. “The accountability is self-reporting,” said Nelson. “It’s about standing behind your numbers. The executive teams of these companies would now be more accountable.
“We don’t want people to be able to game the system,” he added. “We want it to all be measurable.”
An Internal Metric, But Maybe Not Forever
The process is designed so that the Infrastructure Masons can create the rating index and criteria, which is then requested by users and reported by colocation and cloud providers.
“What we want to do is influence what happens in the industry,” said Nelson. “We do not want to be a standards body. The only way this will be effective is if there’s a commitment from the companies procuring data center space. The vast majority of providers believe a rating system would be good.”
There are no plans for the Masons to play a role in regulating the process, or for any central reporting venue. But the group believes that DCPI grades could quickly gain visibility across the industry.
“PUE started out as an internal metric,” said Monroe. “But if you have a good PUE, you can publish that on your web site. This may turn out to be similar.”
Nelson believes the DCPI is one way the Infrastructure Masons can move the data center industry forward.
“The power of Infrastructure Masons is a group of minds coming together to tackle common problems,” he said. “I’m feeling pretty confident, because we’ve had validation from all types of end users. We believe this framework is useful and will be helpful. I think this will happen sooner rather than later.”