Weathering a Perfect Data Storm with Centers of Data Exchange

Oct. 14, 2019

Tony Bishop, Senior Vice President, Platform & Ecosystem Strategy at Digital Realty, explores how to navigate today’s explosion of digital data. Data centers are going through a digital transformation and being reimagined as ‘centers of data exchange,’ Bishop says.

The Perfect Storm of Digital Data

Everyone knows the best time to prepare for a storm is before it actually happens. But what happens when you end up right smack in the center? In a digital sense, we’ve been braving gusts of disruption for decades; that’s the nature of the internet. What’s different now is that technology innovation keeps accelerating, creating tidal waves of data, and every company has specific nuances and risks, exploited by data gravity, that can’t be addressed with a one-size-fits-all cloud concept. As customer-centric enterprises find themselves in this vulnerable position, IT leaders aren’t asking “why?” or “how?” They are proactively seeking strategies to weather this perfect data storm, remove data gravity barriers, and emerge with calm waters and clear skies.

Where is Data Taking Us?

As enterprises face an ever-increasing influx of data points, data gravity barriers will impede even the most mature IT infrastructure. Less than a decade ago, Dave McCrory introduced data gravity, the “idea that data and applications are attracted to each other, similar to the attraction between objects that is explained by the Law of Gravity. As data sets grow larger and larger, they become harder to move…it’s the gravity that moves to where the data resides.” While the opportunities data presents seem endless, it’s important not to underestimate the downside of data gravity barriers, which can prevent IT from accommodating global workflows that must address unique participant, application, information, and location-specific needs. The result: greater difficulty, cost, and risk in enterprise IT infrastructure design.
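
A quick back-of-envelope calculation makes “harder to move” concrete. The sketch below is a minimal Python illustration; the link speeds and the 70% effective-throughput factor are assumptions for the example, not figures from the article:

```python
# Illustrative back-of-envelope: how long it takes to move a data set over a
# dedicated link. Link speeds and the 70% effective-throughput factor are
# assumptions for illustration, not measurements.

def transfer_days(dataset_tb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Estimate days needed to move `dataset_tb` terabytes over a `link_gbps` link."""
    bits = dataset_tb * 8e12                      # terabytes -> bits (decimal TB)
    effective_bps = link_gbps * 1e9 * efficiency  # usable bits per second
    return bits / effective_bps / 86_400          # seconds -> days

for size_tb in (10, 100, 1_000):                  # 1,000 TB = 1 PB
    for link_gbps in (1, 10, 100):
        days = transfer_days(size_tb, link_gbps)
        print(f"{size_tb:>5} TB over {link_gbps:>3} Gbps: {days:7.1f} days")
```

At these assumed rates, a petabyte takes roughly four months to migrate over a 1 Gbps link; that inertia is exactly what data gravity describes.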

At the Center of Data Exchange

Businesses depend on the direct, private, and secure exchange of data, but what will prove paramount is the flexibility to respond rapidly to the shifting markets and changing customer behavior that drive critical demands on the business. Without the ability to scale at will, IT pays a steep price in performance, cost management, and security, and the challenges of legacy infrastructure surface as soon as it must scale to support global workflows.

Current Risks that Impede IT

The ability to mitigate data gravity barriers is critical to enabling globally distributed workflows. It’s not just the growth in data creation, usage, and exchange that matters, but also the challenges that arise from it, and they demand a shift. According to Gartner, the share of data processed outside a traditional centralized hub will increase from 10% today to 50% by 2022. With all this data migrating, being processed, and moving, IT leaders must prepare for challenges including:

  • Increasing costs. Building and maintaining data centers that can scale to the needs of the business carries rising capital and operational costs. Traditional build-and-manage-yourself models aren’t scalable, and not all workloads belong in the cloud, so strategically determining the right hybrid footprint for your needs is essential to long-term success.
  • Skills gaps. The more rapidly technology evolves, the greater the risk of a skills shortage. Without the expertise needed to mitigate data gravity barriers and exploit new opportunities, IT may be unable to adapt and scale at the pace the business requires.
  • Optimized locations. Locating data adjacent to network ingress/egress points is a major factor in reducing latency, as the rough estimate after this list illustrates.
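
To see why adjacency matters, consider propagation delay alone. The sketch below uses the standard rule of thumb that light in fiber covers roughly 200 km per millisecond; the example distances are rounded assumptions:

```python
# Best-case round-trip time over fiber, ignoring routing, queuing, and
# processing delay. Rule of thumb: light in fiber covers ~200 km per ms.

FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only RTT floor for a given one-way fiber distance."""
    return 2 * distance_km / FIBER_KM_PER_MS

# Rounded example distances (assumptions for illustration)
for label, km in [("same metro", 50), ("regional", 500),
                  ("cross-country", 4_000), ("intercontinental", 10_000)]:
    print(f"{label:>16} ({km:>6} km): ~{round_trip_ms(km):5.1f} ms RTT floor")
```

Every kilometer between data and its network ingress/egress point adds delay that no amount of compute can claw back.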

The Impending Fate of the Traditional Data Center

According to Gartner, 80% of enterprises intend to close their traditional data centers within five years; today, only 10% have done so. With this massive shift on the horizon, there is significant fallout to consider in the footprints needed to support it. There is also an ethical responsibility to approach this explosion of data sustainably. Digital Realty’s sourcing of 80MW of solar energy for Facebook, supporting its commitment to renewable energy and a smaller carbon footprint, is one example, but sustainability has to go deeper than one-off successes: it needs to be at the heart of strategy. By investing in purpose-built data centers, IT can accommodate data-intensive trends like artificial intelligence, blockchain, and IoT with an infrastructure designed to scale at will.

Additionally, digital transformation is forcing IT to re-architect toward a decentralized infrastructure. A decentralized setup removes data gravity barriers to accommodate distributed workflows, which vary by participant, application, information, and location-specific needs. Flexibility is key in this thinking, the kind of on-demand flexibility a SaaS model delivers.
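
As a sketch of what “varies by participant, application, information and location” can look like in practice, the hypothetical policy below tags each workflow with those four attributes and maps it to a deployment target. All field names and rules are illustrative assumptions, not a real product API:

```python
# Hypothetical placement policy keyed to the four attributes named above.
# Every name and rule here is an illustrative assumption.

from dataclasses import dataclass

@dataclass
class Workflow:
    participant: str   # e.g. "customer", "partner", "internal"
    application: str   # e.g. "analytics", "payments"
    information: str   # data sensitivity, e.g. "regulated", "public"
    location: str      # where the data is produced and consumed, e.g. "eu-west"

def placement(wf: Workflow) -> str:
    """Map a workflow to a deployment target (illustrative rules only)."""
    if wf.information == "regulated":
        return f"private-suite@{wf.location}"   # keep regulated data local
    if wf.participant == "partner":
        return f"exchange-hub@{wf.location}"    # meet partners at an interconnection point
    if wf.application == "analytics":
        return "public-cloud"                   # burstable, non-sensitive work
    return f"colo@{wf.location}"

print(placement(Workflow("partner", "payments", "regulated", "eu-west")))
# -> private-suite@eu-west
```

The point of the sketch is that placement is a per-workflow decision, not a single destination, which is exactly what a one-size-fits-all cloud concept cannot express.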

With 84% of enterprises now using multi-cloud, planning how to manage resources and deployments efficiently is critical. IT leaders will lean more heavily on the experience of data center providers who lead in planning centers of data exchange, achieving the right mix of on-premises and cloud-resident workloads, to solve data gravity challenges and ensure operational and economic excellence. Edge computing strategies will be needed to support evolving business requirements: micro data centers at the edge, plus a centralized core where a subset of data is still processed, a “mothership” of sorts. And while the core components of AI land in the mothership, both on-premises infrastructure and public clouds are deployed to enable AI initiatives.
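
One way to picture the micro-data-center/“mothership” split is a routing rule that keeps latency-sensitive inference at the edge and sends training back to the core. The function below is a hypothetical sketch; the 20 ms budget is an assumed threshold, not an industry standard:

```python
# Hypothetical edge-vs-core routing for AI workloads: inference stays near
# users, training aggregates at the centralized core ("mothership").
# The 20 ms latency budget is an assumption for illustration.

def route_ai_task(task: str, latency_budget_ms: float) -> str:
    if task == "inference" and latency_budget_ms < 20:
        return "edge-micro-dc"      # serve close to the user to meet the budget
    if task == "training":
        return "core-mothership"    # heavy compute and data aggregation, centrally
    return "regional-hub"           # everything else lands in between

print(route_ai_task("inference", 10))    # -> edge-micro-dc
print(route_ai_task("training", 500))    # -> core-mothership
```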

One could argue that the “death of the data center” prediction has been exhausted. Data centers are here to stay, but not in the traditional form we’ve come to know. They are going through their own digital transformation, reimagined instead as centers of data exchange, which remove data gravity barriers to accommodate distributed workflows that vary by participant, application, information, and location-specific needs.

To weather this perfect storm of digital data, IT leaders must adopt a strategy that places local network ingress/egress in close proximity to centers of data exchange, interconnected to digital ecosystems and tailored to business needs. Indeed, as the storm intensifies, forward-thinking IT leaders are diving head-first into IT architecture design.

Tony Bishop is SVP, Platform & Ecosystem Strategy at Digital Realty.

