Weathering a Perfect Data Storm with Centers of Data Exchange

Oct. 14, 2019

Tony Bishop, Senior Vice President, Platform & Ecosystem Strategy at Digital Realty, explores how to navigate today's explosion of digital data. Data centers are going through a digital transformation of their own and being reimagined as "data exchange centers," Bishop says.

Tony Bishop, SVP, Platform & Ecosystem Strategy, Digital Realty

The Perfect Storm of Digital Data

Everyone knows the best time to prepare for a storm is before it actually happens, but what happens when you end up right smack in the center? In a digital sense, we've been braving seismic gusts of disruption for decades; that's the nature of the internet. What's different now is that the pace of technology innovation keeps accelerating, creating tidal waves of data, and every company has specific nuances and risks exploited by data gravity that can't be addressed with a one-size-fits-all cloud concept. As customer-centric enterprises find themselves in this vulnerable position, IT leaders aren't asking "why?" or "how?" but are proactively seeking strategies to weather this perfect data storm, remove data gravity barriers, and emerge with calm waters and clear skies.

Where is Data Taking Us?

As enterprises face an ever-increasing influx of data points, data gravity barriers will impede even the most mature IT infrastructure. Less than a decade ago, Dave McCrory introduced data gravity, the "idea that data and applications are attracted to each other, similar to the attraction between objects that is explained by the Law of Gravity. As data sets grow larger and larger, they become harder to move…it's the gravity that moves to where the data resides." While the opportunities for using data seem endless, it's important not to underestimate the downside of data gravity barriers, which can prevent IT from accommodating global workflows that must address unique participants, applications, information, and location-specific needs. The result is increased difficulty, cost, and risk in enterprise IT infrastructure design.
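
McCrory's point that large data sets "become harder to move" can be made concrete with a back-of-the-envelope transfer-time calculation. The link speed and the 70% sustained-throughput factor below are illustrative assumptions, not figures from the article:

```python
def transfer_hours(dataset_tb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Estimate hours needed to move `dataset_tb` terabytes over a
    `link_gbps` gigabit-per-second link, assuming the link sustains
    only `efficiency` of its rated throughput (an illustrative 70%)."""
    bits = dataset_tb * 8e12                       # terabytes -> bits
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600

# At 10 Gbps, 1 TB moves in about 19 minutes, but a 1 PB archive
# takes roughly two weeks, so the applications come to the data.
for size_tb in (1, 100, 1000):
    print(f"{size_tb:>5} TB over 10 Gbps: {transfer_hours(size_tb, 10):,.1f} h")
```

The non-linearity is the whole story of data gravity: transfer time grows in lockstep with data size, so past a certain mass it becomes cheaper to relocate compute and services than the data itself.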

At the Center of Data Exchange

Businesses already depend on the direct, private, and secure exchange of data, but the flexibility to respond rapidly to shifting markets and changing customer behavior will prove paramount. If businesses can't scale at will, it can cost IT big time in performance, cost management, and security. Challenges with legacy infrastructure thus arise when scaling to support global workflows.

Current Risks that Impede IT

The ability to mitigate data gravity barriers is critical to enabling globally distributed workflows. And it's not just about growing data creation, usage, and exchange, but also the challenges that arise from it. A shift is underway: according to Gartner, data processed outside of a traditional centralized hub will increase from 10% to 50% by 2022. With all this data migrating, processing, and moving, it's critical that IT leaders prepare for challenges including:

  • Increasing costs. Building and maintaining data centers that can scale to the needs of the business carries increasing capital and operational costs. Traditional build-and-manage-yourself models aren't scalable, and not all workloads belong in the cloud, so strategically determining the right hybrid footprint for your needs is essential to long-term success.
  • Skills gaps. The more rapidly technology evolves, the greater the risk of a skills shortage. Not having the proper levels of expertise needed to mitigate data gravity barriers and exploit new opportunities may limit IT’s ability to adapt and scale at the pace needed.
  • Optimized locations. Locating data adjacent to network ingress/egress points is a major factor in reducing latency.
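
The location point above has a hard physical floor: signals in fiber travel at roughly two-thirds the speed of light, so distance alone sets a minimum round-trip time that no software optimization can beat. The distances below are illustrative, not from the article:

```python
def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds over fiber, where
    light propagates at roughly 200,000 km/s (about 2/3 of c).
    Real links add switching and queuing delay on top of this floor."""
    return 2 * distance_km / 200_000 * 1000

# Siting workloads near the network ingress/egress point lowers the
# floor on every round trip: ~1 ms at 100 km vs ~40 ms at 4,000 km.
for km in (100, 1000, 4000):
    print(f"{km:>5} km: {round_trip_ms(km):.1f} ms minimum RTT")
```

This is why proximity to centers of data exchange matters: a chatty workflow that makes dozens of round trips per transaction multiplies that per-trip floor accordingly.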

The Impending Fate of the Traditional Data Center

According to Gartner, 80% of enterprises intend to close their traditional data centers within five years; today, only 10% have done so. With this massive shift on the horizon, the footprints needed to support it carry significant consequences. There is an ethical responsibility to approach this explosion of data in a sustainable way. Digital Realty, for example, sourced 80 MW of solar energy for Facebook to support its commitment to renewable energy and reduce its carbon footprint, but the obligation goes much deeper than one-off successes: sustainability needs to be at the heart of strategy. By investing in purpose-built data centers, IT can accommodate the needs of data-intensive trends like artificial intelligence, blockchain, and IoT with an infrastructure designed to scale at will.

Additionally, digital transformation is forcing IT to re-architect toward a decentralized infrastructure. A decentralized setup removes data gravity barriers to accommodate distributed workflows that vary by participant, application, information, and location-specific needs. Flexibility is key in this thinking, the kind of on-demand flexibility that SaaS models provide.

With 84% of enterprises now using multi-cloud, planning how to efficiently manage resources and deployments is critical. IT leaders will lean more heavily on the experience of data center providers who lead in planning centers of data exchange, achieving the right mix of on-premises and cloud-resident workloads, to solve data gravity challenges and ensure operational and economic excellence. Edge computing strategies will be needed to support evolving business requirements: micro data centers at the edge, plus a centralized core, a "mothership," where a subset of data is still processed. And while the core components of AI end up in the mothership, both on-premises infrastructure and public clouds are deployed to enable AI initiatives.

One could argue that the “death of the data center” prediction has been exhausted. Data centers are here to stay, but not in the traditional way we’ve come to know them. Data centers are going through their own digital transformation and being reimagined instead as centers of data exchange, which remove data gravity barriers to accommodate distributed workflows which vary by participant, application, information and location-specific needs.

To weather this perfect storm of digital data, IT leaders must adopt a strategy that implements local network ingress/egress in proximity to centers of data exchange, interconnected to digital ecosystems and tailored to business needs. Indeed, as the storm intensifies, forward-thinking IT leaders are staying ahead of it by diving head-first into IT architecture design.

Tony Bishop is SVP, Platform & Ecosystem Strategy at Digital Realty.

About the Author

Voices of the Industry

Our Voice of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
