Evolving IT Strategies Require Flexible Architectures

March 25, 2022
Market dynamics demand that IT leaders evolve their data processing strategies with an eye toward more flexible architectures that move processing closer to the point of value. Learn more in this special report series courtesy of Belden.

Last week we launched a special report series exploring how high-speed fiber networks can future-proof distributed data centers, specifically looking at how data center growth has created complexity. This week, we’ll look at how those market dynamics are driving IT leaders to evolve their data processing strategies to create flexible architectures.

Get the full report

Taken together, the market dynamics discussed in our last article demand that IT leaders evolve their data processing strategies with an eye toward more flexible architectures that move processing closer to the point of value. Future strategies are likely to include a combination of on-site infrastructure, public and private clouds, colocation services, edge data centers in the field or inside telecommunication carrier sites, and unattended smart devices.

The location and type of computing that is used will be driven by the characteristics of the workload. For example, applications like autonomous vehicles and streaming video delivery demand sub-five-millisecond response times. In such cases, the processing is best distributed across multiple tiers, with stream processing at the edge, a mid-tier control plane managing multiple devices, and cloud servers aggregating and analyzing data at a high level.
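The tiered placement described above can be sketched as a simple routing rule: given a workload's latency budget, pick the nearest tier that can meet it. This is an illustrative sketch; the `Workload` type, tier names, and latency thresholds are assumptions for the example, not figures from the report.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: float  # required response time for this workload

def placement_tier(w: Workload) -> str:
    """Pick a processing tier from the latency budget (illustrative thresholds)."""
    if w.max_latency_ms < 5:
        return "edge"      # stream processing next to the device
    if w.max_latency_ms < 50:
        return "mid-tier"  # regional control plane managing many devices
    return "cloud"         # aggregation and high-level analytics
```

In practice the decision would also weigh cost, data gravity, and security, but latency is usually the binding constraint for applications like autonomous vehicles.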

This environment may include a combination of owned infrastructure, colocation services, carrier services, and public and private cloud. Operations and maintenance may be provided by dedicated staff and a network of service providers and contractors.

Other applications, such as real-time ad delivery and securities trading, require high-speed interconnection of the type provided by colocation services.

Networks will need to be segmented to allocate dedicated bandwidth to latency-sensitive processes, and multitiered compute fabrics will be deployed based on required response times, workload characteristics, security needs, and other factors.
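One common way to express that kind of segmentation is a declarative plan mapping traffic classes to network segments with reserved bandwidth and a QoS policy. The segment names, VLAN IDs, bandwidth figures, and QoS labels below are assumptions for the sketch, not specifics from the report.

```python
# Hypothetical segmentation plan: each traffic class gets its own VLAN,
# a bandwidth reservation, and a QoS treatment (all values illustrative).
SEGMENTS = {
    "latency-sensitive": {"vlan": 10, "reserved_gbps": 40, "qos": "strict-priority"},
    "bulk-analytics":    {"vlan": 20, "reserved_gbps": 10, "qos": "best-effort"},
    "management":        {"vlan": 30, "reserved_gbps": 1,  "qos": "assured-forwarding"},
}

def vlan_for(traffic_class: str) -> int:
    """Look up the VLAN that carries a given traffic class."""
    return SEGMENTS[traffic_class]["vlan"]
```

Keeping the plan as data rather than device-by-device configuration makes it easier to audit the bandwidth allocated to latency-sensitive processes as the fabric grows.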

While traditional online transaction processing workloads are likely to remain in the data center or private cloud, data may also traverse the network for such uses as analytics, reporting, and sharing with business partners. For companies that operate internationally, this will create new demands on network infrastructure as well as processing considerations driven by data sovereignty regulations and cost.
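For the data sovereignty considerations above, a minimal sketch is a lookup that routes data to a processing region permitted for its country of origin. The country codes, region names, and residency rules here are assumptions for illustration, not regulatory guidance.

```python
# Illustrative sovereignty-aware routing: map a record's country of
# origin to a region allowed to process it (rules are assumed).
RESIDENCY_RULES = {
    "DE": "eu-central",
    "FR": "eu-central",
    "US": "us-east",
}

def processing_region(country_code: str) -> str:
    """Return a region configured as compliant for this country of origin."""
    region = RESIDENCY_RULES.get(country_code)
    if region is None:
        raise ValueError(f"no compliant region configured for {country_code}")
    return region
```

Failing loudly on an unconfigured country is the safer default: shipping data to an arbitrary region may violate residency rules, and the error surfaces the gap in the policy table.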

Download the full report, Future-Proofing the Distributed Data Center with High-Speed Fiber Networks, courtesy of Belden, to learn more. In our next article, we’ll explore three key considerations for IT leaders future-proofing their infrastructure while creating those flexible architectures: resiliency, security, and performance. Catch up on the previous article here.

About the Author

Paul Gillin

Paul Gillin is a speaker, writer and technology journalist who has written five books and more than 400 articles on the topic of social media and digital marketing. A technology journalist for 25 years, he has served as Enterprise Editor of the tech news site SiliconAngle since 2014. He was founding editor-in-chief of B2B technology publisher TechTarget, and served as editor-in-chief and executive editor of the technology weekly Computerworld.


