Modernizing Data Management in the Media & Entertainment Industry

Aug. 2, 2018
In this edition of Voices of the Industry, Maile Kaiser, Vice President of Sales & Business Development for CoreSite in Los Angeles, explores the modernization of data management in the media and entertainment industry. 

Maile Kaiser, Vice President of Sales & Business Development, CoreSite

The average American adult now watches 36 hours of television per week, and more video content is uploaded in 30 days than the major US TV networks have created in 30 years.

But that’s just the tip of the iceberg. Between smartphone use, social media, and the proliferation of streaming services, the demand for digital content is multiplying exponentially. According to estimates, the mobile and online entertainment industry will top $300 billion by 2019, requiring nearly five times as much digital storage capacity by 2020. This means that in less than two short years, the fruits of all this digital growth will somehow have to be contained and maintained, or else.

Yet even though data storage, management, and content delivery are top priorities for media and entertainment companies scrambling to accommodate the digital market, many are still hesitant to leave behind traditional on-premises IT and infrastructure, with all its space constraints, cost concerns, and personnel requirements, in favor of a more modern data solution like colocation.

Why is that?

The good ol’ days of data 

It used to be that if you worked in media or entertainment, you handled data the old-fashioned way. Got a huge audio or video file to transfer? Just burn it to a disc, FedEx it to the production team in another location, and get on with your day. Except that whoever was on the receiving end would then have to copy the footage off the disc, load it into their editing tools, and compile it manually. Workplace nostalgia aside, the process on either end was usually arduous and unwieldy and wasted more time than many production teams could afford.

And there were more challenges.

As the velocity of digital content began to increase, so did the amount of space and resources needed to handle the extra computing power, including for capabilities like CGI and special effects. At the same time, data analytics was becoming so ubiquitous, so granular, and so crucial to business intelligence that capturing and analyzing every keystroke required more servers, greater security measures, and even more resources just to properly lock down, store, and parse the information.

Shipping a disc—or a whole hard drive for that matter—may have been standard operating procedure, but it no longer met the standards of cost-effectiveness and efficiency that companies had to reach in order to stay innovative and competitive. And in today’s entertainment landscape of on-demand, any-screen high-def content, waiting for a physical data package to arrive is no longer a viable option.

As times changed, however, it became possible to transfer data digitally with more ease and flexibility. Cloud computing increased both speed and network bandwidth and could be scaled up or down to meet changing needs.

The only problem is, digitally transforming a business overnight isn’t that easy. With massive file sizes to grapple with, data silos to navigate, and equipment investments to justify, the expense of moving to digital can be prohibitive and the process itself overwhelming. While going fully digital is the future goal for many companies, others have to worry about today. The show must go on, and uprooting current application storage and moving it to a cloud-based system without a strategic plan in place can backfire.

This raises the question: can colocation effectively bridge the gap between the known quantities of yesterday and the rapidly expanding needs of today?

Meeting today’s big data demands

A big part of the answer lies with those massive files themselves.

Colocation is a far more effective way to share huge data files: teams upload them over faster connections to a central repository where anyone with the right permissions can access them easily. It also provides platforms for creation and innovation. A data center can spin up new servers faster, giving production teams added compute capacity so they can use more tools in more ways to meet the ever-growing demands of end users.

Additionally, colocation improves the speed at which large media files are transferred and processed and the rate at which content is delivered, reaping benefits for companies competing with the speed of other content delivery networks. In fact, by 2019, 72% of all internet video traffic will cross CDNs, which puts enormous emphasis on production capacity and efficiency. And by connecting directly to those CDNs, as well as to a rich ecosystem of network and cloud provider partners within the facility, companies save on bandwidth and network connectivity costs.

It’s not just digital content creation that drives businesses to colocation. Data centers offer room to grow, eliminating many of the space constraints companies face when housing their own data onsite, and they provide essential solutions as an element of a hybrid data management strategy that includes both on-premises and cloud deployments. Colocation also reduces the heavy costs associated with purchasing and maintaining hardware, software, and other equipment, as well as the number of IT staff needed to manage the infrastructure.

And then there’s security. Data centers offer 24x7x365 security officers, perimeter fencing, biometrics, and key cards—capabilities that most M&E companies just don’t have the ability to obtain and operate in-house.

What makes faster speeds, cost reduction, and lower latency possible are direct connections to other environments, both public and private, within the same data center, along with the ability to create a hybrid environment (a combination of on-premises application storage and the cloud), which is quickly becoming an interim step for companies ready to move away from legacy IT and toward cloud-based products.

In essence, colocation makes it possible to cross-connect from one system or environment to another without a middleman, and to take advantage of flexible, secure cloud services without having to move all data and workloads.
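To make that model concrete, the sketch below assumes a hypothetical setup in which a production team keeps its master files on an S3-compatible object store hosted in the colocation facility and reaches it over a private cross-connect endpoint rather than the public internet. The endpoint URL, bucket name, and file paths are placeholders for illustration only, not a reference to any specific CoreSite product.

# Minimal sketch (hypothetical): push a large media asset to an S3-compatible
# repository reachable over a private cross-connect, then hand collaborators a
# time-limited, permission-scoped download link instead of shipping a drive.
import boto3
from boto3.s3.transfer import TransferConfig

# Placeholder endpoint for an object store inside the colocation facility,
# reached over the cross-connect rather than the public internet.
s3 = boto3.client("s3", endpoint_url="https://objects.colo.example.net")

# Multipart upload keeps multi-gigabyte video files moving in parallel chunks.
transfer_config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=8,
)

s3.upload_file(
    Filename="raw_footage/scene42_take3.mov",
    Bucket="production-masters",            # hypothetical bucket name
    Key="projects/feature-x/scene42_take3.mov",
    Config=transfer_config,
)

# A presigned URL lets an editor in another location pull the file for 24 hours
# without needing their own credentials on the repository.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "production-masters",
            "Key": "projects/feature-x/scene42_take3.mov"},
    ExpiresIn=24 * 3600,
)
print(url)

In a hybrid arrangement, the same client code could point at a public cloud endpoint for burst workloads while the master files stay in the facility, which is the kind of flexibility the cross-connect model is meant to preserve.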

Prepping for the future of digital content

The digital age we’re living through has opened doors to wider audiences, richer connections, and greater speeds of information exchange, but it’s also spawned very real challenges in how to effectively create, manage, and store more content.

The M&E industry has to keep up with increasing demand, and doing so means outsourcing data management and storage to a physical data center that can handle everyday power and cooling considerations, reduce equipment and human resource costs, increase speed and bandwidth, ensure reliable connectivity, and, perhaps most importantly, provide cloud-based options for application storage and work processes, giving companies the best of both the data center and cloud worlds.

At a time when media and entertainment matters most, colocation powers and secures digital content creation and distribution at scale so M&E companies can stay at the forefront of trends and not lag behind.

Maile Kaiser is Vice President of Sales & Business Development for CoreSite. 

About the Author

Voices of the Industry

Our Voices of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
