Seven Reasons Broadcasters are Flipping the Data Pipeline on its Head

Film, broadcast, and CDN media workflows are incredibly complex, and they are getting more so. With higher resolutions, higher shooting ratios, and more deployed cameras, the amount of data broadcasters handle is growing at an explosive rate. Add to that the multitude of formats required for distribution, on-demand delivery, and data-retention requirements, and you start to get a picture of this beastly workflow!

Until now, it has been customary for each step of the production and distribution pipeline to use separate, specialized hardware (particularly NAS systems). But this model is growing too costly and complex to manage at scale.

Media companies are looking for new ways to simplify the broadcast workflow to ease operational management, reduce costs, and open up new revenue streams.

A New Player in the Workflow

As file systems struggle at massive scale, object storage has become a go-to solution for organizations dealing with massive volumes of large, distributed files. Object storage is a great fit for digital media because of the type of data stored, how it is accessed, and the type of applications used to transform it.

You are probably thinking of object storage only as a cheap and deep archive. Certainly, object storage was developed to support cloud architectures at the lowest cost per terabyte, and cold storage services are generally not synonymous with high performance. Yet object storage on-premises has evolved to deliver performance levels further up the storage tier without losing any of the benefits such as cost-effectiveness, ease of management and massive scalability.

Flipping the Broadcast Data Pipeline on its Head

When we test Western Digital ActiveScale™ cloud object storage systems, we find they can deliver far better performance than customers are accustomed to from public cloud-based solutions, enabling workflows further up the storage tier. This level of performance makes them a great platform for staging, archive, and other tier-2 workloads, opening the door to consolidating multiple workloads onto a single, petabyte-scale system.

Yet perhaps the most interesting part of this development is how the data flow changes.

Suddenly staging and archiving happen at the same point. The two polar ends of the workflow merge into one. The pipeline is turned on its head, with multiple, diverse workloads running from a single platform, with extreme reliability.

[Figure: flipping the broadcast data pipeline]

So why are broadcasters doing this? Here are seven reasons broadcast companies are flipping the data pipeline on its head:

1. Massively Reduce Data Movement

The volume of data is exploding. Whether it’s HDR, higher shooting ratios or the abundance of formats required for end user streaming devices, the amount of both raw and compressed data that is created and needs to be retained is growing at unprecedented rates.

The challenge with the current model is that workflows require moving data between systems, formats, platforms and locations. This is extremely complicated from a management perspective and is a very costly operation.

Using a low cost, higher performance, petabyte-scale solution such as the ActiveScale cloud object storage system up front, as the first bucket, could potentially allow organizations to:

  • Archive raw footage as the first step, without having to move massive raw data through the pipeline.
  • Perform tier-2 workloads such as transcoding, captioning or color, from the same system without having to move the data.
  • Use very high-performance flash-based storage (such as IntelliFlash™) only for performance-intensive operations such as rendering or analytics, pulling only what you need to the expensive tier.
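The "pull only what you need" idea above can be sketched as a simple tiering policy. This is a hypothetical illustration, not ActiveScale functionality: the job types and asset keys are made up, and in practice the decision would be driven by your MAM or workflow orchestrator.

```python
# Hypothetical tiering policy: only performance-intensive jobs trigger a
# copy to the flash tier; tier-2 work (transcode, caption, color) reads
# directly from the object store where the raw footage was first archived.
PERFORMANCE_INTENSIVE = {"render", "analytics"}

def assets_to_promote(jobs):
    """Return the asset keys that must be staged on flash storage.

    `jobs` is a list of (job_type, asset_key) pairs.
    """
    return {key for job_type, key in jobs if job_type in PERFORMANCE_INTENSIVE}

jobs = [
    ("transcode", "raw/ep01.mxf"),  # runs against the object store
    ("caption",   "raw/ep01.mxf"),  # runs against the object store
    ("render",    "raw/ep02.mxf"),  # needs flash-tier performance
]
print(sorted(assets_to_promote(jobs)))  # ['raw/ep02.mxf']
```

Everything not selected by the policy stays put in the object store, which is the whole point: the bulk of the raw data never moves.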

2. Go Lean

Simplifying the workflow and consolidating hardware means you can run a lean organization: lower hardware and support costs, reduced management time, an end to overhead and over-buying, and no more rigid, costly legacy systems.

3. Use an API from the Get-Go

One of the advantages of object storage is that it is accessed through an API, and thus can serve as a perfect origin point for transcoding in a CDN or other distribution models.

Furthermore, cloud object storage systems like ActiveScale use the S3 protocol for seamless connectivity to the public cloud. Together with Unified Data Access, an NFS interface for ingest and management of data, you can use the same storage environment for mixed file and object use cases.
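Because assets already sit behind an S3-compatible API, a CDN can pull them directly as its origin. The sketch below just builds the virtual-hosted-style S3 URLs a CDN origin configuration would point at; the endpoint and bucket names are illustrative, not real ActiveScale addresses.

```python
# Hypothetical sketch: map an asset key to the S3 origin URL a CDN
# would fetch from. Virtual-hosted-style addressing puts the bucket
# name in the hostname.
def s3_origin_url(endpoint, bucket, key):
    """Return the virtual-hosted-style S3 URL for an object."""
    return f"https://{bucket}.{endpoint}/{key}"

url = s3_origin_url("s3.example-activescale.local", "broadcast-masters",
                    "vod/show/ep01/1080p.mp4")
print(url)
# https://broadcast-masters.s3.example-activescale.local/vod/show/ep01/1080p.mp4
```

No copy step to a separate origin server is needed; the object store itself is the origin.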

4. Multiple Geo-Locations Under a Single Namespace

ActiveScale was architected for multi-geo deployment for added resiliency. In this configuration, systems share a single namespace for data across different locations.

For broadcast workflows, even if ingest and transcoding happen at different physical locations, the data is shared among the systems under one single namespace. This creates a far more simplified workflow for networks that distribute content across multiple geographic locations.

5. DR Made (Really) Easy

In a multi-geo deployment, disaster recovery (DR) becomes extremely simple with an object storage system. Assets are recovered by simply pointing clients to a different endpoint. Unlike today's recovery processes, which require retrieval from a separate archive, this can be a non-disruptive part of the media workflow.
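Since the namespace spans sites, "pointing to a different address" really is the whole failover procedure. A minimal sketch, assuming two hypothetical site endpoints serving the same buckets and keys:

```python
# Hypothetical DR failover: the same bucket/key namespace is served
# from every site, so recovery is just selecting a healthy endpoint.
ENDPOINTS = [
    "https://site-a.example.net",  # primary
    "https://site-b.example.net",  # secondary
]

def active_endpoint(health):
    """Return the first healthy endpoint; objects resolve identically
    at any site because they share one namespace."""
    for ep in ENDPOINTS:
        if health.get(ep, False):
            return ep
    raise RuntimeError("no healthy endpoint available")

# Site A is down; clients transparently switch to site B.
print(active_endpoint({
    "https://site-a.example.net": False,
    "https://site-b.example.net": True,
}))  # https://site-b.example.net
```

No restore job, no tape retrieval: the data is already live at the surviving site.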

6. Focus on the Last Mile

Solutions such as ActiveScale are plug and play, easy to implement, and low touch to manage. In other words, they simply work. That means teams can focus their attention where it matters most: the last mile and the viewer experience.

7. New Revenue Opportunities

In today’s market, broadcasters need to find new opportunities to deliver content experiences and stay competitive. Adopting a massively scalable, low-cost solution that still delivers adequate performance was not possible until now. Suddenly, digital media files can not only be retained for longer periods (or forever), they are also always accessible and online. This opens a whole new array of opportunities for services around archive, on-demand viewing, and repurposing and licensing of existing content.

Simplifying the Broadcast Workflow

Object storage is no longer just the bucket at the end of the workflow. It’s a high-volume streaming storage solution that can meet the performance requirements of tier-2 workflows. Its ability to replace existing solutions, particularly for streaming workloads, is changing the broadcast pipeline from beginning to end, flipping it on its head. Archive is no longer a dormant endpoint but the first step in a new digital strategy.

Western Digital uniquely delivers solutions for every step of the workflow – from capture to creation to post-production, streaming, and archiving. Our family of trusted brands includes Western Digital®, G-Technology™, SanDisk®, WD® and Upthere™. Get to know our experts and let us help make your data thrive.

Erik Weaver is the Global Director of Media and Entertainment Market Development at Western Digital, specializing in cloud technologies.
