Erik Otto, CEO, Mediaproxy
The content chain for production and distribution in the modern television-streaming world is more streamlined and technology-based than ever before. But, argues Erik Otto, chief executive of Mediaproxy, this streamlining overlooks the area of deep archiving, which, as broadcasters look to keep everything they transmit, is now more crucial than ever.
Television broadcasting has always relied on a sequence of production stages. In the earliest days of live transmission this was relatively straightforward, with programs going direct from the studio to TV sets in the viewers’ homes. The advent of telerecordings and then videotape brought more complexity to the distribution process, which was further augmented as time went on by both more signal feeds within broadcasts and a greater number of transmitted channels.
Digital technologies enabled further expansion, bringing the TV sector to the point where the need for greater structure in how programs are distributed and stored has given rise to the digital media supply chain. Implemented in different ways by individual broadcasters, the digital media supply chain plays a crucial role not only in moving and delivering material but also in storing it throughout its lifecycle.
This last aspect is becoming increasingly important because broadcasters, streaming services and content owners now want to store everything indefinitely, with the added imperative of being able to find and retrieve a program or clip quickly and with as little effort as possible. The monitor, test and distribution aspects of the digital media supply chain are the main focus for a compliance logging and analysis technology developer like Mediaproxy. Our LogServer platform is a live recording and monitoring system used by leading broadcasters to ensure they comply with transmission and program-quality regulations, but we are now learning it is also being used as part of deep-archiving set-ups.
While we did not set out to design a library asset management system, we are accruing an archive because broadcasters keep the output streams recorded by LogServer beyond the retention period stipulated by regulators. The attitude appears to be "We just keep recording", regardless of what is officially required. As part of this, the overwhelming consensus is that people do not want to create massive on-prem storage silos but are instead moving assets into the cloud. What this creates is a hybrid scenario, something that is very much a buzzword today but which is becoming standard practice as broadcasters cherry-pick elements of different solutions to create the best archiving chain for their requirements. In such a situation, immediate compliance recording is handled by local on-prem retention while the deep archive sits in the cloud.
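The split described above can be sketched in a few lines. This is a minimal illustration, not Mediaproxy's implementation: the 90-day retention window, the file names and the tuple layout are all assumptions chosen for the example.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 90  # hypothetical regulatory retention window

def partition_recordings(recordings, now):
    """Split recordings into those kept on-prem (still inside the
    retention window) and those due for the cloud deep archive."""
    keep_local, to_archive = [], []
    for name, recorded_at in recordings:
        if now - recorded_at <= timedelta(days=RETENTION_DAYS):
            keep_local.append(name)
        else:
            to_archive.append(name)
    return keep_local, to_archive

# Example: one recording inside the window, one well past it
now = datetime(2024, 6, 1)
recordings = [
    ("ch1-20240520.ts", datetime(2024, 5, 20)),
    ("ch1-20240115.ts", datetime(2024, 1, 15)),
]
local, archive = partition_recordings(recordings, now)
```

In practice the `to_archive` list would drive an upload to object storage rather than a simple list append, but the hybrid logic is the same: recency decides where an asset lives.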
Doing this is much more practical and cost-effective today thanks to object-based storage. Many of the broadcasters and facilities that are now using this as a means of storing vast amounts of material are looking to our product, which they may already have as part of their playout or distribution set-up for compliance purposes, to be the front-end interface. We are often asked if our platform is able to operate with a specific provider or storage solution, to which the answer is “yes” because we take an agnostic position and do not endorse any particular system.
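The "agnostic position" mentioned above is essentially an interface question: if the front end only depends on a small set of storage operations, any provider can sit behind it. The sketch below is a generic illustration of that design, not Mediaproxy's API; the class names and in-memory backend are invented for the example.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Backend-agnostic storage contract: the front end needs only
    put/get, so any provider can implement it."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    # Stand-in backend for the example; a real deployment would
    # wrap a cloud object-storage service instead.
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

store: ObjectStore = InMemoryStore()
store.put("archive/ch1-20240301.ts", b"\x47\x00")
clip = store.get("archive/ch1-20240301.ts")
```

Swapping providers then means swapping the concrete class, while the compliance front end is untouched.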
For the user, the benefit of long-term, object-based storage working with a familiar control unit is that they can conveniently access media from a year or, potentially, years ago through a web interface or even from mobile devices. This allows them to review and search the content, enabling it to be retrieved and used for a whole variety of productions and applications, including news pieces, highlights reels and obituaries. Increasingly, we are also seeing archive clips posted on a broadcaster's social media feed to mark a particular day or anniversary. Posting historical content quickly to Twitter, Facebook or a website is very easy these days and helps maintain a media organization's profile. It also allows a broadcaster to engage with its audiences, as people comment on the clips.
Some broadcasters are still heavily criticized for their archiving policies – or lack of them – during the 1960s and 70s. Many episodes of programs from those times, which are now considered to be classics, are missing from the archives. The situation has improved considerably since then, with broadcast and production companies routinely retaining either the masters or copies of shows they have made or transmitted.
There are still some gaps, however, when it comes to other forms of programming, live news being a prime example. Traditional libraries will retain whatever has been ingested into the system, but this has not always applied to news and other as-broadcast material. The risk that important stories or records of significant events might not be saved for posterity has been recognized in some countries, and we are already supporting broadcasters there with these capabilities today.
At one time this would have been a mammoth task, involving hundreds of hours of videotape for initial capture and then either tape robots or spinning disks for long-term storage. On top of that, cataloguing everything so that items could be found later would have been an equally daunting prospect. Modern techniques and technologies have simplified such an operation considerably, with the on-air output recorded via a compliance system and then transferred to the cloud for deep archiving. Finding a specific item is relatively straightforward, using a simple date and time search.
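A date-and-time search over a continuously recorded output boils down to finding which archived segment covers the requested broadcast moment. The sketch below illustrates that lookup under assumed conventions: the segment names and six-hour spans are hypothetical, not how any particular product stores its recordings.

```python
from datetime import datetime

def find_segment(segments, wanted):
    """Return the archived segment whose time span covers the
    requested broadcast moment, or None if nothing was recorded."""
    for name, start, end in segments:
        if start <= wanted < end:
            return name
    return None

# Hypothetical index of one channel's recorded output
segments = [
    ("ch1-20240301-0000.ts", datetime(2024, 3, 1, 0), datetime(2024, 3, 1, 6)),
    ("ch1-20240301-0600.ts", datetime(2024, 3, 1, 6), datetime(2024, 3, 1, 12)),
]
hit = find_segment(segments, datetime(2024, 3, 1, 7, 30))
```

A real archive would keep this index in a database and hand the matching segment back through the web interface, but the principle is the same: time is the primary key of an as-broadcast archive.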
Realistically, it might not always be possible to retain every second of every broadcast, but through the implementation of hybrid compliance-cloud storage set-ups, broadcasters now have the best chance of delivering on the promise of an archive that is as complete as possible.