VisualOn – Produce High-Quality Video Remotely

Remote production has revolutionized the way video content is created and delivered in the modern era of digital streaming. With advancements in technology and connectivity, the concept of remote production has gained immense importance and popularity. By enabling teams to collaborate and produce high-quality video content from various locations, remote production offers unparalleled flexibility, cost efficiency, and scalability. From live events and sports broadcasts to film and television productions, remote production has emerged as a game-changing solution that maximizes efficiency, reduces expenses, and ensures seamless content delivery to audiences worldwide.

5G technology offers broadcasters unparalleled opportunities to deliver high-quality video content. By providing greater bandwidth, lower latency, and improved quality of service, 5G has made it possible for broadcasters to take their remote production capabilities to the next level. However, deploying and maintaining these services is still a challenging task that requires significant testing, troubleshooting, and debugging to identify playback issues and qualify devices and streams.

VisualOn, a leading provider of video streaming solutions with a global presence in research and development, operations, and customer support, understands the complexities involved in delivering high-quality video services. The process of assembling the necessary hardware, streams, and personnel in one location can be expensive and time-consuming. To address this, VisualOn has introduced the Remote Lab project, which is designed to enable remote teams to access physical devices with ease. This project combines software and hardware solutions to provide optimized audio, video, and multimedia applications.

The Remote Lab project solves the challenges associated with remote production by providing a streamlined solution for managing disparate teams, devices, and content. With ultra-low latency, this solution ensures that streaming service providers can offer a consistent quality of service across all viewing devices. This is particularly important given the growing demand for high-quality video content across a range of platforms, including smartphones, tablets, and smart TVs.

Moreover, with Remote Lab, streaming service providers can ensure that their content is tested and verified before it is released to viewers. This helps to prevent playback issues and ensures that the content meets the high standards of quality that viewers expect. Additionally, Remote Lab provides an efficient solution for training remote teams, enabling them to work together seamlessly and deliver exceptional results.

VisualOn Remote Lab allows for real-time remote testing and debugging, in-house research and development, and video quality assurance. It offers single-page web access to the video feed and device controls, low-latency configurations, immediate Android ADB execution alongside the video feed, and full remote-control functionality. It also supports mouse and keyboard control, recording and playback of tests, and full-screen viewing. Additional benefits include:

  • Real-time Remote Test and Debug. Bring together distinct teams, devices, and streams to find and resolve playback issues at high frame rates.
  • In-house Research and Development Center. Leverage remote resources to develop and test new services cost-effectively.
  • Remote Control Room. Set up separate lab instances in different locations and monitor them from a central control room.
  • Robotic Process Automation. Quick automated testing without needing to write test scripts.
  • Supports a Broad Spectrum of Video Platforms. tvOS and Android devices, set-top boxes, smart TVs, web browsers, and other streaming devices.
  • Real-time Scrubbing. Use the mouse to scrub through content in real time.

Remote Lab is a game-changing solution for remote production that streamlines the process of deploying and maintaining high-quality video services. By enabling remote teams to access physical devices easily, managing diverse teams, devices, and content becomes effortless, and streaming service providers can offer a consistent quality of service across all viewing devices.

Check out our latest interview with IABM TV. This session discusses VisualOn Remote Lab and highlights its benefits for OTT video service providers during this period of increased video consumption.

Simplestream – Live events orchestration: why a flexible CMS solution is integral to business success

Adam Smith, Founder & CEO, Simplestream

What are the biggest challenges in managing live event schedules with content provided by multiple operators? The answer can be rather straightforward, and it comprises several key aspects, mostly related to the pain points platform owners are facing today when distributing content across digital channels.

Firstly, ensuring that live content ingestion and integration are handled with the right set of tools is the foundation for flexibility when delivering multiple outputs. In a workflow that normally relies heavily on manual processes, with several teams involved, the quality and reliability of content provided by different operators need to be ensured throughout. And ultimately, scheduling and distribution across different touch points are a focus for successful platforms. With large numbers of stakeholders often deployed to manage the workflow, a permission-based system becomes necessary to support seamless operation without jeopardising the overall effectiveness of the platform.

Media Manager for live events orchestration

Can a backend solution with an easy-to-navigate UI make life easier? It certainly can, and that’s why at Simplestream we built our value proposition on the foundation of a flexible, modular content management system, Media Manager: a product that has progressively become a more powerful way for operators to own the entire workflow for their video content, both on demand and live.

Its most recent evolution, Live Scheduler, is a bespoke enhancement of the built-in module for live events and marks a new milestone in solving the challenges outlined above. It has adapted Media Manager to an even more flexible architecture, one that can also act as an orchestration layer that is fully self-service for the client. It gives platform owners of any size the opportunity to bring multiple event data management capabilities under the same roof, with additional benefits for monitoring, scheduling, and permissions across different teams.

A use case in the telecommunications space

A telco based in the North American region chose Simplestream to enhance an existing set of OTT workflows, including all the origin servers, APIs, and encoders. With no existing UI, and a backend system relying solely on manual processes, the organisation was limited in the number of content partners it could onboard, and looked for a solution that could be cost-effective and ‘self-service’. The objective was to streamline an otherwise cumbersome and error-prone process of manually inputting documentation to correctly set up live schedules for multiple events in sport, as well as in faith, news, and entertainment in general.

Thanks to Live Scheduler, every content provider can access an easy-to-navigate platform operating in a cloud-based, fully agnostic environment. The backend acts as middleware with a custom UI, with the added benefits of reducing the margin of error to a minimum and allowing the client to expand the number and scale of content providers without limit.

The added value naturally sits at the heart of the solution. On one hand, Media Manager acts as a bridge between the source of content and the output of the broadcast. On the other, teams of operators from third-party providers can seamlessly access the backend and see – at a glance – what the scheduling parameters are, ensuring a smooth operation. It was key for the client to be able to open a gateway for its content providers to fully manage their events (from their creation to updates, deletion and monitoring) against a pre-integrated API. The information is subsequently passed through the CDN for the final stages of the process before it gets delivered to IPTV or apps.

A permission-based structure provides the third parties with multiple tiers of access: from user groups auditing, to seamlessly monitoring the health and status of the service, with the ability to roll back to a previously auto-saved version of the schedule in case of mistakes.
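The auto-save and rollback behaviour described above can be sketched as a simple version store: snapshot the schedule before every edit, and restore the latest snapshot on demand. This is an illustrative Python sketch under stated assumptions, not Simplestream’s actual implementation; the `ScheduleStore` class and its method names are hypothetical.

```python
from copy import deepcopy


class ScheduleStore:
    """Minimal auto-save/rollback store for a live event schedule.

    Illustrative only: Media Manager's real versioning model is not
    public, so every name here is hypothetical.
    """

    def __init__(self):
        self._current = {}     # event_id -> schedule fields
        self._versions = []    # auto-saved snapshots, oldest first

    def update(self, event_id, **fields):
        # Auto-save a snapshot before applying a potentially risky edit.
        self._versions.append(deepcopy(self._current))
        self._current.setdefault(event_id, {}).update(fields)

    def rollback(self):
        # Restore the most recent auto-saved snapshot, if any.
        if self._versions:
            self._current = self._versions.pop()

    @property
    def current(self):
        return self._current


store = ScheduleStore()
store.update("match-01", start="18:00", channel="Sport 1")
store.update("match-01", start="19:00")  # operator mistake
store.rollback()                         # back to the 18:00 schedule
```

A real system would attach a user and timestamp to each snapshot so auditors can see who changed what, which is what a permission-based structure makes possible.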

On the horizon

Content aggregation is among the most debated topics in the streaming space today. The future might hold something big for platforms looking to simplify and accelerate content onboarding by becoming aggregators themselves. The aforementioned solution for the orchestration of live events goes hand in hand with the needs of big telcos, of course, but it appeals to media brands and rights holders in the live news, sports, betting, and entertainment spaces too.

Picture this: an operator wants to promote an innovative service that relies on user-generated content to populate daily programming schedules. Media Manager and its Live Scheduler can become a powerful ally for content creators (regardless of their experience), who can access the backend from anywhere with basic camera equipment or even just a smartphone and a data connection. It would allow them to spin up a service that’s cost-effective and suitable to run permanently or per event (for instance, multi-sport events with a set timeframe, shut down as they end).

As a next step in the evolution of such a product, we naturally find virtual channels with optional monetisation opportunities via ad insertion. Simplestream’s Channel Studio can support more innovative approaches to a deeper user experience. With multiple live events happening over one or more days, is there a chance to bring everything together in the same pre-built playlist of content? The answer, again, is yes, with the invaluable benefit of content providers adding any video on demand (VOD) asset to fill potential downtime, all managed in a queue system approved by the platform operators at a higher level.

 

Net Insight – Taking control of IP: Putting security back at the heart of media

Kristian Mets, Head of Sales Business Development at Net Insight

The IP media paradigm is loud and clear, changing what we used to know about producing and distributing live events and how we did business in the media world. The innovation potential is immense, bringing efficiency and agility to the media industry at an unprecedented scale. However, the transformation also brings challenges that need to be ironed out. Moving from closed and controlled to open IP-based workflows means network control and security become mission-critical capabilities, as media companies need to ensure their high-value content is protected.

When it comes to premium content, any mistake or network vulnerability can prove detrimental, both financially and reputationally. There is no room for compromise —  media companies need to boost their network control and security to ensure they make the most of IP without caveats.

Closing the closed media interface chapter

SDI has been a big chapter in media, powering the secure transport of uncompressed and unencrypted video feeds.  However, closed interfaces can no longer meet the requirements of today’s complex and content-hungry media landscape. Consumers are after higher-quality viewing experiences, including rich and high bandwidth UHD-4K and UHD-8K video formats. The transition to flexible and scalable workflows is not optional but a driver of competitive advantages that can define the winners in a fast-evolving media market.

IP isn’t purpose-built for media. This means that media companies need the right technology solutions to ensure the control, security, and quality of high-value content. Using IP to transmit video, audio, and data means traffic crosses different network domains, links, and ports. All these entry and exit points need to be controlled to govern which types of IP media traffic can pass through the networks and which streams can go in and out of each network domain.

Due to its nature, IP media is vulnerable to various types of security risks, including ‘internal’ and ‘external’ threats. Even ‘secure’ IP media traffic can cause serious network problems. For instance, if the content isn’t configured correctly, it can flood the network and cause packet loss, jitter, and delay. At the same time, there is always the risk of ‘human error’. For example, a camera setting could be misconfigured and the connection mistakenly defined as high-definition 4K; the resulting IP media flow going into a switch could bring the entire event network down.

These errors can jeopardize the live broadcasting of high-value content, leading to poor viewing experiences for consumers and revenue loss for media companies. Media organizations need complete visibility and control of their IP media traffic to enhance the security of their most valuable content.

Media-native security without compromises

While network control and security are key, there are different routes that media companies consider. However, not all security solutions can provide the benefits media organizations need. For instance, generic enterprise IT firewalls and solutions can’t meet the reliability, high-bandwidth, and low-latency requirements of any organization moving video over IP. Additionally, because of IP media networks’ stringent bandwidth, latency, and robustness requirements, enterprise firewalls dramatically increase costs. Media companies would need more expensive firewalls to secure high-bandwidth networks, which could lower the quality of video streams, deteriorating the overall user experience.

This is why the media industry needs ‘media-native’ security solutions. A key alternative network security model leverages the reliability and capabilities of a Real-Time Transport Protocol (RTP) media proxy that can deliver both high security and super high-quality video.

The RTP media proxy would terminate a media flow at the network boundary and re-establish it at the destination network without disrupting other active IP media flows. In this way, it boosts the security of the overall media network by preventing outages and security risks, including hijacking and spoofing. At the same time, the RTP media proxy preserves the integrity of established flows. The solution enables monitoring and assurance of the IP media payload as it passes between networks, using ETSI TR 101 290 Priority 1 performance metrics along with detection of frozen frames and audio silence. Media companies can enjoy the full protection of their IP media network without risking the quality of the video signals transported.
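The terminate-and-re-originate step can be illustrated with a minimal sketch that validates an inbound RTP packet header against an allow-list and rewrites its identifiers before forwarding. This is an assumption-laden illustration, not Net Insight’s product code: the allow-list policy and the `RtpProxy` name are hypothetical, and a real proxy would also handle RTCP, timing recovery, and redundancy.

```python
import struct

# Fixed 12-byte RTP header (RFC 3550): V/P/X/CC, M/PT, sequence,
# timestamp, SSRC.
RTP_HEADER = struct.Struct("!BBHII")


class RtpProxy:
    """Sketch of the boundary function of an RTP media proxy:
    terminate an inbound flow, validate it, and re-originate it with
    the proxy's own identity. Hypothetical names, illustrative only."""

    def __init__(self, allowed_ssrcs, allowed_payload_types, out_ssrc):
        self.allowed_ssrcs = allowed_ssrcs
        self.allowed_payload_types = allowed_payload_types
        self.out_ssrc = out_ssrc
        self.out_seq = 0

    def forward(self, packet: bytes):
        """Return the re-originated packet, or None if rejected."""
        if len(packet) < RTP_HEADER.size:
            return None
        vpxcc, mpt, seq, ts, ssrc = RTP_HEADER.unpack_from(packet)
        if vpxcc >> 6 != 2:                       # RTP version must be 2
            return None
        if mpt & 0x7F not in self.allowed_payload_types:
            return None                           # unexpected media type
        if ssrc not in self.allowed_ssrcs:
            return None                           # unknown source: anti-spoofing
        # Re-originate: the destination network only ever sees the
        # proxy's own SSRC and sequence numbering, never the source's.
        self.out_seq = (self.out_seq + 1) & 0xFFFF
        header = RTP_HEADER.pack(vpxcc, mpt, self.out_seq, ts, self.out_ssrc)
        return header + packet[RTP_HEADER.size:]
```

Dropping unknown SSRCs at the boundary is what prevents a spoofed or misconfigured flow from ever entering the destination network, while established flows continue untouched.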

In addition, media organizations can drive cost efficiencies as these media-native solutions minimize costs compared to traditional IT firewalls and are easy and seamless for media companies to deploy.

Security through a ‘media-first’ lens

IP delivers the innovation the media industry needs to push boundaries and thrive — from unprecedented agility and scalability to new revenue streams. However, open interfaces require a security layer to ensure media networks and high-value content are protected and controlled to the highest standards.

Due to the nature of the media industry, media technology often needs a different approach to meet industry standards and Service Level Agreements (SLAs), particularly around the high-bandwidth and low-latency requirements.

Traditional enterprise IT firewalls are not fit for purpose and can’t deliver seamless, high-quality IP transmission, raising infrastructure costs and deteriorating the broadcasting-grade quality of video feeds. On the other hand, media-native solutions like RTP media proxies are born out of media needs and can deliver the security model of the IP era.

Media companies need to be strategic about their control and security needs. A ‘media-first’ approach removes the challenges of IP adoption. With security challenges resolved, media organizations can focus on what they do best — delivering exceptional and exciting viewing experiences.

Perifery – Bringing AI innovation to the Edge

Jonathan Morgan, Product and Technology, Perifery

The media industry is experiencing the transformative impact of AI and ML technologies. These innovations have revolutionised various aspects of content creation, distribution, marketing, and monetization.

AI on entertainment platforms has led to a host of benefits, including data-driven enhanced efficiency, better personalization, and more informed program and content decision-making capabilities. AI in media production and post-production has enhanced light-ray rendering capabilities and can even edit a production using prescribed user preferences. In sports, AI editing can go as far as producing whole-game highlight reels. In archives, semantic AI can discover scenes with car chases or even romantic scenes. There is no longer a debate about whether AI will happen; it is here, and it is here to stay.

Putting the User at the Heart of Decisions

By analysing viewer behaviour and demographics, entertainment platforms can leverage data to inform how they plan original production and content creation. They can tie these insights into their marketing strategies, and target audiences much more accurately. With a better understanding of customer preferences and optimised content delivery platforms, not only does revenue generation improve, but the audience also benefits from a better service.

Tracking viewer analytics and engagement means companies gain even deeper insights into audience behaviour, preferences, and sentiment. Personalised user experiences can focus on individual patterns, with content recommendations, targeted advertising, and tailored experiences. This customisation enhances user satisfaction and inspires loyalty, which in turn reduces platform churn.

The Important Work Behind the Scenes

So, with AI and ML algorithms increasingly being utilised to produce, curate, and optimise content on platforms, where does storage and asset management fit in? Brand new, original productions are certainly a big draw for viewers. But there is also a huge selection of big-ticket archive content that can help media companies increase market share in a very competitive landscape. However, the challenge comes when the internal resources needed to edit and process that content outweigh its commercial value.

Content owners could have years’ worth of jaw-dropping nature documentary footage that has never been appropriately tagged with the corresponding metadata. It might be visually stunning, but if editors and post-production teams need to spend hours scrubbing through footage to find specific clips, then it’s not practical to repurpose it. With newer content, the metadata might be assigned, but how useful is it for every instance? If the original content was tagged with basic information such as the season, episode, and a handful of keywords, then marketing teams will have a very tough time collating clips. A promotional team’s needs are totally different: they might require a collection of hilarious one-liners from a specific character to promote a new series acquisition, and an AI-generated highlight reel might be needed to fit the bill.

AI and Automation in Media Workflows

Automated workflows, intelligent content tagging, and video editing tools powered by AI can significantly speed up post-production timelines, optimise resource allocation, and lower costs. Content creation is ultimately a mix of both repetitive and creative tasks. Wherever possible, the vendor ecosystem should be looking to reduce mundane actions and leave media professionals with more time for creative decision-making and collaboration. So, with AI and ML technologies transforming the media industry, is there a downside?

Unfortunately, reliance on the public cloud means that upload times, security concerns, and cloud processing fees have had a negative impact, limiting what work can be done. The principle of “data gravity” argues that it is far less costly and time-consuming to bring the application to where the data is, rather than moving large video files around. From a production perspective, much can be gained from processing content closer to the dispersed locations where it is generated, potentially reducing the quantity and increasing the quality of the content that is eventually placed into a centralised system or the cloud. In post-production, teams may need to access and collaborate on content, both onsite and remotely, from anywhere in the world.

Moving AI Beyond Cloud Boundaries

Many AI-embedded products are cloud native, which has meant that entertainment providers have been stung by unexpected public cloud costs and astronomical processing and egress fees. Unpredictable expenses and complex DevOps processes mean it’s understandable that some companies are hesitant to use AI or have stopped using it altogether. While AI services have focused on providing content creators and owners with the ability to use embedded products in the cloud, edge computing has been underutilised.

Edge computing offers improved security and added efficiencies for media and entertainment use cases. Processing data at the edge means valuable content can be securely stored and accessed locally, reducing the risk of data breaches or unauthorised access. Entertainment providers need streamlined workflows that save time and reduce bandwidth requirements, and this is where intelligent edge workflows excel. By splitting processing between the cloud and the edge, media teams can take the path of least resistance: workflows happen wherever they make the most sense. Instead of assigning significant financial resources to processing all content in the cloud, media companies can now perform many preprocessing functions at the edge.
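The edge/cloud split described above can be sketched as a simple routing decision: tag content locally with whatever model runs at the edge, then move only the assets worth centralising, shipping metadata alone for the rest. Every name here (`edge_preprocess`, `tagger`) is hypothetical and illustrative, not a Perifery API.

```python
def edge_preprocess(assets, tagger, confidence_threshold=0.8):
    """Tag content at the edge and decide what to send to the cloud.

    `assets` is a list of dicts with "id" and "frames" keys; `tagger`
    stands in for any local model returning (tag, confidence) pairs.
    Returns (assets worth uploading, metadata-only records).
    """
    to_upload, metadata_only = [], []
    for asset in assets:
        tags = tagger(asset["frames"])  # inference runs locally, at the edge
        asset["tags"] = [t for t, conf in tags if conf >= confidence_threshold]
        if asset["tags"]:
            # Confidently tagged content is worth centralising.
            to_upload.append(asset)
        else:
            # Nothing notable: ship only a lightweight metadata record,
            # saving upload bandwidth and cloud processing fees.
            metadata_only.append({"id": asset["id"], "tags": []})
    return to_upload, metadata_only
```

The design choice is the point: the expensive step (model inference on raw frames) happens where the data already lives, and only its small output crosses the network.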

AI and the Edge – The Best of Both Worlds

The applications for AI-enabled functions in M&E are extensive. AI can analyse, categorise, and tag files, based on the content, metadata, and context. This ensures faster, more efficient retrieval and better management of large volumes of media. AI-powered algorithms can quickly identify objects, scenes, faces and text within media files, facilitating accurate content indexing. With automated descriptive metadata generation for media files, AI algorithms can extract relevant information such as object identification, scene description, location and timestamps.

By handling AI at the edge, we can reduce the load on central or cloud storage by optimising file formats, performing rough editing, and extracting information that is required immediately. AIOps can further enhance IT workflows by allowing natural language to be used alongside AI-enhanced predictive data movements. Meanwhile, cloud AI can be used to run algorithms that have not yet moved out to the edge, or on data that has already been moved from the edge devices without pre-processing, such as a historical archive.

A Perfect Partnership

Edge-based media content production is the ideal partner for AI. It enables easy execution of AI functions and ensures predictable costs. The edge enables more efficient resource utilisation, particularly in scenarios with fluctuating workloads, which is a key feature of the M&E industry.

Edge computing delivers both scalability and cost efficiency. By distributing computing resources across the network edge, content owners can dynamically scale their infrastructure based on requirements. By splitting processes between the cloud and the edge, media companies can leverage fast AI-enabled pre-processing, both onsite and remote. The latest developments in AI are hugely impressive, but they need the right infrastructure environment to perform optimally and really streamline workflows. After all, an AI house can only be as strong as its foundations.

Witbe – Why Smart TV Apps Need Automated Testing and Monitoring

Mathieu Planche, CEO at Witbe

Viewers today have more on-demand content available to watch than ever before, and more ways to watch it as well. In the past, families crowded around a television to watch whatever was on. Now, if someone wants to enjoy a movie on Netflix, they can do so on their laptop, phone, tablet, gaming console, traditional set-top box (STB), or an OTT device, like Apple TV, Amazon Fire Stick, or Google Chromecast.

Yet the television still remains an important part of viewing experiences today. Smart TVs that connect to the internet and allow users to watch streaming apps are overtaking traditional TVs. Hub Entertainment Research Group reported in April 2023 that 204 million American households now own a Smart TV, breaking the 200 million threshold for the first time ever. The study found that eight out of every 10 TV-owning households own a Smart TV specifically.

Smart TVs eliminate the need to connect an STB or OTT device to stream content, even though there may be user benefits to doing so. The simplest tech solution is often the one that users will choose. If a Netflix app is available to watch directly on their television, most users will opt for watching that instead of connecting a different device to their Smart TV just to achieve the same result. However, Smart TVs can introduce a few challenges to video streaming.

Challenges of Streaming Apps on Smart TVs

For video service providers, Smart TV apps pose a unique challenge. Since they are so immediately accessible for viewers, they are bound to be popular. Users expect the same performance and experience they’ve grown used to on mobile platforms and STBs. With a wealth of content available, if one app crashes or delivers poor performance, viewers are quick to switch to a competitor’s app instead.

Complicating things, video service providers cannot control the platform or device their Smart TV app is hosted on. Most Smart TVs feature their own unique operating system, meaning an app has to be developed to work specifically on each Smart TV.

Developing a capable Smart TV app is only the beginning. Smart TV manufacturers can and will push a new update at any time without giving video service providers a heads-up. Smart TV apps may react differently to the new build than to the previous one they were optimized for. Video service providers need to approach this challenge with constant monitoring and an ability to roll with the punches.

Another common issue with streaming content on Smart TVs is ensuring that content is properly indexed. When video service providers release a new show, is that show immediately searchable from the main Smart TV menu? Will the search results direct users to watch it on their service or a competitor’s? Does voice control search, a common feature in Smart TVs, yield the same results?

Indexing is an issue that impacts third-party content providers whose assets are available on other streaming services, as well as video service providers. Third-party content providers should check if their content is being delivered properly to these Smart TV apps, and then being indexed into the TV OS correctly. The performance of the services hosting their content reflects back on their brand reputation. As always, verification is key.

Why Testing and Monitoring is Critical

The only way to ensure strong performance for apps on Smart TVs is through testing and monitoring. With Smart TVs, video service providers should be monitoring their own app’s performance, the performance of other popular third-party apps, and how all their content is being integrated and indexed in the OS. This will ensure a functional ecosystem where the app can be properly received by viewers at home.

Measuring the quality of experience (QoE) that viewers receive on their Smart TV is essential for delivering consistently strong performance. To understand their true QoE, video service providers should test and monitor directly on the same Smart TVs and networks that viewers are actually watching apps on.

Of course, this is a tall order to achieve manually. Most testing teams don’t have the workforce to monitor performance continually, 24 hours a day, or to search for every single content asset on the Smart TV OS and verify the results. When you consider that most Smart TVs require their own unique apps that need testing, and that each of those app builds will need to be tested again with every new OS update, it’s clear that some technological assistance is required.

This is why reliable test automation is essential for video service providers. It’s the most effective way to test and monitor app performance across updates, to verify that content is accessible through the TV OS, and to respond immediately when service errors occur. Automated testing and proactive monitoring are the best solutions to help video service providers understand their true QoE on Smart TVs.
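As a rough illustration of what automated monitoring reduces to, the sketch below aggregates repeated app test runs into alertable QoE metrics. The run schema, field names, and thresholds are hypothetical assumptions for this example, not Witbe’s actual API.

```python
from statistics import mean


def summarize_qoe(test_runs, max_startup_s=3.0):
    """Reduce automated Smart TV app test runs into QoE metrics.

    Each run is assumed to be a dict such as
    {"startup_s": 1.8, "playback_ok": True}; the schema is hypothetical.
    """
    failures = [r for r in test_runs if not r["playback_ok"]]
    slow = [r for r in test_runs if r["startup_s"] > max_startup_s]
    return {
        "runs": len(test_runs),
        "error_rate": len(failures) / len(test_runs),
        "avg_startup_s": round(mean(r["startup_s"] for r in test_runs), 2),
        "slow_starts": len(slow),
        # Any failure or slow start triggers a proactive response
        # instead of waiting for viewer complaints.
        "alert": bool(failures or slow),
    }
```

In practice such a summary would be computed per device model and per app build, so a regression introduced by a new TV OS update stands out immediately.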

Conclusion

For many viewers, the best way to watch their favorite content on a big screen is whichever way requires the least set-up. Despite all the features and effort put into OTT devices, plenty of users are happy just to watch the app already included on their Smart TV. Video service providers can deliver a superior QoE and fully integrated content by embracing automated testing and proactive monitoring. Since Smart TVs are the industry standard, investing in app performance is essential — and reliability and efficiency should be the ultimate goal.

Take 1 – Dubbing Hell’s Kitchen for a FAST channel: an innovative approach to localization

The FAST (Free ad-supported television) channel landscape has been growing at pace in the USA. The whole FAST industry generated $2.1 billion in the US in 2021 and is set to double that total in 2023.

It’s easy to understand why. From an audience perspective, in the USA, FAST is easily accessible and free at the point of use. This offers an attractive alternative to the cable subscriptions that heavily dominate the market. For content owners, FAST channels can reach new audiences and offer new monetization opportunities.

With widespread growth of FAST channels outside of the USA, content owners are starting to look for new international opportunities and establish FAST partnerships in Europe. Now, with Germany and Brazil predicted to be the next largest global FAST regions, some of these international opportunities require a new approach to dubbing and localization solutions.

A case in point is ITV Studios’ recent requirement to localize 18 seasons of Hell’s Kitchen USA for their FAST channel launch in Germany.

An international hit

Hell’s Kitchen USA is one of ITV Studios’ most successful shows, with 21 seasons airing between 2005 and 2023. Internationally, the series has been syndicated in 25 territories including France, Russia, Lithuania, Belgium, Italy and Thailand. In Germany, Sat.1 commissioned a local version of the format in 2014, produced by ITV Studios Germany and hosted by Michelin-starred chef Frank Rosin. According to Parrot Analytics, Hell’s Kitchen USA has a higher audience demand than 99.6% of all reality titles in Germany.

With these stats, it’s clear to see why Hell’s Kitchen USA was one of the flagship programmes selected for their German FAST channel launch. They needed a long-tail localization strategy with a flexible partner.

A FAST approach to localization

ITV Studios is partnering with Take 1 to provide the German dubbing of Hell’s Kitchen USA for their new German FAST channel launch. For this project, the Take 1 team is working collaboratively with ITV Studios to deliver high-quality dubbing at a competitive price – using innovative and flexible workflows to drive down cost. For example, Take 1 secured ‘the German Gordon Ramsay’ as key voice talent for the dubbing project to maintain brand consistency in the region, but is reducing talent costs by using a small pool of actors to voice the full set of contestants across the seasons.

Similarly, the Take 1 team is taking advantage of the fact that voice-over dubbing is quite common in Germany. Unlike lip-synch dubbing, this technique keeps the original-language dialogue at very low levels under the translated voice-overs and requires less studio time for recording. In addition to dubbing services, Take 1 also provides forced-narrative subtitles to translate on-screen graphics or written content that isn’t explained in the narrative.

One area where Take 1 has no room for compromise is quality control. With FAST channel content often drawn from deep archives, where multiple versions of each episode may exist in a variety of formats, finding the correct version for a particular use can be challenging. However, with quality control checkpoints throughout Take 1’s process of subtitling, scripting, dubbing, post and layback, any issues are picked up and rectified before delivery.

“A great first dub commission between ITV Studios and Take 1, to localize the hugely successful Hell’s Kitchen USA into German and release it into the FAST world. Tremendous attention to detail was needed along the way due to some challenging legacy distribution assets, which ensured the best possible version is released in the market,” said Darren Summers, Localization Manager, ITV Studios.

An ongoing project

With eighteen seasons to dub, localizing Hell’s Kitchen USA for ITV Studios’ FAST channel in Germany is an ongoing project. No doubt during this time we’ll see significant changes in the FAST channel approach as it matures. We’re excited to gain some early insights into how FAST is working internationally. It will be fascinating to learn which genres work well and which localization strategies produce the highest ROI. It’s clear that those working in the localization space will need to be disruptive and agile to meet these growing needs as the FAST sector evolves.

Get in touch to talk about localizing your FAST channel content.

Mediaproxy – Broadcasting forever

Erik Otto, CEO, Mediaproxy

The content chain for production and distribution in the modern television-streaming world is more streamlined and technology-based than ever before. But, argues Erik Otto, chief executive of Mediaproxy, this overlooks the area of deep archiving, which, as broadcasters look to keep everything they transmit, is now more crucial than ever.

Television broadcasting has always relied on a sequence of production stages. In the earliest days of live transmission this was relatively straightforward, with programs going direct from the studio to TV sets in the viewers’ homes. The advent of telerecordings and then videotape brought more complexity to the distribution process, which was further augmented as time went on by both more signal feeds within broadcasts and a greater number of transmitted channels.

Digital technologies enabled further expansion, to the point the TV sector is at today where the need for greater structure in how programs are distributed and stored has resulted in the digital media supply chain. Implemented in different ways by individual broadcasters, the digital media supply chain plays a crucial role in not only moving and delivering material but also storing it throughout its lifecycle.

This last aspect is becoming increasingly important because broadcasters, streaming services and content owners now want to store everything indefinitely, with the added imperative of being able to find and retrieve a program or clip quickly and with as little effort as possible. The monitor, test and distribution aspects of the digital media supply chain are the main focus for a compliance logging and analysis technology developer like Mediaproxy. Our LogServer platform is a live recording and monitoring system used by leading broadcasters to ensure they comply with transmission and program quality regulations, but we are now learning it is being used as part of deep archiving set-ups as well.

While we did not set out to design a library asset management system, we are accruing an archive due to broadcasters keeping the output streams recorded by LogServer beyond the retention period stipulated by regulators. The attitude appears to be “We just keep recording”, regardless of what is expected officially. As part of this, the overwhelming consensus is that people do not want to create massive storage silos on-prem but instead are moving assets into the cloud. What this creates is a hybrid scenario, something that is very much a buzzword today but which is becoming standard practice as broadcasters cherry-pick elements of different solutions to create the best archiving chain for their requirements. In such a situation, immediate compliance recording is handled by local on-prem retention while the deep archive sits in the cloud.

Doing this is much more practical and cost-effective today thanks to object-based storage. Many of the broadcasters and facilities that are now using this as a means of storing vast amounts of material are looking to our product, which they may already have as part of their playout or distribution set-up for compliance purposes, to be the front-end interface. We are often asked if our platform is able to operate with a specific provider or storage solution, to which the answer is “yes” because we take an agnostic position and do not endorse any particular system.

For the user, the benefit of long-term, object-based storage working with a familiar control unit is that they can conveniently access media from a year or, potentially, years ago through a web interface or even from mobile devices. This allows them to review and search the content, enabling it to be retrieved and used for a whole variety of productions and applications. These include news pieces, highlights reels and obituaries. Increasingly, we are also seeing archive clips being posted on a broadcaster’s social media feed to coincide with a relevant date or anniversary. Posting some historical content quickly on to Twitter, Facebook or a website is very easy these days and helps maintain a media organization’s profile. It also allows a broadcaster to engage with its audiences, as people comment on the clips.

Some broadcasters are still heavily criticized for their archiving policies – or lack of them – during the 1960s and 70s. Many episodes of programs from those times, which are now considered to be classics, are missing from the archives. The situation has improved considerably since then, with broadcast and production companies routinely retaining either the masters or copies of shows they have made or transmitted.

There are still some gaps, however, when it comes to other forms of programming, live news being a prime example. Traditional libraries will retain whatever has been ingested into the system, but this has not always applied to news and other as-broadcast material. The fact that important stories or records of significant events might not be saved for posterity has been recognized in some countries, and we are already supporting broadcasters there with those features today.

At one time this would have been a mammoth task, involving hundreds of hours of videotape for initial capture and then either tape robots or spinning disks for long-term storage. On top of which, cataloguing everything so items could be found at a later time would have been an equally daunting prospect. Modern techniques and technologies have simplified such an operation considerably, with the on-air output recorded via a compliance system and then transferred to the cloud for deep archiving. Finding a specific item is relatively straightforward, using a simple date and time search.
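The date-and-time lookup described above is simple to sketch. This is a minimal illustration of the idea, not Mediaproxy's implementation: the `ComplianceArchive` class, the fixed segment length and the object-storage key format are all hypothetical.

```python
from bisect import bisect_right
from datetime import datetime, timedelta

# Hypothetical: compliance output is stored as fixed-length segments in
# cloud object storage, each keyed by its start time.
SEGMENT_LENGTH = timedelta(minutes=10)

class ComplianceArchive:
    def __init__(self):
        self._starts = []   # sorted segment start times
        self._keys = {}     # start time -> object-storage key

    def add_segment(self, start, object_key):
        self._starts.append(start)
        self._starts.sort()
        self._keys[start] = object_key

    def find(self, when):
        """Return the object-storage key of the segment covering `when`."""
        i = bisect_right(self._starts, when) - 1
        if i < 0:
            return None
        start = self._starts[i]
        # The requested moment must fall inside the segment's window.
        if when < start + SEGMENT_LENGTH:
            return self._keys[start]
        return None

archive = ComplianceArchive()
archive.add_segment(datetime(2023, 6, 1, 12, 0), "ch1/2023-06-01T12:00.ts")
archive.add_segment(datetime(2023, 6, 1, 12, 10), "ch1/2023-06-01T12:10.ts")
print(archive.find(datetime(2023, 6, 1, 12, 4)))  # the 12:00 segment
```

Binary search over sorted start times keeps retrieval fast even when the deep archive holds years of round-the-clock recordings.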

Realistically, it might not always be possible to have every second of every broadcast, but through the implementation of hybrid compliance-cloud storage set-ups, broadcasters now have the best chance of delivering on the promise of an archive that is as complete as possible.

Interra – Explosion of FAST Requires Three-Pronged Approach to Monitoring Ad Insertion

Anupama Anantharaman, Vice President, Product Management at Interra Systems

The Free Ad-supported Streaming TV (FAST) market is undergoing explosive growth, representing a significant revenue opportunity for service providers. A recent report from Omdia found that FAST revenue grew almost 20 times between 2019 and 2022, and is projected to triple between 2022 and 2027 to reach $12 billion.

Consumers are attracted to FAST channels because they are free to watch and provide a similar viewing experience to linear television. In exchange for free content, viewers are required to watch video advertisements that typically run 15-30 seconds.

Recently, service providers have started exploring the benefits of delivering FAST channels with targeted ads. By delivering well-targeted ads, media owners can achieve higher CPMs (cost per mille, or cost per thousand impressions), boosting their revenue. However, service providers must also ensure a superior-quality streaming experience.

Challenges With Delivering Targeted Ads

The underlying technology that enables service providers to insert ads into FAST channels on the fly is called dynamic ad insertion (DAI). DAI is used across platforms for linear broadcast, VOD, mobile, and OTT, enabling service providers to target specific platforms, devices, audiences, and geographies.

Implementing ads in a media streaming workflow can be challenging and complex. OTT service providers face multiplatform and multiscreen delivery issues. They must check for the presence of SCTE-35 messages in the linear workflow, enable seamless ad insertion at segment boundaries, and ensure correct alignment of ads across all profiles/variants. Furthermore, it’s important to ensure a consistent ad insertion workflow and accurate ad insertion at the correct position.

Three Strategies for Effectively Monitoring DAI

Having a comprehensive, three-pronged approach to monitoring DAI is the most effective way to isolate issues and ensure outstanding quality streaming experiences for FAST channels with targeted ads. The first step in the three-pronged approach involves deploying a real-time content monitoring system for linear streams. Deployed at the headend of the workflow for the linear streams, the monitoring system enables service providers to deliver error-free, superior-quality video by performing ad insertion monitoring, including SCTE-35 cue messages and post-insertion verification for regulatory compliance.

Ad insertion monitoring in Interra Systems’ ORION-OTT

Key ad-insertion monitoring features to look for include SCTE-35 compliance checking, checking for the absence and presence of SCTE-35 messages, DPI pair monitoring for pre-insertion and post-insertion feeds, and ad quality analysis.
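One of these checks, verifying that splice-out and splice-in cues arrive in matched pairs, can be sketched in a few lines. The `(pts, kind)` tuple representation is an assumption made for illustration; a real monitor decodes the SCTE-35 cue messages from the transport stream itself.

```python
# Hypothetical sketch: verify that every SCTE-35 splice-out (ad break start)
# has a matching splice-in (ad break end). Each observed cue is represented
# as a (presentation_time_seconds, "OUT" | "IN") tuple.

def check_splice_pairs(cues):
    """Return a list of problems found in the cue sequence."""
    problems = []
    open_out = None  # presentation time of the currently open splice-out
    for pts, kind in sorted(cues):
        if kind == "OUT":
            if open_out is not None:
                problems.append(f"splice-out at {open_out}s never closed")
            open_out = pts
        elif kind == "IN":
            if open_out is None:
                problems.append(f"splice-in at {pts}s without splice-out")
            else:
                open_out = None
    if open_out is not None:
        problems.append(f"splice-out at {open_out}s never closed")
    return problems

cues = [(100.0, "OUT"), (130.0, "IN"), (300.0, "OUT")]
print(check_splice_pairs(cues))  # the final splice-out has no matching splice-in
```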

The second step is using an OTT monitoring system for adaptive bitrate (ABR) content. By deploying an OTT monitoring solution, service providers can monitor ABR content integrity and related network performance in an OTT environment. It’s important to choose a system that monitors both VOD and live ABR content, as most FAST channels offer a variety of video including live sports, movies, and television series. Errors can occur when delivering FAST channels due to inconsistencies pertaining to ABR package compliance, manifest and playlist syntax, download errors, and content quality issues.

OTT service providers need to monitor ad markers from the source to the ABR packaged outputs at the origin stage to ensure that all avail opportunities in the upstream signal are correctly propagated downstream. Since server-side ad insertion (SSAI) happens after ABR packaging, any missed avail (one that was present in the source stream but not properly translated to the ABR manifests) will impact revenue opportunities.
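The propagation check described above amounts to comparing two lists of timestamps. The following is a hedged sketch under assumed inputs; the function name, tolerance and timestamp representation are illustrative, not any vendor's API.

```python
# Hypothetical sketch: flag source-signalled avails that never reached the
# packaged ABR manifest. Times are seconds from stream start.

def missed_avails(source_avails, manifest_markers, tolerance=1.0):
    """Return source avail times with no matching manifest marker."""
    missed = []
    for avail in source_avails:
        if not any(abs(avail - m) <= tolerance for m in manifest_markers):
            missed.append(avail)
    return missed

source = [120.0, 720.0, 1320.0]   # splice times signalled in the source stream
manifest = [120.2, 1320.4]        # cue markers observed in the packaged manifest
print(missed_avails(source, manifest))  # [720.0] — this avail never propagated
```

Each avail the check flags is a break the SSAI stage can never fill, which is exactly the revenue leak the paragraph above describes.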

Ad monitoring checks need to be performed on a continuous basis. With an OTT monitoring system, service providers can check for:

  • the presence of the expected ad markers in MPD or HLS playlists, along with out time, in time, and ad duration
  • inconsistent ad start segments across variants
  • minimum and maximum duration of content between ads
  • download failures or delays for ad segments
  • stale manifests in ad slots
  • ABR compliance issues in ad chunks
  • audio and video quality issues within an ad stream
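A playlist-level duration check like the ones above can be sketched as a small scanner. This is a simplified illustration, assuming one common cue-tag convention (`#EXT-X-CUE-OUT:<duration>` / `#EXT-X-CUE-IN`); real playlists use several marker styles and a production monitor handles them all.

```python
# Hypothetical sketch: verify that the ad segments between a CUE-OUT and the
# following CUE-IN add up to the advertised break duration.

def check_ad_break(playlist_text, tolerance=0.5):
    in_break = False
    advertised = 0.0    # break duration declared on the CUE-OUT tag
    accumulated = 0.0   # sum of EXTINF durations seen inside the break
    issues = []
    for line in playlist_text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-CUE-OUT:"):
            in_break = True
            advertised = float(line.split(":", 1)[1])
            accumulated = 0.0
        elif line.startswith("#EXT-X-CUE-IN"):
            if abs(accumulated - advertised) > tolerance:
                issues.append(
                    f"break advertised {advertised}s but contained {accumulated:.1f}s"
                )
            in_break = False
        elif in_break and line.startswith("#EXTINF:"):
            accumulated += float(line.split(":", 1)[1].split(",")[0])
    return issues

playlist = """#EXTM3U
#EXT-X-CUE-OUT:30.0
#EXTINF:10.0,
ad_seg1.ts
#EXTINF:10.0,
ad_seg2.ts
#EXT-X-CUE-IN
"""
print(check_ad_break(playlist))  # only 20s of ad segments in a 30s break
```

Running the same scan across every variant playlist is one way to surface the cross-variant misalignment issues mentioned above.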

The final step in the three-pronged monitoring approach is gaining enterprise visibility. FAST providers can achieve this by centrally managing multiple linear and OTT monitoring setups. Centralized monitoring enables service providers to track all ads being inserted at different locations of the workflow. It also gives service providers an overall picture of all ad markers coming in and out of their streams.

Interra Systems’ ORION Central Manager for reports and diagnostics

With a centralized monitoring system, FAST providers can aggregate ad marker information from source to origin to show the complete trail of ad markers for a channel. The system reports issues with ad marker propagation while also helping to identify the stage at which the ad information went missing, minimizing time for fault isolation and recovery. In addition, centralized monitoring computes ad KPIs such as the number of ad events/hour, ad duration/hour, and average break duration to verify ad policy compliance.
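The KPI computation mentioned above is straightforward to sketch. The break-log format here, `(start_seconds, duration_seconds)` tuples, is an assumption for illustration.

```python
# Hypothetical sketch: compute the ad KPIs described above from a log of
# ad breaks observed over a monitoring window.

def ad_kpis(breaks, window_hours):
    """breaks: list of (start_seconds, duration_seconds) tuples."""
    total_ad_seconds = sum(duration for _, duration in breaks)
    return {
        "ad_events_per_hour": len(breaks) / window_hours,
        "ad_seconds_per_hour": total_ad_seconds / window_hours,
        "avg_break_duration": total_ad_seconds / len(breaks) if breaks else 0.0,
    }

# Three breaks observed over a two-hour monitoring window.
breaks = [(600, 120), (2400, 90), (5400, 120)]
print(ad_kpis(breaks, window_hours=2))
# {'ad_events_per_hour': 1.5, 'ad_seconds_per_hour': 165.0, 'avg_break_duration': 110.0}
```

Comparing these numbers against the channel's ad policy (for example, a maximum ad load per hour) is how policy compliance can be verified automatically.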

 

Solving DAI Issues in the Real World

A leading U.S. broadcaster was receiving complaints that a change in network bandwidth, and the resulting switch to a different profile or variant while an ad was playing, was causing the ad to occasionally start all over again or skip entirely. At times the ad would not play at all and the end user would see a black screen, often because the ad came from a different source than the surrounding content.

The broadcaster needed to find the source of the problem and its root cause. Leveraging a real-time content monitoring and centralized solution, it could immediately see the exact stage and location where the ad insertion problem was detected.

Two issues were found. While the ad markers inserted at the source propagated correctly through the linear stage of the workflow, they somehow went missing at the origin server. This created an issue for end users because the player did not know when to play the ad. A second issue was detected at the origin server. Leveraging an OTT monitoring solution, the broadcaster could see that ad markers were misaligned across different profiles and variants. Due to this, ads were either skipped or played again on profile/variant switching at the end-user device. Clearly, the problem was occurring at the origin server and packager. The broadcaster gained insight into the root cause of the issues and subsequently reached out to the packager technology provider to resolve the issue.

Improving QoE for FAST Channels

FAST channel delivery is growing globally. Embracing targeted advertising is an opportunity for service providers in the FAST environment to increase viewer engagement and boost their revenue streams. By implementing a three-pronged approach that includes real-time content monitoring, OTT monitoring for ABR content, and centralized monitoring, FAST service providers can ensure the highest quality of experience for end users.

farmerswife – How to Create a Remote Work Schedule: Why You Really Need One

 

Carla Molina Whyte, Marketing Executive, farmerswife

When it comes to remote work, poor scheduling can have an even bigger impact on your life than it does in an office. Despite all the statistics showing that remote work can increase productivity, working from home is still a new concept for most people, and it can take a long time to adapt to this way of working and to find a remote work schedule that works for you.

Having an effective remote work plan, with your tasks and meetings booked in, will keep you productive and consistent throughout the day. Luckily, with all the scheduling technology available today, making a work schedule isn’t hard to do. Tools such as farmerswife and Cirkus can have a great impact on the way you and your team work!

Why is it important to stick to a remote work schedule?

Working to a schedule while working remotely is crucial for achieving your goals. It will not only give you a clear structure for breaking down your day, it will also allow you to focus on what needs to be done rather than trying to do it all at once. Time is a remote worker’s most important resource, and you need to schedule your work well to make the most of your working hours.

Scheduling can help you in many ways, including:

  • Ensure you have enough time for all the important and upcoming tasks on your list
  • Understand what you can realistically achieve during your work hours
  • Keep your days organised and be more productive

Using tools such as farmerswife and Cirkus will give you a clear picture of what you have to do. See below for an example of what a remote work schedule can look like:

How to create a remote work schedule

There are many ways to create a work schedule, and it should be tailored according to each individual as not everyone’s way of working is the same. It’s crucial that you start to understand how you work in order to schedule your day accordingly. farmerswife can really help you stay productive and help your project managers manage team workload.

Here are a few steps that will help you create a general remote schedule that can be very useful for you:

  • Block your non-work hours
    The first step to creating your remote work schedule is to determine when you plan not to work, so you know exactly how much time you have every day to finish your tasks. Blocking non-work hours can also help you unplug more easily at the end of the workday and draw a clear boundary between your work and home life.
  • Add uninterrupted focus time in your schedule
    Focus time is a period you set aside to address priority tasks or activities you’ve delayed. Many employees with hybrid remote work schedules set their focus time to the hour following lunch, when they’re readjusting to the structure required to complete their tasks. The idea is to block a set number of minutes of focus time in your calendar during which you focus on one task at a time with absolutely no other distractions.
  • Schedule meetings during your least productive times
    Remote employment allows for greater freedom, so you can plan your time according to your productivity. If you are exhausted soon after lunch, plan your meetings for the afternoon. Nothing beats virtually meeting and catching up with your team!
  • Schedule breaks just like you schedule your important meetings
    Breaks are as crucial as focused hours if you want to be productive all day. Taking breaks might be challenging while working on something essential or facing a deadline; we all know that sometimes you just have to keep going. But regular breaks help you retain your focus and energy, while working without breaks all day will lead to burnout and damage your productivity.
  • Create ground rules around your house 
    Working from home might be challenging when family members constantly interrupt or are around your work area. So that your productivity isn’t affected, you should have a clear conversation with them about what they can and cannot do during your work hours.
  • Don’t take on more tasks than you can handle
    When you are planning your to-do list it is easy to go overboard: you might end up taking on more tasks than you can possibly finish in a day, and then start feeling guilty as you realise you can’t complete everything you were supposed to do. To prevent this, set realistic goals for yourself and take on only as much work as you can handle.

Remote work is about flexibility and the right tools

Remote work is all about flexibility, and you should be open to changing your schedule from time to time without risking productivity. But it’s important to structure your life to maintain an organized and efficient remote work schedule.

Using the best remote work tools to stay on top of your schedule can also make a lot of difference. It can be tempting to use all the new productivity and calendar tools you find online. However, the more tools you use, the more difficult it will become for you to manage all of your work data.

The last thing you want is to use so many apps that you end up wasting half of your time switching between them to find important data. Instead, you can use a centralized tool like farmerswife, which enables you to organize and track project resources, plan and control the project lifecycle, manage day-to-day tasks, create customized budgets and analyze financial performance in a practical way that drives better decisions. And with the new Cirkus interface, you will be able to collaborate with comments, files and tasks in your projects.

Remote work productivity tips

For remote workers, it can be challenging to stay productive while juggling the demands of the daily job alongside home concerns. Here are five tips to help you stay on top of your tasks and meet your deadlines:

  • Create a people-first culture
    People-first cultures value people over profits; when employees are valued as whole individuals and provided the opportunity for well-being, connection, and fulfillment, companies are generally more innovative, resilient, and even more profitable!
    Studies show that culture-driven organizations experience 26 percent fewer mistakes, 22 percent higher productivity, 41 percent lower absenteeism, and 30 percent stronger customer satisfaction.
  • Prioritize your work
    Learn how to separate the most urgent priorities from other important tasks. Use a tool like Cirkus, which will help you list out your objectives according to their urgency so you can stay on top of deadlines and remain productive, even if your workload increases. In Cirkus you are able to pick the priority of each project, which will give a great overview of the most important tasks over the less important ones.
  • Reduce meetings to the barest minimum
    It’s crucial to understand how meetings impact the team’s motivation and morale. With this knowledge, you’ll be able to eliminate unnecessary meetings that halt productivity and give back the time the team needs to perform effectively.
  • Work-life balance matters
    It is important that employees have a good work-life balance. Not only does it improve their mental health, but it benefits the business too. Employees with a good work-life balance are more efficient, productive and motivated. Without a daily routine, there is a high chance that tasks will be forgotten, jobs will be rushed, and hours will grow longer and longer just to keep up. That’s why daily routines are essential for remote work productivity.
  • Be firm with distractions
    Distractions can seem impossible to avoid. Start by building habits that help you eliminate distractions and stay focused. Focus on creating an environment in which you’re less tempted to get distracted with something other than what you’re working on. Work to create habits that signal to yourself and those around you that you’re in distraction-free mode.

How to choose the best project management tool

It’s challenging to find a project management tool that meets all of your needs. Organizations need a tool designed to help project teams plan, track, and manage projects and achieve their defined goals within the agreed timeframe. It should also help team members collaborate effectively.

Using a tool like farmerswife is a great way to stay productive and track progress within teams. With the best project management software in place, your team will enjoy a higher quality of workflow and work more efficiently.

Conclusion

Creating a remote work schedule using the tips above should be your first step before starting work. A remote work plan will help you be more productive and consistent and meet your daily goals. farmerswife can really help your team stay organised and keep their schedules clear and easy to follow, while project managers will be able to track progress easily.

Interested in learning more?

Book your free demo here – https://blog.farmerswife.com/how-to-create-a-remote-work-schedule

 

 

Codemill – Moving Towards a Connected Media Model

Johan Bergström, Head of Sales and Marketing, Codemill

As the media industry continues to navigate new consumer trends and monetise increasing volumes of content, there is a growing desire for platforms that provide complete asset management and operational efficiency. As organisations weigh up their options in moving towards a cloud-based future, MAM solutions need to evolve to meet the demand for more remote and hybrid workflows.

MAM systems, content processing tools, and supply chain systems should work together to streamline complex media workflows. By embracing an integrated approach, entertainment providers can successfully manage content at scale.

The MAM Legacy

In traditional on-prem media workflows, MAM systems lack agility, and they often face over-provisioning. Legacy MAM design centres on providing extra capacity for editing and content processing, but this is constrained by the parameters of physical storage. This approach contradicts the flexible usage model embedded in the cloud, where resources are available dynamically and on-demand.

Traditional MAMs also encounter various challenges such as inadequate data lifecycle management, insufficient version control, and fragmented storage. When managing a large number of assets, these problems can escalate quickly. If you’re not keeping on top of lifecycle management, it can lead to old, unarchived content, affecting storage requirements or complicating asset search and retrieval. Incorrect versioning during editing and localisation processes results in file duplication, further exacerbating the search problem.

In large media organisations, content can easily become siloed across different departments. When assets are siloed, the biggest challenge is duplication of effort. Separate teams could end up working on exactly the same content and waste valuable resources doing it. This waste is then transferred to the MAM in a rinse-and-repeat cycle. If this process is multiplied across thousands of assets, over several years, then storage capacity is going to become strained.

A Hybrid Approach

Traditional MAMs are typically inflexible because the asset hierarchy and organisational structure is not easily modified. Workflows become rigidly defined and this has inherent challenges. When processes don’t fit requirements, teams can start to look for creative workarounds. Outdated systems cause frustration for users and open up content archives to security risks. Inflexibility can hinder any efforts to improve redundant content processing methods and put the brakes on productivity. Integrating new tools or implementing changes becomes increasingly challenging due to workflows that are set in stone.

While this rigidity was less problematic in previous decades, the ability to work flexibly and innovatively is now essential for modern media processing. But for many large media companies, a fully cloud-based approach is not an option. The storage costs for nearline access requirements and the unpredictable egress fees make it an unrealistic strategy. So with media teams working remotely all over the world, how can entertainment providers find more flexible ways to collaborate on content?

Managing Pain Points

Entertainment providers and broadcasters have extensive content archives consisting of valuable assets. The combination of diverse infrastructure requirements and intricate workflows means there are a lot of variables to contend with. Organisations need highly customisable and configurable workflows to increase content processing efficiency, while operators and editors expect a streamlined process to search, filter, and work with media content.

A MAM system should optimise asset search through appropriate metadata, but it should also enhance workflow automation and seamlessly integrate with post-production tools. Entertainment providers and broadcasters require a comprehensive set of solutions capable of handling content at scale. The specific combination of solutions will vary based on individual company requirements and downstream actions. Regardless of the specific needs, media teams should have the ability to seamlessly integrate their preferred tools and infrastructure, either through custom development or REST APIs. Fortunately, a hybrid-cloud system allows organisations to achieve this.

Connecting Content

To ensure that the entire chain is working effectively, all stages from ingest, to media and metadata management, right through to content distribution should be joined together. Media companies need more effective ways to connect islands of content and integrate essential media processing tools. To achieve this, a system must be able to utilise metadata to its full potential. Metadata should be stored within the system backend and enrich the description of assets to facilitate more accurate search functionality.

As no two workflows are the same, systems must be adaptable and offer extensive customisation options. If users have the option to build collections rather than treating assets as separate islands, new functionality becomes possible. Admins can assign access to users and groups, defining their view, edit, and export rights at a granular or macro level. Flexible systems let them individually update or batch-edit metadata within a collection. Advanced archiving rules allow assets to be automatically offloaded from different storage tiers and managed in one central location. The end result is that users can breathe a collective sigh of relief.
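The collection model described above can be sketched in miniature. This is an illustrative data structure under stated assumptions, not Codemill's design: the `Collection` class, the rights vocabulary and the asset shape are all hypothetical.

```python
# Hypothetical sketch: assets grouped into collections, with per-principal
# access rights and batch metadata edits across the whole collection.

class Collection:
    def __init__(self, name):
        self.name = name
        self.assets = []    # asset dicts, each with a "metadata" map
        self.rights = {}    # principal -> set of rights ("view", "edit", ...)

    def grant(self, principal, *rights):
        self.rights.setdefault(principal, set()).update(rights)

    def can(self, principal, right):
        return right in self.rights.get(principal, set())

    def batch_edit(self, principal, **metadata):
        """Apply one metadata update to every asset in the collection."""
        if not self.can(principal, "edit"):
            raise PermissionError(f"{principal} lacks edit rights on {self.name}")
        for asset in self.assets:
            asset["metadata"].update(metadata)

drama = Collection("drama-series")
drama.assets = [{"id": "s01e01", "metadata": {}}, {"id": "s01e02", "metadata": {}}]
drama.grant("localisation-team", "view", "edit")
drama.batch_edit("localisation-team", language="de", status="dubbed")
print(drama.assets[0]["metadata"])  # {'language': 'de', 'status': 'dubbed'}
```

Keeping rights on the collection rather than on each asset is what makes the granular-or-macro access control described above cheap to administer.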

The Next Wave

The industry’s approach to technology solutions is converging from single lanes of traffic to a consolidated approach. Vendors now recognize that they can achieve more by collaborating and integrating their offerings. Rather than releasing single, end-to-end solutions that lock media companies into a restrictive framework, it’s clear we need flexible workflows to face the future of media consumption.

With no two workflows the same, it is more important than ever that adaptable solutions exist to give entertainment providers the control they need. Solutions are becoming less of an “either/or” choice and more of a “yes, and” proposition. A non-proprietary approach with comprehensive APIs links a wide range of possibilities into the existing infrastructure. It’s time for media companies to expand their horizons on what a modern MAM can offer.