Studio Network Solutions – Building Business Agility Into Your Media Production Workflow

Melanie Ciotti, 

Studio Network Solutions


Business agility has a whole new meaning these days.

I remember back in business school, professors would talk about “business agility” like it was a badge of honor reserved for the best, most innovative companies. Now, it’s a survival tactic—a flat-out necessity.

Why? For starters, COVID-driven supply chain disruptions have wreaked havoc on industries from baby formula to semiconductors—and ours isn’t immune. But business agility runs deeper than supply chain management.

Consumers have less and less patience for inefficiency in their lives. If your product or service isn’t constantly adapting to and anticipating their needs, you don’t get three strikes—you’re just out.

So, how can media teams stay agile and adjust to current and future trends in real time? It starts with an agile production workflow.

An agile production workflow helps media teams respond quickly to new opportunities and mitigate threats.

The Agile Production Workflow

Constantly uploading and downloading files off the server, passing hard drives from editor to editor, wasting creative resources on non-creative tasks—these are some of the obstacles that hold production companies back from business agility nirvana.

At SNS, we build solutions that enhance the post-production workflow so that media teams can create amazing content, faster, from anywhere. Here are some of the ways our solutions can help you build an agile production workflow for your media team.

Standardizing On-Prem And Remote Workflows

Let’s not dwell on the events of 2020 for too long, but the pandemic spelled out in capital letters just how agile our industry was at the time. (If you read that as a negative or a positive, you know which side of the coin you were on.) Standardizing your video production workflow for on-prem and remote collaborators minimizes the burden on editors in a hybrid environment. And that doesn’t mean compromising with egregious egress fees and latency issues.

The high-performance EVO shared storage solution includes several remote editing and cloud workflow tools to help creative teams find the optimal balance between high-speed online editing and flexible remote connectivity.

EVO’s remote cloud workflow solutions help media teams finish projects faster, from anywhere.

Creative collaboration powered by EVO blurs the line between your on-premise and at-home workflow. When connected to EVO remotely, your post-production team gains access not only to the media stored on the server, but also to the award-winning EVO Suite of software tools included: ShareBrowser for media asset management (MAM), Slingshot for workflow automations, and Nomad for remote editing.

EVO makes remote access easy with SNS Cloud VPN—the secure, convenient, and ultra-fast virtual private network (VPN) service exclusively for EVO. Set up in minutes and 2x-5x faster than traditional VPNs, this cloud solution gives creative teams access to everything they need for their remote video editing projects, just as if they were in the studio.

With EVO’s remote/cloud workflow tools enabled, your on-prem and remote collaborators can:

● Search, find, and preview media in ShareBrowser

● Import media and metadata into their favorite NLEs

● Back up files to network-attached and cloud storage

● Automate file transfers, transcodes, and other data processing tasks

● Download edit-ready proxies for remote editorial

● Do all of the above, from anywhere

Automating Your Workflow

Content creators should spend the bulk of their time creating content. Unfortunately, non-creative tasks have a way of weighing down production workflows. Endlessly wading through files, manually transcoding media, and tying up workstations to run cloud backups translates to workflow inefficiency and low team morale.

 Agile organizations don’t dedicate editors and workstations to data processing tasks. They automate these elements of their workflow, keeping creative resources in their creative mindset as much as possible. This is where Slingshot comes in.

Slingshot is EVO’s built-in automation engine and the soon-to-be most productive member of your production team. By automating your file transfers, media backups, transcode jobs, and more, Slingshot takes the busy work out of your workflow and off your team’s to-do list.
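Slingshot itself is configured through EVO’s interface, but the pattern it automates can be pictured in a few lines. The sketch below is illustrative only—the file extensions, bucket name, and proxy bitrate are invented, not Slingshot’s actual configuration—showing how a watch-folder rule turns a newly arrived file into a backup job and, for camera formats, a proxy transcode:

```python
from pathlib import Path

# Camera formats that should get an edit-ready proxy (assumed list).
PROXY_EXTENSIONS = {".mov", ".mxf", ".mp4"}


def plan_actions(filename: str) -> list[str]:
    """Return the automated steps for a file that just landed in the watch folder."""
    path = Path(filename)
    # Every arrival is backed up; the bucket name here is hypothetical.
    actions = [f"backup {path.name} -> s3://backup-bucket/{path.name}"]
    if path.suffix.lower() in PROXY_EXTENSIONS:
        # Low-bitrate proxy so remote editors can pull a lightweight copy.
        proxy = path.with_suffix(".proxy.mp4").name
        actions.append(f"transcode {path.name} -> {proxy} (h264, 5 Mb/s)")
    return actions
```

The point of the sketch is the shape of the rule—match, then act—rather than the specific actions, which an automation engine would carry out without tying up an editor’s workstation.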

ShareBrowser MAM features an AI connector for automatic metadata generation.

Artificial intelligence and machine learning (AI/ML) are fantastic agility boosters, especially when seamlessly integrated into your workflow. ShareBrowser’s AI connector, for example, can automatically add relevant tags to your footage with the click of a button, making it easier for team members to find the clips they need without spending their time logging those tags themselves.
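The mechanics are simple to picture. The sketch below is not ShareBrowser’s actual API: `detect()` is a stand-in stub for whatever vision service an AI connector calls, and its labels are hard-coded purely for illustration of how generated tags merge into a clip’s searchable metadata:

```python
def detect(clip_path: str) -> list[str]:
    """Stub for an AI labeling service; returns fixed labels for this sketch."""
    return ["beach", "sunset", "drone"]


def auto_tag(clip: dict) -> dict:
    """Merge detected labels into the clip's tag list without duplicates."""
    existing = set(clip.get("tags", []))
    clip["tags"] = sorted(existing | set(detect(clip["path"])))
    return clip
```

However the labels are produced, the payoff is the same: editors search on tags nobody had to log by hand.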

Being open to new technologies that save time in your workflow enhances your team’s ability to react swiftly to emerging opportunities, trends, and threats. It’s the ultimate catalyst for an agile production workflow.

How Workflow Agility Breeds Opportunity

A truly agile workflow helps production companies identify, execute, and even monetize new business opportunities.

For example, Barbershop Films uses EVO and ShareBrowser as a client-facing media portal, adding more value to their footage beyond the final deliverable.

Barbershop Films’ post-production workflow is powered by EVO shared storage.

Jeremy Drummond, president at Barbershop Films, explained: “You spend $300,000 to shoot a 2-day commercial campaign, and that footage is great. It’s done, it’s got its 13-week run, it’s measurable. We did everything we’re supposed to. But now with EVO, we’ve unlocked all this value in those shots that didn’t make the cut—in the additional angles that are great in a social piece, or powerpoint, or a pitch deck. We can leverage that and offer it to our enterprise clients as additional value for their investment.”

When you remove the physical walls constraining where creative professionals can work, stop wasting creative resources on non-creative tasks, and invest in technology that breeds new opportunities for your team, you’ve mastered the agile production workflow.

Learn more about EVO shared storage and the included EVO Suite of workflow tools at snsevo.com.

Touchstream – Live Streaming Workflows for OTT: Guide to Moving to the Cloud

Brenton Ough,


CEO and Co-Founder



As the staggering growth of streaming viewership and consumption slows (witness Netflix’s subscriber losses at the start of 2022 and HBO Max’s crash during the House of the Dragon premiere in August 2022), streaming operators face one major challenge: ensuring streaming workflows scale both up and down. At the center is the flexibility to cope with changing demand patterns from viewers who expect high QoE and are well versed in cancelling subscriptions and switching to a competitor. Moving to the cloud is a popular and excellent solution, but what are the true benefits? Should you move all components to the cloud? How do you transition efficiently without jeopardising QoE?

Current state of live streaming workflows

A streaming workflow seems simple: capture the video, compress, package (e.g. CMAF, HLS, DASH), ingest and transcode, and deliver it to viewers. In reality, we know it’s much more complex than that.

Viewers demand high-quality streams with low latency and no buffering, and they’re not forgiving if you fail to deliver. One error anywhere in your streaming workflow negatively impacts QoS, which translates to churn and, ultimately, lost revenue. Further complicating matters are disparate technologies, a lack of standardisation (leading to data fragmentation), and a landscape that keeps evolving and changing. Streaming operators struggle for competitive advantages, and one clear trend has emerged: moving live streaming workflows to the cloud.

Benefits of moving your live streaming workflow to the cloud

The current approach to streaming monitoring isn’t flexible enough to track both hardware and software components; monitoring technologies need to be cloud-based as well.

Virtualised versions of technologies, like encoders, allow you to shift costs to operational expenditure, paying for a server only when it’s needed.

This contrasts with the industry’s past heavy reliance on capital expenditure—acquiring and maintaining physical encoders, servers, and other equipment—when video was delivered over terrestrial lines or on physical media. The elasticity provided by the cloud unlocks these benefits:

  • Better scalability
  • Service reliability for viewers
  • Greatly reduced maintenance
  • Updates at a lower cost
  • Get to market much faster: critical for competition

Should you move your streaming workflow to the cloud?

Everything can be virtualised, but sometimes it makes sense to keep elements of the streaming workflow behind the corporate firewall.

Sometimes, it might make more sense to have initial encoding for a live stream happen on physical machines that can be more closely managed and aren’t subject to the potential latency of cloud resources. The workflow might look something like a rack of encoders producing a master stream, then sending it to cloud-based encoding resources for transcoding and repackaging.

How to decide what streaming workflow components to migrate

No two streaming operators are alike; here is a step-by-step guide to assessing which components to move, how, and when.

1. Assess where you currently stand

Some companies are more advanced than others; maybe you’re just learning to adapt the cloud to your streaming needs, or maybe you’re already employing the latest cloud technology to leverage its full potential. Understanding where you stand in cloud adoption and usage right now is key to planning.

Cloud KPIs: adoption vs. usage

Understanding to what extent your organisation is “cloudified” is important for two reasons. First, it’s strategic, because it helps you find how you can use cloud technology to improve the efficiency, resiliency, and performance of your streaming service long-term. Second, it’s tactical, because it lets you identify how you can employ cloud technologies right now to improve your engineering efforts.

For example, moving from specific hardware, such as encoders, to virtualised instances can expose APIs which provide programmatic control over encoding functionality, ensuring improved scalability, resilience, and lower cost. This can result in better-engineered software.
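As a toy illustration of that programmatic control, the sketch below scales a fleet of virtual encoder instances from current load—something a fixed rack of hardware encoders cannot do. The capacity figures are invented for the sketch, not drawn from any real encoder:

```python
import math

# Assumed capacity: renditions one virtual encoder instance can produce.
STREAMS_PER_INSTANCE = 4
MIN_INSTANCES = 1  # keep one warm instance so a stream can start instantly


def instances_needed(active_renditions: int) -> int:
    """How many virtual encoder instances to run for the current load."""
    return max(MIN_INSTANCES, math.ceil(active_renditions / STREAMS_PER_INSTANCE))
```

In practice this decision would live behind the exposed API—an autoscaler calls it as demand changes, which is exactly the elasticity a hardware fleet lacks.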

Adoption matters too. The decision to migrate from server-based to cloud-based technologies is a strategic one, but it doesn’t dictate the way the cloud is employed in engineering efforts. It could be virtualisation or it could be serverless functions.

Also consider usage. Even if the strategic decision hasn’t been made to migrate to the cloud long-term, individual cloud technologies can be employed immediately to solve specific challenges or gain efficiencies within the workflow. Measuring how well you adopt and use cloud technologies will give you a clear picture of your immediate and long-term opportunities.

Measuring cloud adoption within streaming workflows

Part of measuring adoption is technology selection. You could be virtualising components of the workflow, or you could be embedding them within the very fabric of the cloud through serverless functions. Both are cloud technologies. To help measure your adoption, consider where each workflow component sits on a scale from dedicated hardware, to virtualised instances, to serverless functions.

Measuring cloud usage within streaming workflows

Measuring usage is similar to measuring adoption. There are lots of ways you can utilise cloud technologies as part of the workflow – even if your adoption of the cloud overall is relatively low.

Using your scores to drive change

Combining the results of this subjective assessment with data from your streaming service quantifies how cloud technology adoption and usage could impact your subscriber growth, user engagement, attrition, advertising revenue, and more.

If you see QoE and engagement data drop as simultaneous users increase, then by transitioning workflow components from hardware to cloud, from virtualised to serverless, or from traditional to microservice architecture, you can improve those metrics.

Assessing cloud technologies

Before applying cloud adoption to your strategy or implementing cloud technologies into your workflow, it helps to assess the landscape. The best way to do that is a tech radar.

A tech radar helps you bucket technologies into categories so your engineering teams aren’t wasting time figuring out which technology to consider. It accounts for the current state of the technology within the market, and provides clear guidance across your entire organisation.

Imagine a tech radar for video stream monitoring. An AI-based approach might be in the Assess layer (because it’s still not proven and there’s a lot of iteration within the technology) while a microservice-based approach, such as Touchstream, might be in the Adopt phase. You can also utilise Architecture Decision Records (ADRs) to capture and document decisions so development teams and individual engineers understand why something was chosen.
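A tech radar also lends itself to being kept as data rather than a slide, so tooling, documentation, and ADRs can all reference the same source of truth. In the sketch below, the ring names follow the common Adopt/Trial/Assess/Hold convention, and the entries are placeholders rather than recommendations:

```python
# Illustrative radar entries only; a real radar is maintained by the org.
TECH_RADAR = {
    "microservice-based stream monitoring": "Adopt",
    "AI-based stream monitoring": "Assess",
    "hardware probes": "Hold",
}

RING_ADVICE = {
    "Adopt": "proven here; use by default",
    "Trial": "worth pursuing on a real project",
    "Assess": "explore and understand the risk",
    "Hold": "do not start new work with this",
}


def guidance(technology: str) -> str:
    """Give an engineer the radar's verdict on a technology, or flag it as unclassified."""
    ring = TECH_RADAR.get(technology, "Unclassified")
    return f"{technology}: {ring} ({RING_ADVICE.get(ring, 'classify before use')})"
```

Keeping the radar queryable this way means an ADR can cite the ring a technology sat in when the decision was made.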

The key to successful adoption and usage

Any streaming operator can adopt cloud technologies in their stack, but making those technologies part of your streaming development efforts means everyone must be on the same page, with collaboration enabled by tech radars and ADRs. When everyone moves in the same direction on developing within the cloud and with cloud technologies, adoption and usage scores improve—and so does the success of your platform.

2. Check if migrating a component makes sense

The migration of streaming video components to the cloud, and from one cloud technology (such as virtualisation) to another (like serverless functions), is a natural evolution of OTT streaming architectures. The architecture needs to be able to grow efficiently and effectively based on audience demand, but it may not make sense for a component to be virtualised, turned into a microservice, or even made into a serverless edge function.

The first step is to determine the operational benefit of migrating the component. Will it have a meaningful impact on key metrics such as video startup times, rebuffer ratio, and bitrate changes? Will transitioning the component make it easier to support? If the answer is “yes” to both questions, then it makes sense to migrate.
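Those two questions amount to a simple gate, sketched below. A fuller assessment would also weigh cost and risk; this only encodes the stated criteria, with metric deltas expressed as improvements (e.g., seconds shaved off video startup time). The metric names are examples, not a fixed schema:

```python
def should_migrate(metric_impact: dict[str, float], easier_to_support: bool) -> bool:
    """Migrate a component only if at least one key metric meaningfully
    improves (positive delta) AND the new version is easier to support."""
    meaningful_improvement = any(delta > 0 for delta in metric_impact.values())
    return meaningful_improvement and easier_to_support
```

For example, a component whose migration shaves 0.8 s off startup time and simplifies support passes the gate; one that only simplifies support does not.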

3. Ensure you can still monitor it

The second step can complicate things: determining how to monitor the new version. When the migration is from hardware to software, or from software to cloud, significant challenges arise and may require an entirely new approach (e.g., replacing hardware probes with software versions—a transition in and of itself). Having a monitoring harness in place makes things much easier: the new version can be programmatically connected to the harness, so operations can continue using existing dashboards and visualisations. Without a harness, understanding the monitoring implications of the technology transition is critical before continuing the migration, because having no way to integrate the new version into existing monitoring systems makes observability much harder to achieve.

Once you’ve identified which workflow components to move to the cloud, the next step is to plan and determine how to execute the transition.
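One way to picture a monitoring harness is as a thin registry sitting between components and dashboards: each component—hardware probe, software probe, or cloud service—registers a metrics callback, and the dashboards read one uniform snapshot regardless of what produced it. The interface below is an invented sketch, not any particular product’s API:

```python
class MonitoringHarness:
    """Minimal sketch: components register metric sources; dashboards read
    a uniform snapshot, so swapping a component doesn't change the view."""

    def __init__(self):
        self._sources = {}

    def register(self, component: str, poll):
        """poll() must return a dict of metric name -> value."""
        self._sources[component] = poll

    def snapshot(self) -> dict:
        """The uniform view existing dashboards consume."""
        return {name: poll() for name, poll in self._sources.items()}
```

Replacing a hardware probe with a software one then means re-registering a different `poll` callable—the dashboards never notice.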

How to Move Your Live Streaming Workflow to the Cloud

Read the expanded version of this article, including an entire 1,800-word section on how to move your live streaming workflow to the cloud, on Touchstream’s blog.

To find out how to scale your monitoring operations with Touchstream’s VirtualNOC, download our Monitoring Harness White Paper now.

Latakoo – From the Sea to the Broadcast Center, Lessons in Agility

Jade Kurian, 

latakoo co-founder


Running a production department at a major news network can be a lot like sailing a ship. Both can push leaders to sink or swim. A seaman turned news executive knows what it takes to keep things running smoothly even when the seas turn rough and unpredictable.

Applying skills learned on capricious waters as a young French Merchant Marine, Laurent De Rodez, Euronews’ Head of Production & Post Production, has spent 20 years at the company running nimble operations, building strong crews and anticipating future needs to stay ahead of the curve and his competition.

Laurent De Rodez leads Euronews production and post-production and has worked at the network for 20 years.

When he decided to move on from the naval service, De Rodez found the career that continues to keep him pivoting and moving fast. He began taking classes in video production, and eventually started teaching those skills to others. Then, came ‘the business’. “I started in the TV business. And then I worked my way up to some responsibility in different companies,” De Rodez said. Spending time as a post-production manager, he was able to transfer those skills into television news, first aiding video editing teams and later moving into the job he has today.

Euronews is Europe’s leading international news channel, serving an audience of 400 million homes in 160 countries and available in 17 languages. With De Rodez’ leadership in production and technology, the network has been able to navigate worldwide crises and uncertainties with the help of hundreds of freelance journalists who collaborate around the clock with the Euronews team members.

With more than 400 journalists spread across 30 different countries, Euronews relies heavily on strong remote capabilities, collaboration over borders and time zones and seamless production tools.

Euronews has a massive international footprint, serving 160 countries in 17 languages.

In 2017, NBC Universal sold its minority stake in Euronews to the majority owner, Media Globe Networks. That was an opportune time for reinvention. The Euronews team carefully reconfigured its approach to broadcast journalism, but maintained its core operations. One of the most important areas for Euronews, though not fully appreciated at the time, was its remote capabilities. Long before many others in the business were harnessing technologies to make it possible, Euronews was making remote work part of their day-to-day workflow.

Accurately anticipating needs in the ever-changing news business has been a trademark for De Rodez and Euronews. Though they couldn’t have predicted the pandemic, their early reliance on freelance journalists across Europe helped them through a critical juncture. De Rodez says the strategy paid off. In the early days of the pandemic when news teams around the globe were struggling to figure out how to produce news outside of their traditional newsroom environments, Euronews was already comfortable with the tools they had been using for a few years. One of the most reliable resources they had, according to De Rodez, is latakoo.

latakoo provides a seamless workflow for journalists and efficiencies for news agencies. Getting news from the field and on air is the way freelancers get paid. Making the process as quick and efficient as possible is crucial to their work supporting a round-the-clock news cycle. Created by two former news reporters (Paul Adrian and myself) a little over a decade ago, latakoo provides simplicity to a historically complicated process. We saw a gap that prevented reporters from being more efficient in their work and radically improved the process.

“latakoo was like a miracle. There is a cloud-based storage where everybody can send content, share it and organize it. The files are available immediately after the journalist sends from locations around the world, and they also show up in our asset management system,” De Rodez said. With latakoo, journalists can quickly upload their footage in the field from their laptops and smartphones. Within seconds, the entire Euronews team has access to the files, and can immediately share the content on-air and online.

“Euronews is a special type of company that hires hundreds of freelancers. So, on any given day they may have someone producing content who never previously worked for them,” said Paul Adrian, CEO and my co-founder at latakoo. “latakoo makes it super easy for freelancers to sign up for Euronews systems and to use their own technology to collect, shoot, edit, deliver and collaborate with their employer. latakoo meets these requirements.”

Euronews embraced a dependence on freelance content long before the pandemic.

latakoo also offers an easy-to-use but technologically robust mobile app for iOS and Android. Along with the LiveU app, LU-Smart, journalists who are working alone have the ability to create stories in the field with latakoo and go live from the field using LiveU. LiveU and latakoo are also integrated, which means a file can be sent using LiveU’s store-and-forward, land first in the latakoo cloud, and end up in a user’s asset manager. According to De Rodez, his team pulls about 10 live shots per day from correspondents using LU-Smart.

Today, Euronews is in good company. More reporters than ever are working remotely, and an increasing share of them are freelancers. Career website Zippia estimates there are more than 14,000 freelance journalists in the United States alone. The International Federation of Journalists says freelance journalism is no longer an “atypical” form of work. In some countries, the majority of journalists are freelancers.

Luke Hanrahan joined Euronews as a freelancer in 2019. Attracted to the network because of its huge footprint and reputation in the industry, he has continued to work for the company because of Euronews’ technological agility and prowess. “Using automation tools like latakoo puts Euronews in a good light for freelance reporters,” said Hanrahan, who describes the latakoo experience in three parts: efficiency, intelligence, and speed.

Efficiency:

“Every time I submit a package to Euronews as a journalist in the field, it automatically lands on their asset manager. Out of the news networks I have worked for, when you upload a file, whether that is a package or interview, often there is a person who needs to put that file where it needs to be. And that still happens at some major networks. Euronews has the ability to automatically put the file in the folder that it needs to be in for it to go on air. So, within a second of my package landing in the Euronews base in Lyon, it can be pushed on air. So, latakoo gives you the flexibility to do remote work and feel connected to the teams because they are able to access your material much quicker.”

Intelligence:

“latakoo itself is an intelligent tool. If you’re on a breaking story, you can upload material while you continue filming on your phone. So, say for example, there has been a major incident in central London and you just happen to be standing there at the time the major incident occurs, you can stop, pause and carry on. I can’t think of another kit that allows you to do that – to be able to film while you’re uploading. That is pretty intelligent.”

Speed:

“Because of the technology that latakoo has within it, which is pretty unique, you can upload on one bar. Like when I was out at sea last year. I had one bar, and occasionally no bars and then occasionally one bar. You can drop in and out of signal and you are not having to restart the send as you would with pretty much any other technology, including the LiveUs. I could do that safe in the knowledge that I could leave it all uploading in the background as I continued with my work. By the time I got back to the shore, what I needed to upload had already been uploaded. I hadn’t had to monitor it at all and that is pretty intelligent.”

A freelance correspondent for international networks, including Euronews, Luke Hanrahan covered the deadly flooding in Western Germany in July 2021.

De Rodez shares Hanrahan’s excitement for intelligent technology. He and his network have weathered the pandemic and other storms by embracing uncertainty and relying on just that kind of technology and capable crew members, including freelancers like Hanrahan, who will continue to supply a significant amount of content to fill their 24/7 news cycle. Having put in motion a system that uses technology and people that can quickly shift and change based on the currents, De Rodez and Euronews can keep sailing through fair or fierce winds.

The Switch – Pushing the boundaries: Evolving live event media rights into the streaming era

Robert Szabo-Rowe

SVP Engineering and Product Management


As the live events landscape heads full-throttle towards social media, streaming, and video-on-demand (VOD), rightsholders urgently need to find greater value from their media assets beyond the realms of traditional, live broadcast TV.

The rise of direct-to-consumer streaming platforms and the popularity of on-demand replays of major live events online via catch-up services and social media have made for a far richer content landscape. As such, traditional broadcast services must accommodate each consumer’s streaming preference.

Despite being among the most expensive media assets to acquire, live content has ironically had the shortest shelf life compared with films and TV shows. This situation raises questions for broadcasters and content owners paying hundreds of millions or even billions for a sports season, esports league, or entertainment event: where does the true value of that content lie, and are they getting the most out of their rights acquisitions?

Today’s comprehensive live TV environment spans pre- and post-game shows, red carpet interviews, studio interviews with celebrities and talent, and new creative forms of shoulder programming, creating a content continuum that extends far beyond a 90-minute Premier League football match, a 48-minute NBA game, or a two-hour awards ceremony. This content continuum of live and on-demand elements creates exciting opportunities for rightsholders to experiment with their assets and deliver engaging and relevant programming to a range of target audiences.

From live to on-demand

The exodus of traditional broadcast viewers toward digital platforms has been widely documented in analyst reports. Deloitte Global, for example, expects 2022 to be the last year traditional broadcasters retain the majority share of viewing hours in the UK, with other global markets expected to follow suit quickly.

The emphasis on linear coverage, and the acquisition of rights on the basis that the live broadcast is where the true value lies, has to be seriously rethought as viewing habits continue to evolve. Thankfully, advances in cloud-based technology for live production, clipping and editing, streaming, and video processing are making the transition towards rapid on-demand highlights creation on social media and streaming services easier than ever.

Broadcasters and rightsholders now have the tools and resources needed to seriously re-evaluate how they approach the acquisition of rights. The value of primetime live programming will always persist – particularly for high-value content such as NFL football, The Oscars, Major League Baseball, and top-tier European football. But the burgeoning streaming market calls for siphoning some live broadcast rights to a whole host of new content platforms.

Finding a flexible approach to live rights acquisition

Separating live broadcast rights between social media and streaming platforms could create an engaging, tiered offering, maximising the value of the ‘live’ content with a VOD platform allowing for streamed and catch-up views. We’re already seeing this trend emerge with platforms such as Apple TV+ and Amazon Prime, which have recently acquired rights to MLB and the NFL, respectively. The content owner’s D2C app could show clips in real time, with streaming and social media apps offering an on-demand broadcast.

Near-live clips are where much of the value of live sports may reside moving forward, and the faster content owners and rightsholders realise this, the quicker they can start monetising the process and growing their viewership. Media companies are already generating revenues from these applications through sponsorship, advertising, and subscriptions.

Accessible solutions for the whole industry

While the development of the content continuum is shaking up the rights picture for major national and global sports leagues and top-tier entertainment event organisers, it is likewise impacting regional and niche events, where often the only content available comes from fans filming and posting footage themselves.

The fast-growing capabilities of cloud technology mean sports federations, teams, esports leagues, and event organisers of all sizes and types can create their own broadcast-quality content at a fraction of the cost of traditional TV production. They can create a gateway through which fans consume live streams, VOD assets, and highlights through an app or a social channel.

Consumers now want more control than ever over the content they view, and rightsholders of all sizes must evolve their business models to accommodate this change. Doing so also protects the future of this content by cultivating new audiences, reaching more platforms, and ensuring content remains relevant and engaging for modern audiences and younger generations.

This is not to say the media industry should wave goodbye to the live broadcast – far from it. But rightsholders must recognise we’ve gone past the point of no return when it comes to the popularity of on-demand highlights, clips, social media posts, and game/match replays across time zones. The live event is simply the centrepiece of an entire content continuum of live and on-demand elements. By recognising how their market is evolving, leagues and rightsholders can open up new revenue streams, engage fans in new ways, and ensure their content remains relevant for the digital future.

Fabric – Metadata Management Platform – Build or Buy?

Andrew Holland

Director of Data Services


‘We can build that ourselves’

This is a common refrain from IT departments whenever SaaS products such as Fabric are offered for consideration. It’s certainly possible that your in-house IT team could build you a functioning bespoke metadata management platform - but should they?

Is it worth delaying your roll-out to allow for the build? Is it worth the investment and the risk? How do the ongoing maintenance costs compare against the cost of a SaaS subscription? There are a host of considerations to evaluate that boil down to one common question - should you ‘Build or Buy’?

In this article we’ll explore some of the factors.

Risk. Will it work? There is no guarantee that your IT team will be able to build all of the functionality you require, let alone deliver it on time and within budget. There is a lot of risk involved. Plug-and-play toolsets like Figma can give the unrealistic impression that it is possible to prototype and build fully functioning new applications quickly, but there is a vast distance between a mock-up and a fully fledged application, made up of time, effort, risk, and expense.

Delay. How long will it take to build your own platform? You could roll out Fabric now, with rapid, seamless integrations and powerful tools to deduplicate and clean up your catalog while you migrate your content metadata. Can you afford to wait?

    Cost. How much will it cost to build? Developers are notorious for underestimating, because prototypes can come together quickly. The reality is that building out a true enterprise application, with all the right APIs, user roles, and information security requirements, takes far longer than anyone realizes on first pass.

    Recurring cost = Subscription without benefits. There is often a false impression that building your own system requires a single, up-front investment, as opposed to the ongoing payments of a subscription model - essentially the ‘Cap-ex self-build delusion’. This is mistaken. Any self-built system will require ongoing updates, troubleshooting, API development, penetration testing, and maintenance. All of this work creates an ongoing cost that would be better deployed on a certified, supported product. The general consensus is that the annual cost of maintaining a custom self-built system is around 20% of the original build cost (https://westarete.com/insights/maintenance-costs-for-custom-software/) - effectively a subscription without any of the benefits.
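    To see what that 20% rule of thumb implies, here is a back-of-the-envelope comparison using purely hypothetical figures - not a quote for any real build or subscription:

```python
# Back-of-the-envelope build-vs-buy comparison using the ~20% annual
# maintenance rule of thumb cited above. All figures are hypothetical.

def total_cost_of_build(build_cost, years, maintenance_rate=0.20):
    """Up-front build cost plus ongoing maintenance each year."""
    return build_cost + build_cost * maintenance_rate * years

def total_cost_of_buy(annual_subscription, years):
    """Simple SaaS subscription over the same period."""
    return annual_subscription * years

build = total_cost_of_build(500_000, years=5)   # $500k initial build
buy = total_cost_of_buy(120_000, years=5)       # $120k/year subscription

print(f"5-year build TCO: ${build:,.0f}")  # $1,000,000
print(f"5-year buy TCO:   ${buy:,.0f}")    # $600,000
```

    On these illustrative numbers, maintenance alone doubles the build cost over five years - the self-build is a subscription in all but name.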

    Capex vs Opex. Strategic technology decisions should really be separated from the political minutiae of which budget to use. A decent SaaS provider will be able to offer you an up-front cost that can be covered from a Capex budget, or an ongoing subscription fee that can be paid from an Opex budget. The focus should be on how much value is being derived from the overall expenditure.

    Expertise. In-house IT departments will insist that no one knows your business as well as they do - that no one but them could make a custom service so perfectly suited to your business’ particular needs. But no one knows OUR business as well as we do. Our business is metadata management. We’re the market-leading experts. Your in-house IT department has a million tasks to contend with; they are never going to create a system that rivals what Fabric has to offer.

    Technical Debt. API keys are updated and changed. MAM systems are regularly updated. Business acquisitions and mergers require new systems integrations. As your in-house app evolves to respond to these updates it will accumulate fixes, patches and customizations that will require ever-increasing attention. Instead of a robust and widely used platform that has specialist support teams available - you will become dependent on a small silo of individuals in your IT department who understand the idiosyncrasies of your system.

    Sole Focus. Our only focus is to deliver a perfectly integrated and elegantly functioning metadata management platform. We have a track record of delivering catalog migrations, deduplications and systems integrations in record time, with long experience of the common problems faced by major organizations. This expertise serves all parties well. In comparison, in-house custom-built application projects can lose impetus as teams find ways to deprioritize them in favor of day-to-day business.

    Added Functionality. When a client requests a new feature from Fabric - we build it - then we make it available to all of our customers. We have added, improved and refined dozens of new features to our main platform, meaning that all of our customers benefit from upgraded service. Many of these features require considerable development investments that would not be cost effective if built in-house.

    Unified Front End. Our seamless 3rd-party integrations across your supply chain surface key data attributes (like avails from rights, or asset details from a MAM), giving unparalleled insights and helping teams get ahead of deliveries.

    Clearly, in the current market, buying a dedicated subscription SaaS product such as Fabric has the advantage over building an in-house solution. Why try to build an expensive solution that is outside your team’s direct area of expertise, when an excellent option is already available for immediate deployment? Ultimately, every business will have different requirements, and different degrees of in-house IT capability, and it is up to business leaders to identify what is best for their organization. Make sure you make the best choice - and do what’s right for your business.

    Fabric’s pioneering platform is in use by some of the world’s most prestigious studios, broadcasters and distributors. Find out why at www.fabricdata.com.

    CGI – Why Business Agility and Resilience are ‘Must-Haves’ for Today’s Digital Leaders

    Mark Hasselmann,

    Product Manager OpenMedia portfolio


    Business agility has become one of the most talked about organizational capabilities of recent years. Indeed, the ability to embrace new ways of thinking and reinvent businesses at pace and scale has become essential to success. Research suggests, for example, that the most agile companies outperform their competitors in revenue and profitable growth by a factor of two to three. The bottom line is there can be no doubt that agile business practices drive results across the organization.

    But what does ‘agile’ represent? The Agile Business Consortium, for example, points towards “agility in an organization’s culture, leadership, strategy, and governance that adds value to all stakeholders who operate in uncertain, complex, and ambiguous environments.”

    In practical terms, agile businesses can respond to external and internal opportunities and risks quickly, adapting faster and remaining customer-focused – key qualities in today’s fast-paced digital economy.

    Fortunately, compared to just a few years ago, more businesses have direct experience of the role, challenges, and benefits of an agile model than ever before. Given the COVID-19 pandemic, this is to be expected – organizations had to move quickly, focusing on core business activities, collaborating online, and innovating to solve problems they’d never experienced before. Among the varied benefits that came as a result, many organizations achieved record technology and process implementation times as they rushed to deploy new solutions to keep their businesses alive. At CGI we want to be able to detect trends and realize them for our clients. We are setting up lean startup workflows to kick off development fast and constantly monitor feedback on our innovations. Misguided targets can be corrected with shorter reaction times, and priorities can be adjusted thanks to high transparency within our organization and well-cultivated alignment between management and development teams.

    In addition to this acceleration in decision-making and the assurance of maximum transparency to support alignment and collaboration across all levels and departments, there have also been significant advances in leadership. Rebuilding governance around new organizational structures, together with cultural change, supports a growing trust in people from all directions, as management and staff alike adopt new ways of working.

    As part of our transition after the merger of OpenMedia into CGI, we invested a lot of effort in training and coaching our people in new ways of working. We re-shaped our agile processes, increased transparency, and pushed interoperability of our teams in all areas, from management through implementation and project teams to support and development teams. This complete vertical and horizontal consistency assures a shared understanding of our product vision and makes it possible to decentralize decisions – as we are all aiming at the same big picture and understand the business needs. Constant offerings of training, internal working groups, and communities around special topics ensure a dynamic view of necessities and possibilities. New collaboration ideas and growing alignment lead to a new culture and mindset within an organization. And this in turn supports the intrinsic motivation of our colleagues.

    Agility fuels resilience

    Having experienced first-hand the benefits of business agility, and, in some cases, its crucial role in adapting to rapid, existential challenges, many successful organizations also view agility through an additional lens: how to become more resilient.

    In the current challenging economic climate, this is even more relevant. Armed with the individual and collective lessons learned from the pandemic, for example, organizations are now focusing on sustaining progress and becoming more resilient. In doing so, they should focus on three key priorities: energy, security, and development.

    In particular, drive, positivity, and energy are required to fuel and maintain the changes required to initiate and sustain business agility. This requires that organizations operate in a safe and secure ecosystem that provides a foundation for ongoing transformation, enabling individuals and teams to build the capabilities that allow them to thrive.

    In doing so, they can address a wide variety of organizational objectives, from transforming for net zero, and delivering on environmental, social, and corporate governance (ESG) goals to redesigning value chains and shaping the future of work, among many others.

    Clearly, technology and data play an important role as commercial and government organizations continue to focus on improving their business and operating models. Recent research has revealed that organizations have modernized 37% of their applications, with 72% of this implemented on a cloud foundation. What’s more, they forecast that 72% of their applications will be modernized within the next 2-3 years.

    This accelerating pace of tech infrastructure modernization is crucial, not least because it helps organizations deliver a faster time to value. The good news is that the burgeoning SaaS solutions industry will only accelerate the process even further, giving businesses the ability to quickly adopt new technology services without the need for significant capital outlay.

    Agile digital leaders

    In addition, today’s agile digital leaders understand the increasingly strategic role played by partners and business ecosystems. Whether they help develop new business or operating models, augment teams with flexible resources or bring prebuilt accelerators and best practices, partners bring the speed and expertise required to be agile that is virtually impossible to replicate in-house.

    Chief among the challenges digital leaders face when building agility into their culture, processes, and technologies is creating value and growth. Although many executives (87%) say they have a strategy in place to become more digital, for example, only 20% say they are producing the expected results.

    To address this gap, effective digital leaders pivot from predicting and planning to sensing and responding. This requires the ability to connect strategy, operating model, and execution and create greater transparency and alignment. In doing so, they must understand the need to sense and respond to change quickly – they can only do that successfully when they have designed their business and operating models to be agile.

    Faced with the need to develop an agile business ethos, organizations must also recognize they cannot do this alone. Building a network of strategic partners is essential to long-term success and building the ability to adapt at pace and scale.

    EVS – Balancing cloud, on prem and edge deployments for greater production agility

    Oscar Teran,

    SVP SaaS Offering & Digital Channels, EVS


    Broadcasters at a crossroads

    In the broadcast and media industry, being able to respond quickly to evolving requirements, factors and environments is key. The lockdowns that swept the world during the Covid-19 pandemic confirmed this imperative to adapt, with many companies swiftly adjusting their broadcast infrastructures with remote and/or virtual workflows in order to stay afloat.

    These recent events have prompted a profound mindset shift. Media companies have grown less fearful of trying new ways of working and implementing new production workflows. Driven by their need for more flexibility, they have accelerated their move away from traditional setups, where resources are mainly centralized and provisioned for peak demand, towards more distributed and scalable ecosystems. The cloud is also becoming more prevalent, as broadcasters have learned to overcome their initial hesitancy and embrace the opportunities this technology provides beyond the well-known OTT and VOD applications.

    Yes, the cloud is gaining momentum, but hardware still has its place within broadcast infrastructures. In fact, on-premises devices and applications remain essential for many real-time operations in which deterministic SLAs and high-quality live workflows are required. Budgetary considerations are also important: beyond a certain volume, a full cloud setup may not always be the most cost-effective way to deliver a high-quality production.

    So how can broadcasters keep pace and make confident decisions in today’s transforming landscape?

    Balanced Computing: striking the right balance

    For EVS, it’s about striking the right balance. Together with our customers, we work on identifying the right combinations of hardware and software, and then selecting the deployment environment that best suits their needs - whether that’s edge, private or public cloud, or on-premises. This architectural philosophy, which we call Balanced Computing, gives us the flexibility to respond to more use cases in a more efficient way. In recent years, we’ve aligned our portfolio to this philosophy to give broadcasters and media companies the power to adapt their setups to any production requirement and to help them stay relevant in the future.

    As an example, we’re seeing rising demand for our cloud-based solutions and services as a way to complement existing on-premises infrastructures. This is especially the case for productions of major international events that continue to rely on EVS’ live production servers (the robust XT-VIA and the more recent software-defined XS-NEO) to ensure high-quality content ingest, but that also want to take advantage of the cloud to roll out parts of their broadcast processes in a highly flexible and scalable way.

    Widely appreciated for its storytelling capabilities, the XtraMotion service can be activated on demand to generate super slow-motion replays from any camera while meeting the strictest quality and turnaround requirements. EVS operators can seamlessly clip any content from any camera, render it to super slow-motion and play it back on air within seconds for added wow factor. While XtraMotion can be deployed entirely in the cloud, production teams can also choose to run it on-premises if they face limited connectivity at a venue or want ultra-low-latency access to the EVS SaaS offering.

    These big event productions can also benefit from deploying their delivery functions in the cloud, as proven by EVS’ MediaHub SaaS content platform. Designed to empower highly collaborative workflows, MediaHub is well suited to big events, as it allows rights owners to distribute content quickly and efficiently to rights-holders, who can simply ‘click and collect’ the assets they need from any location. This capability, combined with many embedded functions, gives content owners the possibility to offer new-generation virtual IBC services to their diverse community of rights-holders. At the same time, they can facilitate access to historical archive footage for additional monetization opportunities.

    Furthermore, MediaHub can be integrated with popular cloud editing suites, providing a seamless working experience regardless of where the footage is stored - at a remote location or in a cloud bucket.

    Conclusion

    In their quest for greater agility, many companies are on a mission to redefine their business models with cloud technologies. Yes, the cloud allows for flexible workflows, but legacy hardware infrastructures are still relevant in today’s landscape, and intelligent decisions need to be made about what should be used and where it should be deployed - which is why EVS offers both CAPEX and flexible OPEX solutions. By taking into account technical and business requirements, as well as budgetary considerations, EVS’ Balanced Computing approach allows broadcasters and media companies to swiftly adapt, improve and extend their workflows, and to embrace innovative tools without any disruption to their current infrastructures.

    Interra Systems – As SVOD Subscriptions Explode Globally, AI/ML and Cloud Technologies Are Key to Automating Closed Captioning, Subtitling, and Audio Dubbing Processes

    Manik Gupta,

    Associate Director of Engineering at Interra Systems


    By 2026, the number of SVOD subscriptions is expected to reach 1.64 billion globally — an increase of 491 million compared with 2021. To drive this growth, SVOD service providers such as Netflix and Amazon Prime Video are relying on international subscribers. Case in point, Netflix is now available for streaming in more than 190 countries.

    As video service providers look to globalize their content to reach untapped audiences, closed captioning, subtitling, and audio dubbing have become increasingly crucial elements of their operations. However, with roughly 6,500 different languages spoken around the world today, it is imperative for providers to take advantage of the latest technologies — including artificial intelligence (AI), machine learning (ML), and cloud-based solutions — to streamline these processes.

    Automating the Delivery of Closed Captions and Subtitles

    Historically, captioning and subtitling have been time-intensive manual processes. However, now that OTT service providers are managing a massive amount of streamed content for a global audience, the tide is turning toward automated solutions featuring AI and ML technologies that minimize captioning and subtitling costs while maximizing efficiency.

    Figure 1. AI/ML-based technology powers Automatic Speech Recognition for closed captions and subtitles

    Automatic Speech Recognition (ASR) and other ML technologies enable streaming providers to realize tremendous efficiencies in their media captioning and subtitling workflows, including faster reviewing, reduced turnaround time, and lower costs. ASR, in particular, allows video service providers to instantly recognize and translate spoken language into text, helping to streamline the creation of captions. ASR includes multiple components, offering streaming providers an all-in-one solution for the generation and QC of captions, subtitles, and audio dubbing.
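    To make the caption-generation step concrete, here is a minimal sketch (not Interra’s implementation) of how word-level ASR output might be grouped into SRT caption cues. The input format and the simple fixed-size grouping rule are illustrative assumptions; real ASR engines return similar word-level timestamps.

```python
# Sketch: turning ASR output (word, start time, end time) into SRT
# caption cues. The asr_result below is invented example data.

def to_timestamp(seconds):
    """Format seconds as an SRT timestamp HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def words_to_srt(words, max_words_per_cue=7):
    """Group timed words into numbered SRT cues."""
    cues = []
    for i in range(0, len(words), max_words_per_cue):
        chunk = words[i:i + max_words_per_cue]
        start, end = chunk[0][1], chunk[-1][2]
        text = " ".join(w for w, _, _ in chunk)
        cues.append(f"{len(cues) + 1}\n{to_timestamp(start)} "
                    f"--> {to_timestamp(end)}\n{text}")
    return "\n\n".join(cues)

asr_result = [("welcome", 0.0, 0.4), ("to", 0.4, 0.5),
              ("the", 0.5, 0.6), ("show", 0.6, 1.1)]
print(words_to_srt(asr_result))
```

    A production system would add line-length limits, reading-speed rules and shot-change awareness, but the core transformation from timed words to cues is the same.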

    Figure 2. Cloud-based Captioning Solution

    Moving to the Cloud

    The increasing adoption of cloud technologies is another key trend in video streaming. The global video streaming software market is expected to more than double over the next few years, growing at a CAGR of 18.5% to reach $17.5 billion in 2026 — up from $7.5 billion in 2021. This shift to the cloud by OTT video service providers is apparent across the entire media workflow, from encoding to QC. Using a cloud-based ASR system, they can reap all the benefits of the cloud to create captions and subtitles with increased flexibility, scalability, and cost-effectiveness.

    Automating Dubbing Workflows

    Audio dubbing is an essential part of streaming services, especially for video service providers offering content in many different geographies around the world. However, the manual dubbing of audio is a complicated process involving transcription, translation, and speech generation. Automation is key to bringing greater efficiency to the process. Through automation, video service providers can, for example, verify complex dubbing packages, including multiple MXF and .wav files, to ensure that package variations are accurate and that audio tracks are dubbed properly. Furthermore, automation can help video service providers confirm the preciseness of metadata package structures, while also checking that the number of audio tracks, channel configuration of dubbed tracks, and duration of the original audio track compared with dubbed audio tracks are correct.
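    A minimal sketch of those package checks - comparing each dubbed track’s channel configuration and duration against the original - might look like the following. The track-metadata structure here is a simplified illustration, not a real MXF/.wav package format:

```python
# Sketch of automated dubbing-package QC: verify that each dubbed
# audio track matches the original's channel config and duration.

from dataclasses import dataclass

@dataclass
class AudioTrack:
    language: str
    channels: int
    duration_s: float

def verify_dubbing_package(original, dubbed_tracks, tolerance_s=0.5):
    """Return a list of human-readable QC errors (empty = pass)."""
    errors = []
    for track in dubbed_tracks:
        if track.channels != original.channels:
            errors.append(f"{track.language}: channel config "
                          f"{track.channels} != {original.channels}")
        if abs(track.duration_s - original.duration_s) > tolerance_s:
            errors.append(f"{track.language}: duration mismatch "
                          f"({track.duration_s}s vs {original.duration_s}s)")
    return errors

master = AudioTrack("en", channels=2, duration_s=3600.0)
dubs = [AudioTrack("fr", 2, 3600.1), AudioTrack("de", 1, 3590.0)]
for err in verify_dubbing_package(master, dubs):
    print("QC FAIL:", err)
```

    In this example the French dub passes while the German dub fails both checks - exactly the kind of package variation automation catches before delivery.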

    Another way the industry is tackling audio dubbing challenges is through innovations in automation and AI. Using an AI-based, automated QC solution, service providers can check the synchronization between the dubbed track and the master track with greater efficiency to identify mismatches in the timing between audio and video. This is crucial to ensuring that there are no syncing issues.

    Recent advancements in AI can also help improve the proficiency and quality of audio dubbing, especially for language identification. In recent years, AI/ML algorithms have improved so much that automated QC systems can now detect the language spoken in any audio track with an accuracy of more than 90%. One key aspect of these models is that training takes only a few hours; once trained, the AI can predict the language spoken in the audio track. Content creators can then use metadata to verify that the detected language is correct.

    Maintaining Consistent Quality Across Different Regions

    With AI- and ML-based QC solutions, video service providers can ensure that OTT content delivered to different geographies maintains the outstanding quality today’s audiences demand. Moreover, with content going global, it is crucial to comply with strict regional and industry regulations. For instance, in the United States, AI-based QC tools can ensure content meets the relevant guidelines laid out by the Federal Communications Commission (FCC), the independent agency of the U.S. federal government that regulates communications by radio, television, wire, satellite, and cable across the country. Advanced QC tools can also apply algorithms to check the synchronization between audio and subtitles in different languages.

    Final Thoughts

    Advancements in AI and ML technology are helping service providers extend the reach of their content to global audiences and capture additional viewers. With AI/ML-based solutions, they can create and QC captions, subtitles, and audio dubs with greater speed and accuracy, and at scale, without heavily investing in manual labor. AI and ML technologies ensure a high quality of experience for global viewers on every device, reducing the chance for human error. In the future, streaming providers will need to embrace AI/ML and cloud-based QC solutions as much as possible, freeing up staff to focus on creative jobs like translating difficult audio segments and adding audio descriptions.

    Three Media – Don’t focus on cloud. Focus on opportunities from new generation operating models that the cloud enables

    Debra Slater 

    Managing Director, Three Media


    At the moment there is a huge amount of noise in the media industry concerning the cloud. Vendors are clamouring to say their solution offers something new, while users are, by and large, unsure and nervous about its exact value.

    The simple view of the cloud which is often quoted – giving all our data and processing to AWS or Alibaba or one of the others – is really not helpful when trying to navigate its value. The cloud is a far greater opportunity to reshape your business than simply where you host technical infrastructure.

    We should also stop obsessing over the cloud. If we have to talk technology, what is really important is the possibility of building software-defined solutions within a virtualised ecosystem, bringing together applications from all the best media specialist vendors. That virtualised ecosystem can just as easily be on premises as in the cloud, or in a hybrid architecture.

    This solution should obviously accomplish what businesses want today, but it should also provide a clear pathway to what they are going to need in the future. It is not just about choosing to run software-defined systems on someone else’s hardware, it is much more than that. It is important to take a step back and decide what the end game is. What are you, as a business, trying to achieve? How will you take your creative talents, plus your operational and technical skills, and make money from them?

    At Three Media we talk about the next generation operating model. What the software-defined approach allows us to do is to pick just the functionality we need and put it together in a way that makes sense to our customers’ specific requirements and commercial drivers.

    This is really important. From the day, exactly 100 years ago, when the BBC launched the first public radio station until this moment, the media industry has been defined by what the technology allowed us to do. We see the software-defined revolution as allowing media enterprises to define and realise goals across technical, operational, commercial and business operations in a way that no other technology change has allowed. Indeed, the commercial re-engineering is as exciting as the changes to legacy broadcast heavy iron!

    We see this as a great opportunity to realign the media enterprise. We should all be asking questions, and deliberating answers, around how to use smart business transformation to lead to better ways of working, to re-imagine the optimal business model, then realise it.

    Along the way, it means aligning your resources in technology, in people and in work processes, to perfect your value proposition. As technology today includes artificial intelligence and machine learning, it is perfectly reasonable to build a high degree of automation into the new generation operating model, so people are used for what people do best.

    These choices, these re-architectures, are business decisions. They may well be influenced by creative or operational decisions. But they are not technological decisions: the technology is evolving to be able to support the most demanding business ambitions. Simply moving your current models, processes and workflows into the cloud is a lost opportunity. The opportunity comes in reimagining and optimising your operating model.

    It is important to accept that there is no one-size-fits-all. Each media business will have its own definition of its next generation operating model, and its own pathway to get there.

    While creativity is still central to what we do and content is king, from a business perspective we have to move towards the concept of the virtual media supply chain. That approach cannot be piecemeal: it has to be glass-to-glass. But the pathways to getting there are very manageable.

    Content on its own does not give you the keys to the kingdom. Metadata plays a supporting role but in reality carries equal importance as we move towards a virtualised world. It is the backbone and engine driving the business, but we have to work the data harder, understand what it is telling us about our processes, and then use those insights to optimise the supply chain.

    At Three Media we are strong believers in process mining, which Wikipedia defines as “a family of techniques to support the analysis of operational processes, with the goal to turn event data into insights and actions”. Establishing it can appear complex and scary, but once it is running it can tell you all you need to know about how your business is performing, where the pinch points are, and where you can win new revenues with no extra investment.
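    As a toy illustration of that idea - using invented event data and a deliberately simplified (case, activity, timestamp) log format - the following sketch turns raw events into average step durations, which is where pinch points in a media supply chain show up:

```python
# Process-mining in miniature: derive average transition times between
# workflow activities from an event log. Event data is invented.

from collections import defaultdict
from datetime import datetime

events = [  # (case_id, activity, timestamp)
    ("asset-1", "ingest",  "2023-01-01 09:00"),
    ("asset-1", "qc",      "2023-01-01 09:30"),
    ("asset-1", "deliver", "2023-01-01 13:30"),
    ("asset-2", "ingest",  "2023-01-01 10:00"),
    ("asset-2", "qc",      "2023-01-01 10:20"),
    ("asset-2", "deliver", "2023-01-01 15:20"),
]

def average_step_minutes(events):
    """Average elapsed minutes between consecutive activities per case."""
    by_case = defaultdict(list)
    for case, activity, ts in events:
        by_case[case].append((activity,
                              datetime.strptime(ts, "%Y-%m-%d %H:%M")))
    totals = defaultdict(lambda: [0.0, 0])
    for steps in by_case.values():
        for (a, t1), (b, t2) in zip(steps, steps[1:]):
            totals[f"{a} -> {b}"][0] += (t2 - t1).total_seconds() / 60
            totals[f"{a} -> {b}"][1] += 1
    return {k: total / n for k, (total, n) in totals.items()}

for step, minutes in average_step_minutes(events).items():
    print(f"{step}: {minutes:.0f} min on average")
```

    Even this toy version makes the pinch point obvious: QC-to-delivery dwarfs every other step, so that is where investment pays off first. Real process-mining tools do the same thing across millions of events, discovering the process map rather than assuming it.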

    As a consultancy we have been fortunate to work with some of the biggest broadcasters and media companies around the world, tackling just these issues (and will be very happy to talk to many more). Our real-world experience suggests that, in the software-defined domain, the problems of interoperability and interconnectivity go away. The technology becomes the servant of the business, where solutions are led by service and customer defined needs.

    And you really do not need to throw away what you have today. Once you have decided where you want to go – in business terms – then you can structure transition at your pace to meet your business case, continually evaluating what you have achieved so you can respond to new operating opportunities and sector demands.

    You will find that the digital dividend never ends. It will give you the flexibility to evolve quickly and easily to ensure your competitive and service advantage is realised.

    We talk a lot about agility in the cloud, but it is more important to be agile in the ways you tackle your goals. Do not think of this business transformation as three years of intensive analysis and coding leading up to a big bang. That is not the real route to seizing much greater opportunities. Agility is the key. Each step must be defined carefully so that, once achieved, it can be learnt from and its benefits counted.

    Yes, we are in a transition from ground to cloud. But the important part is not the technology, it is the business transformation these new technologies allow us, in operations and in our commercial models. The new generation business model – the media supply chain – if done properly, will create competitive advantage, retain and boost audiences, control costs and maximise revenues.

    Witbe – Using Test Automation to Improve Streaming Quality and Team Agility

    Mathieu Planche, 

    CEO, Witbe


    We’ve all been there before. You’re watching the most exciting moment of the season finale of your favorite show when the video freezes. Or maybe the application crashes, or the audio drops out, or there’s a connection issue that prevents you from opening the app in the first place. In that moment, these small annoyances of the digital world are the most frustrating thing imaginable.

    The content providers, developers, and QA testers behind streaming video apps feel that frustration too. In an ideal world, every video would be delivered perfectly whenever you press play. In reality, it has never been more complicated.

    Let’s consider a single streaming app, like Netflix or Hulu. These apps are available on dozens of different platforms, such as iOS, Android, Smart TVs, Fire Sticks, gaming consoles, and more, which are then deployed all over the world on a wide variety of networks. They use dozens of different frameworks, involving dozens of different companies and partners. How could a single app developer possibly test performance on every single setup?

    The same can be said for user interactions. Just last month, an unprecedented number of viewers tuned in to a major streaming platform to watch a live premiere, crashing the app for thousands of users. With so many factors to consider, how can streaming video providers properly prepare?

    The Importance of Quality of Experience

    It’s essential for content providers to measure the Quality of Experience (QoE) that their customers receive at home. Simply testing the quality of the videos that are stored on their servers isn’t enough. Ensuring a high QoE for viewers across every platform they use is the best way to stand out from the competition and retain customers.

    Maintaining a high QoE requires testing and monitoring. QA teams test new software releases around the clock to ensure that updates launch with as few bugs as possible. After a software release, monitoring video performance across all available platforms and networks is equally important. Even with unlimited resources, accomplishing these goals solely through manual work would be a huge challenge. This is why automated testing and proactive monitoring are so crucial.

    A Universal Approach

    Automated testing allows developers and their teams to program testing scenarios and view the results in real time. Automated technology can interact with software in the same way as human users, allowing teams to systematize certain aspects of their workflow and remain agile, while focusing more manual effort on pain points.

    A major benefit of using test automation is the ability to test more platforms simultaneously. If a QA team is testing a new software release on a Set-Top Box, they can simultaneously run automated tests to see how the same software performs on Smart TVs, mobile devices, and web browsers – thus ensuring consistent performance quality across every device. Knowing how each software release affects each specific platform is invaluable.
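    The idea of running one release's test scenario across the whole device matrix at once can be sketched as follows. This is a minimal illustration, not any vendor's actual tooling: the platform list and `run_playback_test` stub are hypothetical stand-ins for a real device-farm driver.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical platform list; a real suite would target actual devices.
PLATFORMS = ["set-top-box", "smart-tv", "ios", "android", "web"]

def run_playback_test(platform: str) -> dict:
    """Stub for an automated playback check on one platform.

    A real implementation would launch the app on the device, start a
    stream, and measure startup time, rebuffering, and playback errors.
    """
    return {"platform": platform, "passed": True, "startup_ms": 850}

def test_release_across_platforms(platforms=PLATFORMS) -> dict:
    # Run the same scenario on every platform concurrently, mirroring
    # how one software release is validated across the device matrix.
    with ThreadPoolExecutor(max_workers=len(platforms)) as pool:
        results = pool.map(run_playback_test, platforms)
    return {r["platform"]: r for r in results}
```

    A single call then yields a per-platform report, making it easy to see which devices a release regressed on.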

    Reducing Stress

    Automated testing is also valuable for replicating user behavior in ways that human testers cannot. It’s difficult to manually test how a video streaming app is performing after fifteen hours of continuous use or constant channel changes. This type of usage is inevitable when the software is widely released. Automated testing helps cover the stress, endurance, and performance testing that is impossible to accomplish manually.
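    A stress scenario like the constant channel changes described above might look like the sketch below. The `StubPlayer` class and failure model are assumptions for illustration; a real harness would drive a physical device and verify that video actually renders after each change.

```python
import random

class StubPlayer:
    """Stand-in for a driver that controls a real streaming app."""
    def __init__(self, failure_rate: float = 0.0):
        self.failure_rate = failure_rate
        self.channel = 0

    def change_channel(self, channel: int) -> bool:
        # A real driver would send the remote-control command and then
        # confirm that frames are rendering on the new channel.
        self.channel = channel
        return random.random() >= self.failure_rate

def channel_change_stress(player, iterations: int = 10_000) -> dict:
    """Hammer the app with rapid channel changes and tally failures,
    the kind of endurance run no human tester can sustain."""
    failures = 0
    for i in range(iterations):
        if not player.change_channel(i % 500):
            failures += 1
    return {"iterations": iterations, "failures": failures}
```

    Because the loop is automated, the same scenario can run for fifteen hours as easily as fifteen minutes, surfacing leaks and degradation that only appear under sustained use.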

    This approach also allows teams to better divide their focus. When technology is tackling the most tedious and taxing parts of the workflow, team members can turn their efforts to the more hands-on aspects of testing. In other words, operators are achieving results around the clock – even while the team is sleeping – and resources are used more efficiently throughout the process.

    Staying Proactive

    Proactive monitoring technology uses a similar approach to measure QoE on publicly available software: it runs on physical devices located in any market you are testing. Using unique algorithms, it analyzes video streams in real time and assesses them against the same criteria a human viewer would apply.

    Proactive monitoring is particularly helpful in the event of a service disruption or interruption. The technology can immediately send alerts that an asset is not available or that the streaming quality has dipped, allowing video operations teams to resolve the problem before customers even notice. This reduces the usual time and resources it would take to identify, and then fix, an issue. Staying proactive is the best defense in combating service interruptions.
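    The alerting logic behind such a monitor reduces to a simple loop: score each stream, compare against an acceptable threshold, and flag anything that dips below it. The threshold, scale, and `measure_qoe` stub here are hypothetical; a real monitor decodes the stream on a physical device and scores it algorithmically.

```python
QOE_THRESHOLD = 3.5  # assumed minimum acceptable score on a 1-5 scale

def measure_qoe(stream_url: str) -> float:
    """Stub for the per-stream analysis step. A real implementation
    would play the stream and score picture quality, buffering, and
    availability in real time."""
    return 4.2

def check_streams(stream_urls, measure=measure_qoe, threshold=QOE_THRESHOLD):
    """One monitoring pass: score each stream and collect alerts for
    anything below the QoE threshold, so the operations team hears
    about a dip before customers do."""
    alerts = []
    for url in stream_urls:
        score = measure(url)
        if score < threshold:
            alerts.append({"stream": url, "qoe": score})
    return alerts
```

    Run on a schedule across every monitored market, a pass like this turns quality dips into immediate, actionable alerts rather than customer complaints.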

    Work More Efficiently, Not More Hours

    Maintaining a high QoE for your customers isn’t easy, but it’s the key to long-term success. Teams working on video streaming content can stay agile by relying on automated testing and proactive monitoring. This approach allows operators to divide resources more effectively and ensure their software runs well on a wide variety of platforms and networks.

    Remember that feeling of frustration when the perfect video moment was ruined by an app crash? How different it would have been if everything had gone smoothly. Test automation and proactive monitoring help streaming video providers get closer to that goal and work more efficiently along the way.