TMT Insights – Flipping the Script: Media and Entertainment Companies Re-evaluate to Achieve Business Resilience

Andy Shenkler, CEO & Co-Founder, TMT Insights

The broadcast industry – and media and entertainment (M&E) overall – looks very different than it did five years ago, and it’s nearly impossible to accurately predict what it will look like in five or 10 years. After all, if you had asked M&E executives – or any executives, for that matter – in February 2020 to predict the near-term future, it’s highly unlikely that any would have said that a period of unprecedented growth in consumer consumption, and a world focused on remote accessibility, would begin just weeks later. Who knew?

“Who knew” is the main reason for a heightened focus on business resilience: an organization’s ability to adapt and pivot in the face of industry disruptions while maintaining continuous business operations and protecting its people, its customers, intellectual property, assets, and brand reputation.

Media and entertainment is one of the fastest-moving industries, with an accelerated rate of technology development constantly driving innovation across the content landscape. The past few years have seen media organizations reimagine every aspect of their operations to continually find new ways of attracting, engaging, and retaining audiences that have more viewing and content options than ever. Fueling this unprecedented choice is the proliferation of diverse streaming services, whether new VOD platforms, FAST channels, or new services launched by traditional broadcasters.

Now, after such a prolonged period of rapid expansion and development, M&E companies are faced with increasing budget pressures, emerging technologies and workflows, and audiences with higher entertainment expectations and increasingly discerning tastes. The global demand for diverse media across multiple platforms and formats is increasing and shows no signs of slowing. While it’s true that more content can create more business, it can also strain M&E operations almost to a breaking point.

These shifting industry dynamics are coupled with the significant macroeconomic trends occurring globally over the past few years: economic uncertainty amid concerns over a coming recession, supply chain breakdowns, rising utility costs, geopolitical conflicts, and the changing nature of work as organizations struggle to strike the right balance of onsite/WFH for their needs while meeting the changing expectations of their increasingly distributed workforces.

If for the past few years companies got to enjoy the honeymoon of the content boom, they are now settling into a long, sustainable marriage with their customers, and it’s time to take a hard look at business models with a focus on long-term benefits for everyone. We’re seeing this happen more and more across the customers we support, and we’re not alone.

A recent study by Deloitte notes that continued change in the media and entertainment business is a certainty, as “studios and video streamers face the reality of their own market disruption, trying to find profits in a less profitable business,” not only competing with each other for attention, time, and revenues, but also with social media, user-generated content, and gaming.

Which brings us back to the focus on business resilience. An Ernst & Young survey reported that “37% of executives said that without reinvention their companies would cease to exist in five years” – and that was before the pandemic; those numbers have only risen in subsequent post-COVID surveys. A recent poll by SAS found that organizations recognize they must be resilient if they are to survive, let alone thrive. Yet while 97% of business executives believe having a resilience strategy in place is important, only 47% believe their organization is properly prepared.

The state of the industry

In our current world, business resilience is linked to the trend of organizations paring down, including through massive layoffs, as they deal with the impact of the last five years and their massive acceleration of technology buildouts. We’re starting to see a transition from a time when companies were in an arms race to develop content infrastructures, essentially throwing money and bodies at everything, everywhere, to move as quickly as possible. Now there’s an increasing urgency to regain fiscal control, even as companies serve a consumer base that continually expects more.

Organizations of all sizes are wrestling with how to proactively optimize their financial forecasting, and some are even looking to return to more traditional CAPEX economic models that offer a level of predictability without massive recurring costs, while still maintaining a cloud presence. Business leaders in nearly every vertical and market sector are refocusing their priorities to remain as agile and forward-looking as possible, often fundamentally changing how they operate to deliver maximum efficiency.

The potential benefits certainly include significant cost savings and greater operational efficiency, but they also extend to greater resource management, enhanced customer insights, increased productivity, profitability, and agility.

Business resilience is key to organizations gaining a competitive edge by adapting quickly and acting strategically. The fundamental challenge is building a team that can continue to support that level of ongoing growth with an infrastructure capable of avoiding costly downtime, identifying and preventing vulnerabilities, and maintaining business operations in the face of unexpected intrusions.

True business resilience requires a holistic mindset that takes into consideration every aspect of an operation: security, business continuity, and data protection, and even shifting behaviors when it comes to technology funding sources.

Previously, low interest rates made funding easily accessible to media organizations and tech start-ups. Raising debt was relatively painless because, with interest rates so low, companies were basically borrowing free money.

That’s not the case anymore. Now there is more scrutiny over profits, resulting in less funding for tech companies and a change in how companies invest their money, carefully examining where they put every penny. There’s clearly more pressure on teams to make do with less.

As companies are doing everything possible, within financial reason, to ensure the business resiliency of their organization, they can’t lose sight of the resiliency of their customer base and how they can defend themselves from sharp changes in consumer behavior. At the end of the day, everything else is unimportant if you don’t have your customer base secured.

Criticality and risk tolerance

How many businesses really understand their risk tolerance? Do you know yours? Achieving resilience requires an understanding of the criticality of processes, assets, data, and user information, and the level of disruption a brand is willing to tolerate. Then, businesses can properly evaluate the corresponding business impact on the organization and its internal and external audiences: customers, employees, partners, and shareholders.

This often means asking some tough questions, starting with: “What does the recovery process look like for us after a catastrophic event?”, “What is the timeframe for that recovery?” and “How long can I live with the effects of that catastrophic event?”

As companies increasingly rely on third parties like Amazon or Google for their infrastructure, they are conducting more careful risk assessments of their entire operations. Many are also adopting an approach of “risk-based resilience,” recognizing that not all business processes, capabilities, and services are equally critical, and that they don’t all warrant the same level of response.

Now more questions need to be asked: “Are we okay with being down for six hours if that’s the amount of time it takes us to get back up, or are we okay with being down for an hour but maybe we lost the last three hours of data?” “How long does it take me to bring everything back up again without having to spend a fortune while still providing my core services to my key customers?”

Those are the types of trade-offs people are confronting.
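In disaster-recovery terms, this is the classic trade-off between recovery time and recovery point. A back-of-the-envelope comparison, sketched below with entirely hypothetical figures (the function and all numbers are illustrative, not drawn from any real operation), shows how the two options in the question above might be weighed:

```python
# Illustrative sketch (hypothetical figures): comparing two recovery strategies
# by their cost per incident. Strategy A restores slowly but loses no data;
# Strategy B restores fast but replays from an older backup, losing some data.

def incident_cost(downtime_hours, data_loss_hours,
                  revenue_per_hour, rework_cost_per_hour):
    """Cost of one outage: lost revenue while down, plus the cost of
    re-creating or re-ingesting the data that was lost."""
    return (downtime_hours * revenue_per_hour
            + data_loss_hours * rework_cost_per_hour)

# Hypothetical numbers for a mid-sized streaming operation.
REVENUE_PER_HOUR = 50_000   # revenue at risk while the service is down
REWORK_PER_HOUR = 20_000    # cost to recover one hour of lost data

strategy_a = incident_cost(6, 0, REVENUE_PER_HOUR, REWORK_PER_HOUR)  # slow, lossless
strategy_b = incident_cost(1, 3, REVENUE_PER_HOUR, REWORK_PER_HOUR)  # fast, lossy

print(f"Strategy A (6h down, no loss): ${strategy_a:,}")
print(f"Strategy B (1h down, 3h loss): ${strategy_b:,}")
```

With these particular numbers the fast-but-lossy strategy costs less; change the ratio of downtime cost to rework cost and the answer can flip. The point is that the decision is a business calculation, not a purely technical one.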

Over the past few years, we’ve all experienced major culture shifts, and learned new definitions of the word “resilience” as we’ve adapted how we conduct our personal lives, social interactions, work habits, and even our expectations of what long-term success and growth look like.

Business resiliency is no different, requiring a flexible mindset and a willingness to change, and having the right plans and the right partners, to weather any storm.


LTN – Creating a roadmap to success in the new digital media era

Brad Wall, CTO, LTN

The media industry is experiencing a paradigm shift. In the digital-first era, competitive advantage is driven by digital strategies. Media companies increasingly leverage IP technology to define new business models and tap into more audiences across over-the-top (OTT), digital, and free ad-supported streaming TV (FAST) platforms.

In a competitive and fragmented media landscape, media and tech players are aggressively targeting the streaming market to get a piece of the pie. High-value content still separates the winners from the losers, but it’s not enough on its own. Media organizations need to ensure that their business is future-proofed and that they can extract the most value from their high-profile content to boost their bottom line, today and in the future.

Winning the premium content battle

Global audiences are hungry for premium content. This content creates loyalty, stickiness, and new monetization opportunities with the right global distribution and regionalization capabilities. Media businesses bringing top-tier live events to cross-platform audiences worldwide while meeting the demand for more curated content experiences, such as niche sports and entertainment, are set for success.

Efficient and reliable global distribution and regionalization have become mission-critical capabilities, made possible by the latest IP innovation. Previously, delivering niche, regionalized content to global viewers required heavy investment in dedicated technology infrastructure, with little reward. Now, with the right IP-based multicast video transport network and versioning solutions, media companies can leverage global distribution and seamless regionalization capabilities. As a result, they can bring niche content to new and diverse international markets efficiently, opening the door to global growth and new revenue streams.

Effective regionalization and growing audience reach mean more bang for media companies’ buck, especially when it comes to costly sports rights and licensing fees. Top-tier sports rights are the key battlefield for several players, from established media brands and streaming services to tech giants. Customizing live sports content for cross-platform, multi-geography audiences reliably and at scale is the revenue driver industry players can no longer afford to ignore.

Fueling monetization opportunities

For the media industry, content has always been king. Nonetheless, the value of premium live content has never been greater, paving the way for a key revenue stream: advertising. Broadcasters and rights holders know that premier live sports are gold in the advertising world. Investing in high-value content assets, like premier live sports, means tapping into a global audience in the billions. The monetization potential is vast.

At the same time, increased investment is fueling creativity in the media to capture the consumers’ attention and boost engagement. Content creators turn to gamification and immersive technologies to create unique and compelling viewing experiences, including personalized streams and game-like experiences. This is the time for media businesses to leverage efficient ways to create, version, deliver, and monetize high-profile content as well as capitalize on growing market opportunities.

Riding the digital tide with the right tech partner

Scaling premium live content to digital audiences globally on any device and platform is a key competitive differentiator for media brands. Being able to create, version, monetize, and deliver premium live content at scale is a real test for media brands, requiring best-in-class infrastructure, advanced software workflows, and dedicated expertise. New market entrants and traditional media players face the challenge of not having the right tech infrastructure to deliver the required content tonnage and customization to meet the growing demands of today’s media market. Existing tech toolsets that aren’t ‘digital-ready’ can’t cut it and are simply slowing down industry players at a critical moment in the industry’s transition to IP and the cloud.

Moving to a digital-first mindset isn’t straightforward. The digital journey requires careful planning, mapping, and deep industry expertise; it’s not about throwing new technology into the mix and expecting it to magically deliver ROI and revenue growth. Media companies are businesses, and as such, they want to focus on what they know best: their content, audiences, and business differentiators, not technology.

This is why media companies need the right tech partner to navigate change and drive business growth. Investing in future-proofed technologies that can quickly and seamlessly adapt to market and consumer consumption shifts is a recipe for success for today and tomorrow.


Iyuno – Business Resilience in Media Localization and Iyuno’s Services

By Allan Dembry, Chief Technology Officer at Iyuno

In today’s rapidly evolving world, businesses face a variety of challenges that can impact their operations, from supply chain disruptions to economic uncertainty. For companies in the media localization industry, these challenges can be particularly acute, given the need to navigate a complex and rapidly changing landscape of technologies, standards, and content formats.

The media and entertainment industry is constantly evolving, with technology and artificial intelligence (AI) playing a significant role in disrupting workflows and creating new service offerings and revenue streams[1][2]. In this context, business resilience is crucial for companies like Iyuno, a leading media localization provider offering technology-driven end-to-end localization solutions for various platforms[3].

Technology and Activities Boosting Business Resilience

Security and Data Privacy

One of the most pressing concerns for any business in today’s digital age is security and data privacy. With the rise of cyber threats and other forms of digital disruption, companies must be vigilant in protecting their data, systems, and intellectual property. This is especially true for companies in the media localization industry, which handle sensitive content and intellectual property on a daily basis.

At Iyuno, we take security and business continuity very seriously. We have implemented a range of measures to protect our systems and data, including robust firewalls (NGFW), end-to-end encryption protocols, AI-driven network monitoring, cloud security best practices, extensive staff training, and regular security audits, along with a host of proprietary software, hardware, and workflows that protect our clients’ intellectual property. In addition, we have established redundant systems and backup processes to ensure that our operations can continue in the event of a disruption.

Another key concern for businesses in the media localization industry is data privacy and protection. With the increasing amount of data being generated and shared across the industry, it is essential that companies take steps to safeguard this information and ensure that it is being used in a responsible and ethical manner.

As the demand for localization services grows, the need to ensure the security and privacy of clients’ data is paramount[6]. This includes implementing robust data protection measures, such as encryption and access controls, to safeguard sensitive information and comply with data privacy regulations[7].

Business Continuity and Supply Chain

In today’s globalized economy, disruptions to the supply chain can have a significant impact on businesses of all kinds.

At Iyuno, we have established a robust supply chain management system to ensure that we can continue to deliver high-quality services to our clients, even in the face of disruptions. Due to our extensive network of owned and operated facilities, the largest of any media localization company, we are far less reliant on external partners to maintain capacity and supply. Our strategy also includes working closely with our partners and suppliers to identify potential risks and develop contingency plans to mitigate them.

Business continuity is essential for media localization providers to maintain their operations during disruptions, such as supply chain issues or economic downturns[8]. Localization companies can achieve this by diversifying their supply chain, investing in redundant infrastructure, and implementing contingency plans to minimize the impact of disruptions[9].

Standardization and Interoperability

As the media localization industry continues to evolve, one of the key challenges facing companies is the need to ensure interoperability and standardization across a wide range of technologies, formats, and platforms. This is essential to ensure that content can be delivered seamlessly across multiple channels and devices, and that it can be accessed by audiences around the world.

Standardization and interoperability are crucial for media localization providers to ensure seamless integration with their clients’ systems and workflows[10].

Inflation, Scarcity, and Complexity

Being prepared to navigate a range of macroeconomic challenges, including inflation, scarcity, and complexity, is also key. This requires a deep understanding of the market and a willingness to adapt and evolve in response to changing conditions.

When the landscape is evolving and changing, Iyuno’s philosophy is to seek new opportunities and evolve what we are offering so we can always find transformative methods of bringing value to new and existing clients.

Inflation, scarcity, and complexity are challenges that media localization providers need to address to remain competitive and resilient[5]. By investing in advanced technologies, such as AI and machine learning, we can optimize operations, reduce costs, improve quality and adapt to changing market conditions[2].

Current Trends in the Entertainment Industry

Streaming Restrictions and Contractions

The streaming landscape is experiencing contractions, with companies limiting growth on content spending and laying off employees[4]. This trend highlights the need for media localization providers to adapt to changing market conditions and focus on delivering high-quality services that cater to the evolving needs of their clients[12].

Growth of Niche Streaming Platforms and Expansion of Content Libraries

Niche streaming platforms are on the rise, offering specialized content to cater to specific audience preferences[1]. As a result, media localization providers need to expand their services to support a wider range of languages and formats, ensuring that their clients can reach diverse audiences worldwide[13].

Integration of Live TV and Music Streaming Services

The integration of live TV, podcasting, online content and music streaming is another trend shaping the entertainment industry[1]. Media localization providers need to adapt their services to support these new formats, ensuring that their clients can deliver a seamless and engaging experience to their audiences[14].

The media localization industry is facing a range of challenges and opportunities in today’s rapidly evolving world. By focusing on these key areas, companies can build resilience and adaptability into their operations, ensuring that they can continue to deliver high-quality services to clients around the world.

Citations:

[1] https://blog.gitnux.com/entertainment-trends/

[2] https://vistatec.com/artificial-intelligence-and-localization/

[3] https://iyuno.com/services

[4] https://www.bloomberg.com/news/newsletters/2022-05-22/layoffs-and-cost-cuts-usher-in-a-new-era-for-streaming

[5] https://www.ey.com/en_us/tmt/five-trends-to-watch-in-media-and-entertainment

[6] https://techcrunch.com/2021/03/17/how-localization-leader-iyuno-media-group-opens-entertainment-to-international-audiences/

[7] https://www.mckinsey.com/capabilities/risk-and-resilience/our-insights/localization-of-data-privacy-regulations-creates-competitive-opportunities

[8] https://www.businesstoday.in/opinion/columns/story/heres-how-to-unlock-business-resilience-through-localisation-341109-2022-07-11

[9] https://www.alertmedia.com/blog/business-resilience-2022-webinar-recap/

[10] https://www2.deloitte.com/us/en/pages/technology-media-and-telecommunications/articles/media-and-entertainment-industry-outlook-trends.html

[11] https://www.welocalize.com/ai-dubbing-the-future-of-multimedia-localization/

[12] https://www.transperfect.com/blog/8-trends-will-shape-media-and-entertainment-industry-2023

[13] https://commit-global.com/the-effect-of-technological-disruption-on-localization-services/amp/

[14] https://www.smartcat.com/online-events/lfh_talks_talkslfh3/


Edgio – How the buy vs build challenge can be leveraged for success in today’s market

Eric Black, CTO, General Manager of Media, Edgio

The current economic situation has meant many viewers are looking carefully at how much they’re spending on subscriptions, which has a knock-on effect on the whole industry. At the same time, media companies are facing an increase in the cost of creating content. These challenges are forcing them to re-align their business models to reduce operating spend and maximize average revenue per user (ARPU); the latter is leading to experimentation with varied monetization methods, such as subscription, ad-funded, and free ad-supported TV (FAST) syndication models, and in many cases a hybrid approach. They must also identify optimizations in their technology stack to offset the escalating cost of buying or creating high-value content and to drive cashflow-positive business models in a rapidly evolving landscape.

In recent years, it has been fairly common practice for the industry’s largest organizations to keep their tech in-house to capitalize on the “streaming wars”. However, we are now at a stage where delivering a best-in-class experience to viewers is becoming increasingly complex and costly. To create greater efficiencies in their technology while remaining competitive, many media companies are increasingly turning to third-party vendors. In this article, I’ll explain why.

Flexibility vs. complexity

Flexibility is essential in today’s market. The current economic climate means audiences may not be willing to pay for multiple subscriptions; as such, media companies are experimenting to find the best way to deliver and monetize their content on both first- and third-party platforms. There’s no one-size-fits-all approach: each streaming service has its own distinct audience and priorities, and must try out different options to find its winning strategy for growth and ARPU.

Some may need to offer tiered subscription/advertising models; others may need to utilize FAST channels; others may charge more for premium features, like 4K or Dolby Atmos. Technology never stands still, and with the challenges of multi-platform support, budget restrictions, and changing audience demands adding to the difficult economic landscape, flexibility is key. The market is evolving so quickly that media companies must have flexibility built into their tech stack from day one to adjust strategies at short notice. A benefit of outsourcing some of these functions is that it’s the job of partner specialists to anticipate changing demands in operating efficiency and feature functionality, to prepare for them, and to pick up the associated cost, which can be spread across several of the vendor’s customers rather than one media company going it alone and absorbing it all.

Time-to-market is business-critical

Another benefit of outsourcing specific technology is that partner specialists can offer a quicker time to market by leveraging technology investments already made, which, in business terms, is crucial for securing audience and revenues. With a partner vendor, the baseline tooling already exists, and the expertise is there to customize, deploy, and manage it. The in-house route requires engineers to build a lot of technology from scratch, which takes time and tends to result in delays and overspending. For media companies looking to establish themselves in the streaming market, time-to-market is critical.

Best-in-breed technology

Leveraging partner technology in your digital workflow brings a best-in-breed approach to technical deployment. Learning from a larger pool of client workflows is integrated into world-class solutions that deliver cost efficiency, a higher level of reliability, and increased resiliency. Taking advantage of open frameworks to identify technical solutions that are fit for purpose has proven, in countless software deployments spanning multiple industries, to deliver better customer results than bespoke builds. Identifying which IP should be owned versus partnered on will remain a critical decision that OTT providers continue to re-evaluate as the industry evolves.

Media companies as an SI

Even if streaming technology is built in-house, media companies with largely owned tech will likely be stronger in some areas than others. Larger firms with big engineering teams may act as their own system integrator (SI) for the full end-to-end technical stack, but even they benefit from external thought leadership from trusted partners and SIs to help optimize both first- and third-party workflows and technical integrations, based on all aspects of the technical and revenue equation.

Others need a larger relationship with a trusted partner to take on the SI role and provide a complete end-to-end solution based on existing upstream systems and workflow. In both cases, flexibility is essential to allow variation in revenue strategy over the long term. With trustworthy partners in place, media companies can remove all of the technology headaches and help their operators sleep at night. These partners must provide flexibility and operational robustness in equal measure by having effective and efficient technology that will support their customers’ goals, allow them to focus on what they do best – great content – and enhance their brand.

Taking the next steps

After weighing up the factors of the buy vs. build approach, media companies are in an ideal position to mitigate the hurdles they are facing today. Ultimately, media companies must decide what they think is the best approach for their audience and their business and the most effective way of delivering it. With the right strategy in place, there is every reason to believe streaming services can navigate short-term economic pains and be successful over the long term.


DW Innovation – Exploring Transparent Robustness for AI Powered Media Tools

Birgit Gray, Innovation Manager

Broadcasters and media companies continue to implement systems powered by Artificial Intelligence (AI) across different workflows. While there is increasing awareness of the need for AI components to be trustworthy in terms of transparency, explainability, and fairness, aspects such as security and resilience are also important when it comes to trust.

Birgit Gray from DW Innovation provides insight into the concept of AI Robustness and her collaborative work in the AI4Media project to make an AI component in one of DW’s media tools more robust, and therefore more trusted.

What is AI Robustness?

Robustness refers to the overall resilience of an AI model against various forms of attacks.

AI powered media systems could be subject to attack by malicious actors, who aim to manipulate the system’s AI model to take control or influence its behaviour and results. For example, an adversary could obtain access to the deployed model of the AI component and perform very minor imperceptible alterations to the input data to significantly reduce the accuracy of such a model. Robustness Evaluation entails an assessment of the model’s vulnerability to different types of attacks as well as the testing of defence mechanisms to see how well they perform under attack and how they change the model’s level of accuracy.
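The kind of minor, imperceptible alteration described above can be illustrated with a deliberately simple sketch. The toy linear “detector” below is hypothetical (it is not the model discussed in this article); with full white-box knowledge of its weights, nudging every input feature by at most 0.002, i.e. 0.2% of the pixel range, is enough to flip its prediction:

```python
import numpy as np

# Illustrative white-box sketch on a hypothetical toy linear model: knowing
# the weights, the attacker shifts every pixel a tiny amount in the worst-case
# direction (an FGSM-style step), flipping the prediction even though no
# single pixel changes by more than 0.002.

rng = np.random.default_rng(0)
n = 1000
w = rng.normal(size=n)                  # model weights, known to the attacker
x = rng.uniform(size=n)                 # an "image": pixel values in [0, 1]
b = 0.5 - w @ x                         # bias chosen so x scores just positive

def predict(inp):
    return 1 if w @ inp + b > 0 else 0

eps = 0.002                             # maximum change allowed per pixel
x_adv = np.clip(x - eps * np.sign(w), 0.0, 1.0)  # nudge each pixel against the score

print(predict(x), predict(x_adv), float(np.abs(x_adv - x).max()))
```

The per-pixel change is bounded by eps, yet because every one of the 1,000 features is pushed in a coordinated direction, the model’s score swings far past its decision boundary; this is exactly why accuracy under attack has to be measured rather than assumed.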

There are different types of adversarial attacks on AI systems. They can relate to the intent and technique employed by the attacker (such as evasion, poisoning, extraction, or inference), or they can be categorised by the attacker’s level of knowledge and access to the targeted AI system. In the case of a white-box attack, for instance, it is assumed the attacker has obtained full access to and good knowledge of the AI model. For black-box attacks, such access is limited, but the attacker can still succeed in influencing the model by sending information to it and receiving information from it. For example, a public, or otherwise accessible, API of the service would allow an attacker to query it with some input data and receive a result, thus enabling them to determine inputs that could deceive the model. Whilst the risk of white-box attacks can be reduced by implementing appropriate IT security provisions to restrict access to and knowledge of the model, the same provisions cannot be extended to protect against black-box attacks, which will always require consideration whenever a model is being deployed.

Robustness and Trustworthy AI

Robustness is one of the Trustworthy AI principles, together with Transparency, Fairness, Privacy, Explainability and Governance. These principles are designed to increase AI-related trust, user acceptance and security, but also support managerial assessment, AI implementation, and legal compliance. The trustworthiness of an AI component can be enhanced by applying algorithmic trustworthy AI tools to the AI model and by supplementing such evaluations with documentation in the form of accessible Model Cards or Fact Sheets which openly discuss the vulnerabilities and limitations of the model in an understandable and meaningful way.

Robustness is an important stand-alone element, but it is also connected with other principles such as Governance and Transparency. After conducting a Robustness Evaluation at the component level, it is important to also provide corresponding documentation in a Model Card or Fact Sheet. Only then is Transparency achieved for end-users and managers regarding the level of resilience and security of an AI component. It also enables Governance at the corporate level, for example when assessing compliance with AI Guidelines.

Image by Google DeepMind on Unsplash

Towards Transparent Robustness in AI4Media

Deutsche Welle (DW) runs a practical use case in the EU co-funded AI4Media project together with ATC iLab. Based on our requirements, we received a set of advanced AI functionalities from several technology partners in the project for integration and testing in a demonstrator version of the Truly Media platform for content verification. This included an AI powered Deepfake Detection service from the MeVer Group at CERTH-ITI (CERTH).

While the prediction given by this AI service can assist users with verifying a suspicious video, the final decision of whether it is synthetically generated or manipulated remains with the human analyst in a media organisation. For this reason, it was DW’s goal to explore how the level of trust in this new component can be increased, for end users such as verification specialists, but also for media managers who need to assess new AI components in the context of AI Guidelines or for commercial integration into an existing media tool. Trust also relates to knowing how robust the Deepfake Detection service remains if it is subjected to an adversarial attack.

Following DW’s requirements, the Deepfake Detection component was evaluated and enhanced in terms of Robustness by the component owner CERTH in close cooperation with the expert partner in AI4Media for algorithmic trustworthy AI technologies, IBM Research Europe – Dublin (IBM). DW then developed requirements to assist IBM in producing the right kind of transparency information: technical information for the component’s Model Card and more business-oriented input for a co-authored User Guide for media managers in non-technical language.

Seven Steps: Robustness Evaluation and Related Transparency

The workflow to conduct a Robustness Evaluation and develop suitable transparency information consisted of seven steps, which are summarised below:

1. Ensuring that there is a technological match between this AI component and IBM’s algorithmic robustness evaluation tool.

2. Integrating IBM’s open source Adversarial Robustness Toolbox (ART) into the processing pipeline of the Deepfake Detection component.

3. Deliberately subjecting the datasets used by the model of the Deepfake Detection service to a white-box attack, using Projected Gradient Descent (PGD), and measuring the results.

4. Conducting a black-box attack using HopSkipJump. Here it is simulated that an attacker influences the output predictions of the Deepfake Detection service for a specific input video by sending information to and receiving information from the model, slowly learning which alterations of the deepfake video are required to evade detection by the AI service.

5. Identifying and testing defence mechanisms, such as JPEG Compression or Spatial Smoothing, that may protect the model from adversarial attacks but can also impact the model’s accuracy.

6. Describing the simulated attacks and their influence in the Model Card of the Deepfake Detection service to make the results of the Robustness Evaluation transparent.

7. Developing a User Guide for managers in non-technical language that allows for an assessment of the component’s level of resilience/security, by explaining AI Robustness, the stakeholders and processes involved for this AI component, possible security scenarios and the outcome of the Robustness Evaluation.
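The white-box attack in step 3 can be illustrated with a toy sketch. This is not the actual ART/CERTH pipeline: the "detector" below is a stand-in linear model invented for illustration, but the loop shows the core PGD idea of stepping against the gradient while projecting the perturbation back into a small epsilon-ball.

```python
import numpy as np

# Stand-in "deepfake detector": score > 0 means the input is flagged as
# fake, score <= 0 means real. Purely illustrative, not the CERTH model.
rng = np.random.default_rng(0)
w = rng.normal(size=16)           # stand-in model weights

def score(x):
    return float(w @ x)           # > 0 -> "fake", <= 0 -> "real"

def pgd_attack(x, eps=1.5, alpha=0.1, steps=40):
    """Projected Gradient Descent: repeatedly step against the gradient of
    the detection score, then project back into an eps-ball around the
    original input so the perturbation stays bounded."""
    x_adv = x.copy()
    for _ in range(steps):
        grad = w                                  # d(score)/dx for a linear model
        x_adv = x_adv - alpha * np.sign(grad)     # push the score down
        x_adv = np.clip(x_adv, x - eps, x + eps)  # project into the eps-ball
    return x_adv

x = np.sign(w)                    # a strongly "fake" input for this toy model
x_adv = pgd_attack(x)
print(score(x) > 0, score(x_adv) > 0)  # the bounded perturbation flips the prediction
```

In the real evaluation, ART supplies the attack implementation and the model gradients come from the actual Deepfake Detection network rather than a fixed weight vector.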

Lessons Learned

This explorative work in the AI4Media project showed the value of assessing vulnerabilities and the robustness of AI components used in media tools. Describing the results of such a Robustness Evaluation increases the level of trust in an AI component, especially related to the Trustworthy AI principles of Transparency and Governance. Such descriptions can be provided for both technical audiences (e.g., in a Model Card), but also for other stakeholders in the media organisation, using a business-oriented approach and non-technical language.

Another learning was related to the many stakeholders involved when an AI component is developed by a third party, deployed in a media tool that is operated by an external technology provider, and then used within the media organisation. This highlighted potential security interfaces in the context of attack scenarios, as well as the need for tailored AI transparency information for different target groups.

About AI4Media

With 30 European partners from the media industry, research institutes, academia, and a growing network of stakeholders, the EU co-funded AI4Media research project has several dimensions: it conducts advanced research into AI technologies related to the media industry, develops Trustworthy AI tools, integrates research outcomes in seven media-related use cases, analyses the social and legal aspects of AI in Media, runs a funding programme for applied AI initiatives and establishes the AI Doctoral Academy AIDA.

For resources from the AI4Media project, visit the results section on the project’s website, containing White Papers from the use cases, an in-depth report on AI & Media, as well as specific reports on legal, social, and trustworthy AI aspects. The project also provides open data sets and AI components via Europe’s AI-On-Demand Platform.

CGI: The crucial role of Business Agility in driving sustainable transformation

Leveraging Technological Advancements for Sustainable Broadcasting

Michael Thielen, Vice President, Radio Solutions, CGI

The rise in importance of sustainability in the Broadcast industry has been one of the most significant meta trends in recent years. Driven onwards by both public and investor sentiment, and given further impetus by the increasingly observable effects of climate change, sustainability has become central to many industry roadmaps.

Even with recent survey data indicating that businesses will tend to choose sustainable options only where there is a perceivable ROI, technological developments over the past decade have made that easier to achieve. Many of the systems, processes, and technologies that underpinned the Broadcast industry have been jettisoned in favor of more efficient, agile, and, almost by chance, sustainable alternatives. Procurement practices that would have thought nothing of installing large machine rooms and adding as much power and cooling as necessary have, by economic necessity, been replaced with a more sustainable outlook. These changes might be driven by the twin engines of lower costs and faster ROI to begin with, but the consequence, lower energy use, is the same.

Equally, the wider industry ecosystem has evolved. ESG (Environmental Social Governance) has replaced CSR (Corporate Social Responsibility), providing a more accountable method of assessing company performance in all areas relating to the wider ecosystem a company operates in. Increasingly, these policies are being seen to have an impact at both the macro and the micro level: everything from a company-wide focus on reducing the carbon costs of travel budgets to assessing the CPU load of individual processes in competing software packages.

With the industry at a critical inflection point represented by both the rise of streaming models and the pivot towards IP-based workflows, there is a clear opportunity to ensure that its future progress is built on more sustainable policies than the SDI era. However, for that to be truly effective a holistic overview is required, and that is why business agility is a critical component in ensuring the sustainable future of the industry.

The importance of business agility

So, what do we mean by business agility? Essentially it is the ability of an organization to be able to rapidly adapt to market and environmental changes in ways that are both productive and cost-effective. The agile organization understands that it is not a monolithic entity operating in isolation but is enmeshed in a complex series of flows that are constantly dynamic. It is able to quantify these changes, even predict them, adapting itself to the new environment and being perfectly placed to take advantage of any new opportunities that it presents.

According to the Agile Business Consortium, “agility in an organization’s culture, leadership, strategy, and governance . . . adds value to all stakeholders who operate in uncertain, complex, and ambiguous environments.”

“Uncertain, complex, and ambiguous environments” is almost a definition of the post-pandemic broadcast industry as it moves towards a streaming-first future. The point is that by being agile, companies can respond to changing ESG requirements and opportunities far more quickly than those using traditional business models. As part of that they can swiftly implement policies that have a positive impact on the environmental aspect of their operations.

Mention of the pandemic is appropriate too. We are used to the acceleration that COVID-19 brought to the deployment of various new models across the industry. The mainstream adoption of the remote production of live events, the transition to the cloud, hybrid working models and more have all happened several years more quickly than they would have done if the pandemic had not taken place. But it is important to acknowledge that the speed with which organizations had to adapt to a shifting patchwork of global lockdown regulations to remain in business, acting swiftly, prioritizing core business activities, collaborating online, and innovating to solve new and extremely challenging problems, has also fundamentally rewritten corporate culture.

Promoting a Culture of Sustainability

Many organizations have seen significant improvements in leadership and governance as a result, with a newly empowered skilled workforce, particularly amongst the Millennial cohort, looking closely at a company’s ESG record when it comes to choosing between employers. Adopting new approaches and business practices, including sustainability, has transitioned from being a value add to a necessity as a result.

This ranges from the obvious, such as using renewable energy sources or accelerating recycling programs, to more strenuous actions such as committing to net zero by a certain date. The agile company’s ability to collaborate and share information effectively, both internally and externally across its networks, allows it to identify and address sustainability issues more efficiently, and to cope when criteria change, such as the recent shift that has made carbon neutrality via offsetting a waypoint rather than the ultimate destination.

The changes the industry is going through will involve a fundamental retooling of equipment and workflows over the next decade. Now, therefore, is the perfect time to implement policies that place energy efficiency higher up the list of tech purchasing criteria as inefficient legacy solutions are replaced, as well as examine and overhaul workflows to minimize the carbon footprint of an organization’s employees, which is typically a consequence of travel and commuting. Adopting web-based applications, setting up connected workgroups, and ensuring seamless data exchange can reduce this, but the organization has to be open to the changes they represent. If they are, they will often find that there is a bottom-line benefit to the new policies, delivering them a rare win-win of increased environmental sustainability and ongoing business success.


Agama: From data to sustainable actions: How video service providers can drive sustainability with video analytics tools

Dora Voicu, Marketing & Communications Director, Agama

Sustainability has become a top priority for companies across all industries as the world becomes more aware of the disastrous effects of climate change. Video service providers are no exception: they need to identify ways to improve energy efficiency, reduce waste, ensure regulatory compliance, and unlock new business opportunities. In recent years, there has been growing concern about the environmental impact of video services. Companies are now seeking sustainable ways to deliver their services while reducing their carbon footprint, and this is where sustainability analytics comes in. By collecting and analysing data, video service providers can gain insights into their environmental impact and develop strategies to mitigate it.

What is sustainability analytics?

Sustainability analytics is a tool that companies can use to assess their environmental impact, identify areas for improvement, and assess the performance of sustainability initiatives.

Sustainability analytics can use internal and external data sources, allowing video service providers to create a comprehensive view of their impact on the environment. The data collected can help operators understand their carbon footprint and identify the areas that require improvement. They can use the data collected to develop a roadmap for sustainability that will guide their decision-making.

The negative environmental effects of video consumption

Global video streaming is a significant contributor to greenhouse gas emissions. According to one piece of research, the sector currently accounts for around 1% of total emissions, a share anticipated to rise by 2025 if no serious action is taken. Despite the smaller screen sizes, studies have indicated that streaming video on mobile devices produces more CO2 than viewing on a TV, owing to the increased energy used for data processing and transmission through cellular networks. To address this expanding environmental threat, more awareness and action are required.

Video services are, in general, delivered primarily over the Internet. The servers that host and deliver these services use large amounts of energy, leading to significant carbon emissions.

Additionally, the waste generated by the production and disposal of the electronic devices used for video consumption has a significant negative environmental impact: these devices can contain hazardous materials that are harmful to the environment and human health.

Agama’s Sustainability Analytics initiative for sustainable video operations

The Sustainability Analytics initiative developed by Agama is an important step in assisting video service providers to understand and reduce their impact on the environment. These tools examine many aspects of a provider’s operations, from the energy used in data centres to that consumed by in-home devices.

On a broad level, Sustainability Analytics aims to help video operators make data-driven choices to reduce their environmental impact. Our goal is to help video service providers evaluate the environmental effect of their services, devices and applications, and identify alternative actions that can influence it. To do this, our initiative analyses consumption indicators to give insights into energy use trends and breakdowns over time.

As an example, by looking at the total use of specific services across different types of devices, the energy impact of changing the coding or bitrate can be understood, and thus be part of a structured process to continuously reduce its negative effects on the environment.
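The bitrate example above can be made concrete with a back-of-the-envelope sketch. All per-device power figures below are invented assumptions for illustration, not Agama data; a real analytics tool would derive them from measured consumption indicators.

```python
# Hypothetical per-device delivery + decode cost, in watts per Mbit/s of
# streamed video. These coefficients are invented for the sketch.
WATTS_PER_MBPS = {"smart_tv": 1.2, "mobile": 2.4, "stb": 0.9}

def streaming_energy_kwh(device, bitrate_mbps, hours):
    """Rough energy estimate for streaming at a given bitrate on a device."""
    return WATTS_PER_MBPS[device] * bitrate_mbps * hours / 1000.0

# Energy effect of a codec change that drops the mobile bitrate from 8 to
# 5 Mbit/s over 30 viewing hours:
saving = streaming_energy_kwh("mobile", 8, 30) - streaming_energy_kwh("mobile", 5, 30)
print(round(saving, 3), "kWh saved per viewer")
```

Multiplied across millions of viewing hours, even a small per-viewer saving like this becomes the kind of trend a structured reduction process can track over time.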

Another important aspect is to support tracking and reporting energy use over time for ESG and regulatory compliance. We can track the success of sustainability projects while monitoring real-time energy usage, suggesting areas for improvement, and identifying inefficiencies.

Earlier this year, we made significant progress towards our goal of increasing environmental sustainability within the video streaming industry. In addition to developing our own sustainability analytics tools, we made a strategic decision to join the Greening of Streaming (GoS) organisation. This move reflects our firm commitment to drive real change and support environmentally conscious practices across the industry. As a member of the GoS community, we are joining a dedicated movement of like-minded organisations working to build a more sustainable future for everybody.

Sustainability analytics can also help a company identify inefficiencies and areas for improvement, leading to more efficient operations. Additionally, collecting data on sustainability initiatives can help to track their progress and identify best practices that can be replicated across the industry.

The negative environmental impact of video consumption is an issue that video service providers can no longer ignore. As video consumption continues to increase, the need for sustainable methods of delivering video services becomes more urgent. Sustainability analytics tools can help video service providers reduce their environmental impact, increase cost savings, and improve their brand reputation.

We can certainly say that video consumption has a huge environmental effect and cannot be disregarded. As end-users expect more engaging and high-quality video content, video service providers must ensure that their operations are environmentally friendly and sustainable. By adopting sustainability analytics, video service providers can take an active role in safeguarding the environment for future generations.


Accedo: Assessing the Impact of OTT UI/UX Strategy on Energy Consumption

François Polarczyk, Sustainability Director, Accedo

To achieve net-zero emissions by 2050, the video industry needs to consider every aspect of its complex ecosystem, and that includes reducing the energy consumed by end-user devices when streaming content. There’s been a lot of focus over the years on developing electronic devices to be as energy efficient as possible, as well as on improving the energy efficiency of the overall media chain (servers, encoders, cloud storage, etc.). What’s less well known is the impact that changes made at application level can have on energy consumption when content is streamed. And crucially, what impact do changes made at application level for the purpose of reducing energy consumption have on the User Experience (UX)? This is a critical factor to establish because UX is a central pillar of any successful video service. After all, if video providers have to choose between UX and sustainability, UX will win every time. But does UX have to be compromised in the quest for a more sustainable application? Or is it possible to reduce energy consumption on OTT devices by optimizing the application and using energy-efficient UI/UX strategies, all without sacrificing user experience?

You can’t answer these questions without investigating how the energy efficiency of devices is affected by changes made at application level, whether from an engineering or a UX perspective. To gather information in this important area, we recently embarked on a project exploring sustainability in video operations. We compared the energy usage of various streaming devices along with several popular streaming applications, and we tested various scenarios at application level, implementing changes around bitrate, resolution, and dark mode versus light mode, to evaluate the resulting energy variations. We also explored UI and UX techniques that can contribute to improved energy efficiency.

User engagement and interaction

Video streaming services are designed to engage users and capture their attention for as long as possible. This is all very well, but it’s important to recognize that some of the functionality designed to do this may be contributing to increased energy consumption. Take autoplay functionality, for example, where a video service will automatically play previews, trailers, or the next episode. During our testing on Smart TV devices, when focusing specifically on trailer autoplay, we found that energy consumption varies depending on the actual content being played. Blades of Glory, when on trailer autoplay, used more power than Dead to Me, mainly due to the presence of lighter pixels in Blades of Glory, which increases energy usage. We found that the average energy consumption was between 1W and 18W higher when the autoplay feature was enabled. On the other hand, when testing on an STB (we used a Sky Puck), we noticed an approximate increase of 1W in energy usage solely attributed to the trailer autoplay feature, excluding the screen.

From these findings, we can see that implementing trailer autoplay can lead to a notable increase in energy consumption, which may not be desirable from a sustainability perspective. An alternative may be to provide click-to-play videos or offer users the option to enable or disable autoplay settings. If autoplay functionality is utilized, it may be worth considering reducing the resolution and bitrate quality for these videos, which may help to minimize the long-term environmental impact.

Auto shut-off and power-save features have long been used on TVs as a means to save energy, and are clearly important to incorporate into streaming services. Estimates of the carbon footprint of streaming an hour of video-on-demand are around the 36gCO2 to 55gCO2 mark, so continued streaming when the user is not watching the content should be prevented. By incorporating “Are you still watching?” functionality that stops content from playing when there is no active viewer, a live stream or a binge-watching session can be automatically brought to an end after a set time or number of episodes, unless the user intervenes to confirm they are still watching.
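The “Are you still watching?” logic can be sketched in a few lines. The class name and the three-episode threshold are illustrative assumptions, not a description of any particular service’s implementation.

```python
class AutoShutOff:
    """Minimal sketch of an 'Are you still watching?' guard: after a set
    number of episodes with no user interaction, autoplay stops and the
    service should prompt the viewer instead of continuing to stream."""

    def __init__(self, max_unconfirmed_episodes=3):  # threshold is an assumption
        self.max_unconfirmed = max_unconfirmed_episodes
        self.unconfirmed = 0

    def episode_finished(self):
        """Return True if the next episode may autoplay, False if the
        service should pause and prompt the viewer."""
        self.unconfirmed += 1
        return self.unconfirmed < self.max_unconfirmed

    def viewer_confirmed(self):
        """Any viewer interaction resets the idle counter."""
        self.unconfirmed = 0

guard = AutoShutOff()
print([guard.episode_finished() for _ in range(3)])  # [True, True, False]
```

The same counter could equally be driven by elapsed time rather than episode count for live streams.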

Improving the efficiency of user interactions and streamlining the user journey not only improves UX, but it can also result in faster interactions. This can reduce energy consumption by minimizing the number of steps required and saving time. One potential approach worth considering is facilitating the use of voice control across various devices. By encouraging users to utilize voice commands instead of manual navigation with a remote control, an application’s overall usability could be significantly improved.

Empowering users

It’s generally accepted that in sustainable design, users should be provided with information regarding sustainability and energy consumption. This can empower users to make more sustainable choices while fostering a sense of understanding and acceptance when changes are implemented. As evidenced in research, offering real-time feedback on energy consumption or energy savings has proven highly effective in promoting energy savings. For instance, summarizing energy consumption during a specific time frame can help users grasp their energy usage patterns and encourage more conscious energy behavior.

Implementing these ideas may help to cultivate a greater sense of environmental responsibility and could, at the same time, enable users to actively participate in sustainable practices. It’s important to note, however, that there are currently certain limitations to the implementation of such features: it’s not yet fully possible to get reliable data on the live energy consumption of the network infrastructure.

Another way to empower users is to offer options that allow content to be streamed more sustainably. To take it one step further, consideration should also be given to making the sustainable options the default. When providing users with choices, it is important to recognize that sustainable defaults, such as click-to-play trailers instead of autoplay, can significantly influence user decisions. Similarly, incorporating buttons that allow users to skip the intro, skip the summary, or play the next episode during undesired sequences also empowers users and can produce energy savings. By providing these options, users have the flexibility to bypass unnecessary content, ultimately reducing the overall time spent watching and, consequently, the energy consumed.

Reducing energy consumption

When considering the overall energy consumption of an application, numerous critical decisions are already made during the design phase. Efficiency and usability are pivotal concepts that must be thoroughly evaluated in order to create a more sustainable platform. With a more efficient user experience, the time required to accomplish desired tasks can be reduced, which in theory results in a decrease in energy consumption. Improved usability and accessibility will also facilitate users’ access to the application, regardless of their circumstances, thereby minimizing user friction.

Accedo’s research has identified several impactful changes that could potentially reduce the energy consumption of video streaming without sacrificing performance or user experience. These include using energy-efficient devices such as smart TVs or STB+TV combinations (rather than a TV paired with a games console, which has much higher energy consumption), optimizing codecs such as SVT-AV1 and x265 (which consume less energy without sacrificing quality), and implementing features like disabling autoplaying content, skip intro/summary, and dark mode. Our research has shown that darker pixels in content or a homepage dramatically reduce energy consumption compared to lighter content. These changes can be quickly checked by content owners or broadcasters as a sustainability health check, ensuring they are providing energy-efficient options to their viewers and avoiding unnecessary energy consumption.
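Why darker pixels save energy on emissive panels (such as OLED) can be illustrated with a simple linear power model. The coefficients below are invented assumptions for the sketch, not Accedo measurements, and real panels are not perfectly linear.

```python
# Illustrative linear model of emissive-panel power versus frame brightness.
BASE_WATTS = 20.0    # assumed panel draw for an all-black frame
RANGE_WATTS = 80.0   # assumed extra draw going from all-black to all-white

def panel_power(mean_luminance):
    """mean_luminance in [0, 1]: average brightness of the displayed frame."""
    return BASE_WATTS + RANGE_WATTS * mean_luminance

dark_home = panel_power(0.15)    # dark-mode home screen
light_home = panel_power(0.75)   # light-mode home screen
print(light_home - dark_home, "watts saved by the darker layout")
```

Under these assumed numbers a dark-mode home screen cuts panel draw by more than half, which is the intuition behind preferring darker UI themes and content grading.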

While our research has provided valuable insights, there is still a huge amount of work to be done investigating UX changes and improvements that will help with the sustainability of an application or device. While there are many uncertainties around sustainability in video applications, one thing is certain: the industry must strive to strike a balance between engaging users effectively and being mindful of energy consumption, promoting a more sustainable approach.


Live events orchestration: why a flexible CMS solution is integral to business success – Simplestream

Adam Smith, Founder & CEO, Simplestream

What are the biggest challenges in managing live event schedules with content provided by multiple operators? The answer can be rather straightforward, and it comprises several key aspects, mostly related to the pain points platform owners are facing today when distributing content across digital channels.

Firstly, ensuring that live content ingestion and integration are handled with the right set of tools is the foundation for flexibility when delivering multiple outputs. In a workflow that normally relies heavily on manual processes, with several teams involved, the quality and reliability of content provided by different operators need to be ensured throughout. And ultimately, scheduling and distribution across different touch points are a focus for successful platforms. With often large numbers of stakeholders deployed to manage the workflow, a permission-based system becomes necessary to support seamless operation without jeopardising the overall effectiveness of the platform.

Media Manager for live events orchestration

Can a backend solution with an easy-to-navigate UI make life easier? It certainly can, and that’s the reason why at Simplestream we built our value proposition on the foundation of a flexible, modular content management system, Media Manager: a product that has progressively become more powerful, enabling operators to own the entire workflow for their video content, both on demand and live.

Its most recent evolution, Live Scheduler, is a bespoke enhancement of the built-in module for live events and has marked a new milestone in solving the challenges outlined above. It has adapted Media Manager to an even more flexible architecture, one that can also act as an orchestration layer, fully self-service for the client. It provides an opportunity for platform owners of any size to bring multiple event data management capabilities under the same roof, with additional benefits when it comes to monitoring, scheduling, and permissions for different teams.

A use case in the telecommunications space

A telco based in the North American region chose Simplestream to enhance an existing set of OTT workflows, including all the origin servers, APIs, and encoders. With no existing UI, and a backend system relying solely on manual processes, the organisation was limited in the number of content partners it could onboard, and looked for a solution that could be cost-effective and ‘self-service’. The objective was to streamline an otherwise cumbersome and error-prone process of manually inputting the documentation to correctly set up live schedules for multiple events in sport, but also across faith, news, and entertainment in general.

Thanks to Live Scheduler, every content provider can access an easy-to-navigate platform, operating in a cloud-based, fully agnostic environment. The backend acts as middleware with a custom UI, with the added benefit of reducing the margin of error to a minimum, as well as allowing the client to expand the number and scale of its content providers without limit.

The added value naturally sits at the heart of the solution. On one hand, Media Manager acts as a bridge between the source of content and the output of the broadcast. On the other, teams of operators from third-party providers can seamlessly access the backend and see – at a glance – what the scheduling parameters are, ensuring a smooth operation. It was key for the client to be able to open a gateway for its content providers to fully manage their events (from their creation to updates, deletion and monitoring) against a pre-integrated API. The information is subsequently passed through the CDN for the final stages of the process before it gets delivered to IPTV or apps.

A permission-based structure provides third parties with multiple tiers of access: from user-group auditing to seamless monitoring of the health and status of the service, with the ability to roll back to a previously auto-saved version of the schedule in case of mistakes.
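The tiered-access and auto-save rollback ideas above can be sketched as follows. The role names, permission sets, and version-history mechanism are all invented for illustration and are not Simplestream’s actual implementation.

```python
# Hypothetical permission tiers: each role maps to the actions it may perform.
PERMISSIONS = {
    "auditor":  {"view"},
    "monitor":  {"view", "health_status"},
    "operator": {"view", "health_status", "edit", "rollback"},
}

class LiveSchedule:
    """Toy schedule that auto-saves a version on every change."""

    def __init__(self):
        self.events = []
        self._history = [[]]                     # auto-saved versions

    def update(self, role, events):
        if "edit" not in PERMISSIONS[role]:
            raise PermissionError(role)
        self.events = list(events)
        self._history.append(list(events))       # auto-save on every change

    def rollback(self, role):
        if "rollback" not in PERMISSIONS[role]:
            raise PermissionError(role)
        self._history.pop()                      # discard the mistaken version
        self.events = list(self._history[-1])    # restore the previous one

s = LiveSchedule()
s.update("operator", ["match_1", "match_2"])
s.update("operator", ["match_1"])                # mistaken edit
s.rollback("operator")
print(s.events)  # ['match_1', 'match_2']
```

An auditor or monitor role attempting `update` would raise `PermissionError`, which is the gatekeeping that keeps third-party teams from jeopardising each other’s schedules.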

On the horizon

Content aggregation is among the most debated topics in the streaming space today. The future might hold something big for platforms looking to simplify and accelerate content onboarding, becoming themselves an aggregator. The aforementioned solution for the orchestration of live events goes perfectly hand in hand with the needs and wants of big telcos, of course, but appeals to media brands or rights holders in the live news, sports, betting, and entertainment spaces too.

Picture this: an operator wants to promote an innovative service that relies on user-generated content to populate daily programming schedules. Media Manager and its Live Scheduler can become a powerful ally for content creators (regardless of their experience), who can access the backend from anywhere with basic camera equipment or even just a smartphone with a data connection. This would allow them to spin up a service that’s cost-effective and suitable to be permanent or event-based (for instance, in the case of multi-sport events with a set timeframe, to be shut down as they end).

The natural next step in the evolution of such a product is virtual channels, with optional monetisation opportunities via ad insertion. Simplestream’s Channel Studio can support more innovative approaches to a deeper user experience. With multiple live events happening across one or more days, is there a chance to bring everything together in the same pre-built playlist of content? The answer, again, is yes, with the invaluable benefit of content providers adding any video on demand (VOD) asset to fill potential downtime, all in a useful queue system to be approved by the platform operators at a higher level.