IABM Journal 2024 content focus and deadlines

Here are the main themes and deadlines for the four 2024 editions of the IABM Journal.

Articles submitted should be 800-1000 words plus supporting images, including a headshot of the author where possible.

We will also consider publishing articles outside the main themes of each edition. Opinion articles addressing key current industry issues are in particular always welcome.

Please contact the editor for more details: roger.thornton@theiabm.org

March 2024 – Q1

Main theme: Creativity & Orchestration

Editorial focus: We’ll cover tech and activities that boost creativity. This includes both creativity and technical innovation using physical, virtual, or immersive methods. We’ll also look at the impact of AI and orchestration on creative sectors.

Keywords: Creativity, Collaboration, Innovation, Content, Metaverse, Immersive, Imaging, Creation, Production, Culture, Automation, AI etc.

Content chain: Create, Produce

The latest developments that are enhancing creativity, collaboration, and workflows at the front end of the BaM Content Chain®.

Big issue: Orchestration and creativity

How will the next wave of orchestration and automation impact jobs in content creation and production?

Sustainability: what progress is the industry making?

Deadline for finished materials: 15 March

————————————————————————————————–

June 2024 – Q2

Main theme: Resilience & Talent

Editorial focus: We’ll cover tech and activities that boost business and technical resilience. This includes security, business and data protection, geographical diversification etc. We’ll also look at the importance of talent in MediaTech.

Keywords: Security, Business Continuity, Data Privacy, Disruption, Supply Chain, Standardization, Scarcity, Complexity, Interoperability, Talent, Skillsets, Workforce, Diversity etc.

Content Chain: Connect, Store, Support

The latest developments in the activities that enable the BaM Content Chain® – moving/delivering content; storing content throughout its lifecycle; monitoring, testing, communicating and running facilities.

Big issue: Next Gen Talent

How is our industry supporting the drive to attract new talent and increase workplace diversity and inclusion?

Sustainability: what progress is our industry making?

Deadline for finished materials: 8 June

———————————————————————————————————————————–

September 2024 – Q3

Main theme: Efficiency & Sustainability

Editorial focus: We’ll cover tech and activities that boost efficiency. This includes automation, intelligence, supply chain management, cloud-based workflows etc. We’ll also look at sustainability in MediaTech.

Keywords: Speed, Supply Chain, Data, Intelligence, Cloud, SaaS, Pricing, Automation, Insourcing, Outsourcing, End-to-End, Best-of-Breed, CAPEX, OPEX, Sustainability, etc.

Content Chain: Manage, Publish

The latest developments in managing and preparing completed content and its publication, including playout of linear and non-linear content; orchestrating the workflow and resources required.

Big issue: Sustainability

Our home is under threat; what is our industry doing to protect our beautiful but fragile planet?

Deadline for finished materials: 24 August

——————————————————————————————————-

December 2024 – Q4

Main theme: Economics & Democratization

Editorial focus: We’ll cover tech and activities that boost revenue generation. This includes revenue creation and differentiation. We’ll also look at convergence between media, MediaTech and adjacent sectors, facilitated by the democratization of technology.

Keywords: Direct-to-Consumer, Business Models, Economics, Revenue-Generating, R&D Investment, Convergence, Democratization, Parallel Markets, Gaming, Consumer Experiences etc.

Content Chain: Monetize, Consume

The latest developments in managing rights and royalties, scheduling services, advertising, data aggregation and analysis. Enabling consumption of content on consumer-facing devices, apps, and platforms.

Big issue: Democratization

Our industry continues to look outwards to search for new growth. Where will this growth come from?

Sustainability: what progress is our industry making?

Deadline for finished materials: 8 November

——————————————————————————————————

Agama – From Reaction to Prevention: How Telcos Can Manage Video Quality Proactively

Timmy Langeveld, Business Development Manager, Agama

As the telecommunications landscape continues to evolve rapidly, telcos face a multitude of challenges. These challenges, if not addressed strategically, can compromise customer satisfaction, inflate operational expenses, and hamper operational efficiency. One of the primary pain points driving these difficulties is the management and delivery of high-quality video experiences to end users. Here, we will look at how a video quality of experience monitoring solution, like that offered by Agama Technologies, can address these critical issues.

The Pressing Challenges of Telcos

In a highly competitive market, telcos are battling on several fronts. Customer satisfaction, as measured by the Net Promoter Score (NPS), is a vital indicator of success. Yet, many telcos struggle to achieve high NPS due to the inconsistent quality of video experiences. Poor video quality has a direct impact on customer satisfaction, leading to a decreased NPS and, ultimately, lower customer retention.

Operational expenditure (OPEX) is another significant concern. High OPEX costs can erode profitability and create a vicious cycle of cost-cutting and service degradation. The costs of managing and troubleshooting video delivery issues, including customer service contacts, truck rolls, and device swaps, account for a significant portion of these expenses.

Moreover, the volume of customer contacts over video issues can become overwhelming. Each contact requires resources, both human and technical, and frequent device swaps and truck rolls only add to the logistical and financial strain. All of these factors contribute to a complex, costly, and less-than-optimal service delivery model.

The Power of Video Quality of Experience Observability

In this challenging scenario, a video quality of experience monitoring solution can be a lifeline. Agama, for example, offers such a solution allowing telcos to gain deep insights into the video delivery chain. The result is a powerful tool for identifying the root causes of customer issues.

At its core, Agama’s solution revolves around the concept of proactive problem-solving. It focuses on identifying and addressing issues before they escalate into larger problems that negatively impact the customer experience. By ensuring a consistent and high-quality video experience, it can significantly improve NPS.

Agama’s solution can also help to lower OPEX costs by reducing the number of video-related customer contacts. Fewer customer contacts mean fewer resources spent on troubleshooting and service calls. Similarly, by proactively identifying and resolving issues, the need for truck rolls and device swaps can be significantly reduced. This not only lowers direct costs but also minimizes disruptions to customers, thereby further increasing customer satisfaction.
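The mechanics of proactive monitoring can be illustrated with a toy example. The scoring model, weights, and thresholds below are invented for illustration only and are not Agama's actual methodology; a real QoE solution weighs far more signals:

```python
# Toy QoE scoring: flag degraded sessions before customers call in.
# All weights and thresholds here are illustrative only.

def qoe_score(buffering_ratio, avg_bitrate_kbps, startup_s):
    """Toy 0-100 score: penalize rebuffering, low bitrate, slow startup."""
    score = 100.0
    score -= min(buffering_ratio * 400, 50)            # rebuffering hurts most
    score -= max(0, (3000 - avg_bitrate_kbps) / 100)   # penalty below ~3 Mbps
    score -= min(startup_s * 5, 20)                    # slow startup
    return max(score, 0.0)

def sessions_to_investigate(sessions, threshold=70.0):
    """Session IDs whose score falls below the alert threshold."""
    return [s["id"] for s in sessions
            if qoe_score(s["buffering_ratio"],
                         s["avg_bitrate_kbps"],
                         s["startup_s"]) < threshold]

sessions = [
    {"id": "a1", "buffering_ratio": 0.01, "avg_bitrate_kbps": 4500, "startup_s": 1.2},
    {"id": "b2", "buffering_ratio": 0.12, "avg_bitrate_kbps": 1800, "startup_s": 4.0},
]
print(sessions_to_investigate(sessions))  # only the degraded session is flagged
```

Acting on the flagged sessions, rather than waiting for those customers to contact support, is precisely the shift from reaction to prevention.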

The Strategic Shift

Adopting a solution like Agama’s for monitoring the quality of the video experience represents a strategic shift for telcos. It’s about shifting from being reactive to being proactive, from managing problems to preventing them from happening. This change can result in a more efficient operational model, higher customer satisfaction, and ultimately, a stronger market position.

In conclusion, while telcos face significant challenges in delivering high-quality video experiences, these challenges can be overcome. By investing in a video quality of experience monitoring solution, telcos can address the root causes of their pain points and transform their operations. It’s a forward-thinking approach that promises not only to survive but also to thrive in today’s telco landscape. The future of telecommunications is bright, and with the right tools, telcos can lead the way.

Timmy Langeveld is a video streaming expert with a product management background and a decade of experience in the telecommunications industry. He is Agama’s Business Development Manager, ensuring that our operators get the most out of Agama’s solutions on a daily basis.

Vizrt – Cloud considerations: how to guarantee success when ‘going cloud’

Jon Raidel, Global Lead of Cloud Live Production, Vizrt

If you’ve been paying attention to the conversation around production rooms, “cloud” is everywhere, especially the benefits. But just because we talk about the benefits of going cloud, it doesn’t mean it’s a one-size-fits-all solution.

Different productions have different needs, from the size of what’s being captured to the size of the team working on it. Not only that, but transitioning to cloud can sometimes entail changing from hardware to software and adapting to a new way of doing things.

To make the leap to cloud production and have it truly work for you, there are some key considerations.

The economic challenges of cloud operating platforms

In traditional live productions, hardware is usually left ‘on’, even when not in use. The cloud offers the possibility of not leaving it ‘on’ for longer than you need for each production. Understanding that the meter is running, and turning off instances when not in use, is important for optimizing cloud infrastructure.

Being able to turn it off and on again as needed keeps costs in check. There is no doubt cloud production is a cheaper alternative when optimization is achieved.
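A quick back-of-envelope comparison shows why "turning the meter off" matters. All figures below (hourly rate, number of events, hours per event) are assumptions for illustration, not real cloud pricing:

```python
# Compare an always-on production stack with instances that run
# only for the duration of each production. Figures are illustrative.

def always_on_cost(hourly_rate, hours_per_year=24 * 365):
    """Cost of infrastructure that is never switched off."""
    return hourly_rate * hours_per_year

def per_event_cost(hourly_rate, events_per_year, hours_per_event):
    """Cost when instances exist only while a production is live."""
    return hourly_rate * events_per_year * hours_per_event

rate = 12.0  # assumed $/hour for the whole production stack
print(f"always-on: ${always_on_cost(rate):,.0f}/year")
print(f"on-demand: ${per_event_cost(rate, events_per_year=80, hours_per_event=6):,.0f}/year")
```

Even with generous per-event hours, the on-demand total is a small fraction of the always-on bill; the gap is the optimization the cloud makes possible.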

With the right provider, the process can be made easier – and quick, too. Recently, the European League of Football (ELF) decided to move to cloud live production for its 2023 season – in just three weeks. Who has ever built an on-prem control room from start to finish in that timeframe? The available technology, in this case Viz Now, makes it possible by automating the deployment of the tools in minutes, as and when needed – and makes it just as easy to turn everything off when production is finished.

For example, The European League of Football has its viewership on track to reach more than half a billion households worldwide, representing 17 teams in nine countries. So, when ELF made the switch from pre-recorded, post-produced games to cloud live production, it was paramount to get it right.

At the end of the day, the cost benefits are considerable. ELF’s production agency, novel.media, shared that costs were reduced by 10,000 euros a game – which equates to over 700,000 euros a season.

With increased value for sponsors, reduced production costs, and the ability to operate up to four games at once, ELF is optimizing its chances of reaching more fans and growing.

ELF’s move to the cloud only had to be made once, but the benefits continue.

The hidden costs of cloud: is it always better?

In my experience, the hidden costs of cloud have come in the form of lack of communication – from both sides.

Vendors can sometimes leave out details during surface-level planning that only come up later, in the architecture phase. It’s a vendor’s responsibility to make things as simple to understand as possible, and broad strokes are fine when first talking with a customer about a big move – but the details have to be filled in before the build begins.

On the other hand, customers may leave entire engineering or video departments out of the conversation until the system is built, meaning the workflow that was established doesn’t fit what the production actually needs. It has happened before: an entire process is constructed without the customer’s IT department involved, and when IT finally comes in, the project is stopped in its tracks.

Minor details can seem insignificant, but it’s important to factor in the newness of going cloud. While most broadcasters have experience with traditional on-prem setups, many are new to cloud and will naturally rely more on guidance from the vendor – and this is why it’s so important to flesh out all the ‘minor’ details upfront. This way, customers can avoid overpaying on professional services if it doesn’t work the first time.

Communication is the best tool to avoid hidden costs or mishaps. Understanding what a customer needs, so they can go cloud smoothly and successfully, also comes down to the details – even the ones we might not find that important.

The right partner for the right job

When listening to a vendor making big claims about cloud benefits, some broadcasters can get a little suspicious. I get it – it’s a lot to be promised at once. Lower carbon emissions, lower production costs, and a workflow that allows multiple productions to run at once can seem too good to be true.

So how can you see clearly through all the marketing and claims?

Researching and educating yourself on how and why going cloud has worked for productions similar to yours can make things clear. A customer is a company’s biggest marketing tool – and stories don’t lie. Search for customers who use the product or solution that you were recommended, and don’t hesitate to reach out to users too.

It’s also true that each vendor is on their own development timeline with cloud, and licensing comes to mind. Every vendor does it differently, not because they are being difficult, but due to a philosophy or infrastructure put in place. A broadcaster may love the idea of using cloud and wants to pay a 12-month licence, another may want to be charged per use. Who’s right? The answer is both! But each vendor is getting to those destinations at different times.

It’s important to work with a vendor that is invested in understanding the specifics of your workflow and production, preferably one with a highly rated, well-respected 24/7 customer success and support team. As a trusted advisor and partner on your journey to cloud, the right vendor – bringing careful consideration of all the details, the right expertise, and the right technology – will make sure your project succeeds and you see the benefits.

VisualOn has released its universal content adaptive encoding solution – VisualOn Optimizer

Reduces streaming costs and enhances viewing experience without impacting deployed infrastructure

Intigral, the media arm of STC Group and a leading provider of digital entertainment and sports solutions in the Middle East and North Africa region, has selected VisualOn Optimizer to optimize CDN bandwidth and storage costs while maintaining and improving end-user video quality for its VoD network.

“Video streaming can be extremely bandwidth-intensive and how to save the bandwidth cost is a major topic. VisualOn optimizes the bitrate of VoD video content, without reducing video quality. The solution enabled savings up to 70% on bandwidth and CDN storage, while maintaining the video quality,” said Eyad Al Dwaik, Director Engineering Operations at Intigral.

VisualOn Optimizer is the industry’s first Universal CAE solution that can be easily integrated with your current delivery systems, Live or VoD, cloud or on-prem, without changing your transcoders or workflows. It efficiently analyzes the content and determines the best transcoder settings for achieving the target quality:

  • Achieves 40%+ bitrate reduction while maintaining or improving video quality, as measured by VMAF score.
  • Supports VoD workflows with computationally efficient single-pass encoding.
  • Supports Live workflows in real time with zero added latency or CPU load.
  • Is compatible with any software or hardware transcoder, and fully compliant with industry standards with no interoperability issues.
  • Adds no additional CPU load or delay to an existing transcoding farm.

Yang Cai, President & CEO at VisualOn, added: “We are proud to extend our partnership with a global digital entertainment provider such as Intigral to provide a remarkable experience for their end-users across all devices while reducing streaming costs. Reducing delivery costs while maintaining or improving viewing quality is crucial for video delivery service providers to maximize profits in a highly competitive market.”

VisualOn Optimizer is now being experienced by customers worldwide. Please contact us at sales@visualon.com to learn more.

Veset – How cloud playout is helping to ease the complexity of playout

Gatis Gailis, CEO, Veset

I think most people would agree that we have seen more change in the broadcast industry over the past couple of years than we have for a long time. The accelerated shift to the cloud and the transition to more ad-funded services than ever before, coupled with an evolution in consumption trends, are all having an impact throughout the entire industry, changing the way content is produced, managed, and distributed.

This is also causing significant challenges and complexities specifically for playout for a number of reasons.

Not all playout is cloud-native

It is perhaps because the transition to the cloud was so drastically accelerated that many broadcasters have moved to only a partial level of cloud playout. A quick-fix approach made a lot of sense in the interim for getting cloud-based access and workflows in place fast, ensuring broadcast could continue in uncertain times. However, many broadcasters are finding that their current hybrid cloud workflows are not yet meeting their needs.

Many of these deployments are virtualized, allowing broadcasters remote access to on-premise hardware – a valuable way to make a gradual transition to the cloud. However, some broadcasters are finding virtualized solutions restrictive: they can be difficult to integrate seamlessly with existing parts of the workflow, leaving a clunky experience, and they may never fully exploit the potential of the cloud.

Cloud-native solutions, on the other hand, are designed to be scalable and flexible. They use microservices and deployment tools such as Amazon ECS, Istio, and Kubernetes to enable automatic scaling to meet demand and high availability. This makes them a better option for broadcasters who need to deliver high-quality content to a wide audience.
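As a concrete illustration of "automatic scaling to meet demand", the core rule of the Kubernetes Horizontal Pod Autoscaler – desired replicas = ceil(current replicas × current metric / target metric) – can be sketched in a few lines. This is simplified; the real controller adds stabilization windows and per-metric aggregation:

```python
import math

def desired_replicas(current, current_util_pct, target_util_pct=60,
                     min_replicas=1, max_replicas=20):
    """Scale replica count so average utilization approaches the target,
    clamped to the configured min/max (mirrors the HPA core formula)."""
    desired = math.ceil(current * current_util_pct / target_util_pct)
    return max(min_replicas, min(desired, max_replicas))

print(desired_replicas(4, 90))   # load spike -> scale out to 6
print(desired_replicas(6, 20))   # quiet period -> scale in to 2
```

The same rule scales in after a broadcast peak, which is what keeps a cloud-native playout both highly available and cost-efficient.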

Lower spend is forcing more broadcasters to go ad-funded

It is becoming harder for broadcasters to continue with subscription models alone. As consumers continue to cut spending, shifting to an ad-funded approach has become a necessity for many to retain viewers. However, adding ads into your playout brings a new layer of complexity for a number of reasons:

Allowing for regional and platform variations

One of the biggest reasons FAST adoption has been slower in Europe is that the massive variation between regions makes it less cost-effective than in markets where variations are fewer. Different versions need to be created based on language, compliance with different regulations, and even preferences – for particular actors, music tastes and so on. Ensuring the ads are appropriate for each region requires a great deal of often-manual effort to review each ad and make the necessary adjustments.

In some cases, this can also change depending on the platform. For example, some ads may be ok to display in a traditional broadcast environment, but may not be cleared for OTT.

Managing these variations requires the right processes, rigorous checks, and the presence of metadata and SCTE markers to make sure the right ads are always displayed for the right region and platform. With those properly in place, however, the process should be relatively smooth.
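A minimal sketch of metadata-driven selection makes the idea concrete. The ad IDs, regions, and clearance sets below are invented for illustration; real systems drive this from rights and compliance metadata:

```python
# Only schedule an ad if it is cleared for both the viewer's region
# and the delivery platform; uncleared combinations never play out.

ADS = [
    {"id": "ad-001", "regions": {"DE", "AT"}, "platforms": {"broadcast", "ott"}},
    {"id": "ad-002", "regions": {"FR"},       "platforms": {"broadcast"}},
]

def eligible_ads(region, platform, ads=ADS):
    """Ads cleared for this region AND this platform."""
    return [a["id"] for a in ads
            if region in a["regions"] and platform in a["platforms"]]

print(eligible_ads("DE", "ott"))   # ad-001 is cleared for German OTT
print(eligible_ads("FR", "ott"))   # ad-002 is broadcast-only, so nothing plays
```

The "cleared for broadcast but not OTT" case mentioned above falls out naturally: the same ad record simply lacks the `ott` entry in its platform set.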

Personalization is key to boosting revenue

We all know that personalization is extremely beneficial in broadcast, helping consumers get quick and easy access to the content they are most likely to engage with. The same is of course true with ads, not only making consumers more likely to respond to ads they are served, but also greatly increasing value for advertisers. With so many ad-funded services launching, being able to attract advertisers is in itself challenging, meaning broadcasters can likely not afford to skip ad personalization.

However, in a playout environment, this is not always easy, especially for broadcasters battling with a hybrid cloud environment. Broadcasters are increasingly looking for ways to build channels and insert personalized ads, entirely in the cloud. They need to be able to create rules so that they can push personalized ads based on a number of viewer metrics.

Ensuring the right amount and precise timing can be a challenge

How many ads is too many? The answer depends on the type of ad, the type of channel content, and the intended audience. Some audiences are much more receptive to ads than others and consumers are generally more receptive if it gives them free access to premium content, such as live sporting action or blockbuster movies, for example.

At the same time, even the fundamentals of ensuring precise timing can be challenging for playout and getting it wrong leaves you with gaps or overlapping content, neither of which is optimal. Using SCTE markers and ensuring the right level of metadata is the key to resolving this.
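The timing problem can be stated as a toy puzzle: given a fixed break length (signalled, for example, by an SCTE-35 marker), choose ads whose durations sum to exactly that length. The greedy sketch below is illustrative only – it can fail even when an exact fill exists, which is one reason real schedulers are more sophisticated:

```python
def fill_break(break_s, ad_durations):
    """Greedily pick ad durations that exactly fill the break.
    Returns the chosen durations, or None if this greedy pass
    would leave a gap (overlaps are never allowed)."""
    chosen, remaining = [], break_s
    for d in sorted(ad_durations, reverse=True):
        if d <= remaining:
            chosen.append(d)
            remaining -= d
        if remaining == 0:
            return chosen
    return None  # schedule needs adjusting

print(fill_break(90, [30, 30, 20, 15, 10]))  # exact fill: [30, 30, 20, 10]
print(fill_break(25, [30, 30]))              # nothing fits: None
```

Either failure mode – a `None` (gap) or an over-long selection (overlap) – is exactly the on-air problem the metadata and markers exist to prevent.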

Consumption habits change fast

With every new generation come new consumption habits, and they seem to be forever changing. It has been well documented that younger audiences are searching for shorter, more snackable content – but consuming lots of it. How can broadcasters delivering linear channels keep up with those trends to ensure they are building channels of relevant, engaging content?

Again, this comes down to knowing your audience first. Understanding the user profile and the types of content most likely to appeal is the first step in serving the right content choices to keep viewers engaged. It is also important to evaluate continually so that your channel offering can change as habits change.

Keeping channels fresh means you need maximum flexibility to amend programme schedules quickly and easily as those needs change. This is much easier to accomplish for those broadcasters already hosting their playout fully in the cloud.

Navigating playout complexities

As with many parts of the broadcast workflow, playout has become more complex over recent years. Broadcasters are navigating a number of challenges while working to retain viewers and add value to advertisers in an increasingly crowded marketplace. However, it doesn’t need to be that way. There is a plethora of tools out there designed specifically to make it much simpler to create and distribute linear channels – from managing the schedule to inserting and targeting ads to distributing channels wherever they will be watched.

At the same time, moving to a fully cloud-native workflow makes it even easier for broadcasters to be more agile – scaling quickly, adapting their channels, and serving varied content across multiple platforms and regions – and, ultimately, more profitable.

Veritone – So, Your Content Is Finished…Now What?

AI’s influence in media: managing and monetizing completed content

Paul Cramer, Managing Director, Broadcast Solutions at Veritone, Inc.

Media has come a long way from its traditional production journey. The advent of artificial intelligence (AI) has revolutionized the previously linear path of content production, transforming the process by creating new efficiencies and allowing content to have a second life beyond its initial creation and broadcast.

With AI’s robust capabilities in tagging, managing, and preparing content, production teams can now maximize content usage while optimizing resources, creating a more reliable flow of content even in times of high demand or disruption. In this article, I’ll delve into the evolving media ecosystem, highlighting the role of AI in content management, monetization, and the industry’s future.

The evolution of media

In the past, completed content was confined to one-time use, primarily destined for theaters or television. It would then be relegated to physical archives or trotted out for reruns or home video. Later, with the advent of technologies such as the cloud and streaming, content could more easily get a second life. But this process often neglected underused content that rights holders may not even have known was in their archives, as well as additional footage – from alternate takes to B-roll – that can have a myriad of uses for those who license content.

Now, with the integration of AI, the workflow has shifted to a more dynamic process. Content can be mined from archives by automating the tedious, resource-heavy, and error-prone process of metadata tagging and management. Not only can content be tagged with relevant metadata at the point of ingest, but older archival content and unused footage can be surfaced via the same technology, opening up a myriad of new possibilities.
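In pseudocode terms, tagging at ingest might look like the sketch below. `mock_tagger` is a deliberately trivial stand-in for an AI service (real systems run speech-to-text, face and object recognition, and so on); every name here is hypothetical:

```python
def mock_tagger(asset):
    """Trivial stand-in for an AI tagging service."""
    tags = set()
    if "interview" in asset["title"].lower():
        tags.add("interview")
    if asset["duration_s"] < 120:
        tags.add("short-form")
    return tags

def ingest(asset, archive):
    """Tag at the point of ingest so the asset is searchable later."""
    asset["tags"] = mock_tagger(asset)
    archive.append(asset)

archive = []
ingest({"title": "Post-game interview", "duration_s": 95}, archive)
ingest({"title": "Full match B-roll", "duration_s": 5400}, archive)
print([(a["title"], sorted(a["tags"])) for a in archive])
```

Running the same pipeline over newly digitized archive material is what surfaces the older footage the paragraph above describes.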

The changing landscape of content management

Traditionally, content creation was constrained by limited use, hindering its potential reach and impact. However, with AI’s ascension, media can now leverage completed content in a variety of additional ways and formats, reaching broader audiences, creating better customer experiences, and generating more business value.

By leveraging AI-powered digital asset management and creative tools, production teams can optimize their content usage by strategically recycling, renewing, or recreating it. The ability to localize content further increases its appeal to diverse audiences, initiating the cycle anew in a managed and monetized manner.

And with this increased level of optimization, teams can focus on the creative aspects of content production as well as distribution strategy, enhancing overall productivity and content revenue—and preventing unnecessary burnout.

Another example of how AI maximizes content usage can be seen through the automation of promos to a larger scale with the help of generative AI. Generative AI technology can automate or streamline the creation of promos, trailers, and teasers, helping production teams to create these assets on a larger scale while also catering to diverse platforms and audiences, and enhancing the content’s promotional reach.

Much like content creation, the landscape of content management has changed drastically from its traditional workflow, making way for a new avenue for keeping up with modern content demands and creating more opportunities for accessing and repurposing content in the future.

How AI-powered workflows play out in real life

It’s easy to get lost in the hypothetical when it comes to incorporating AI into content workflows, so it’s helpful to look at some high-profile use cases. One success story follows the San Francisco Giants and the club’s legacy project, which was initiated in 2020 in an effort to share content with fans during pandemic lockdowns, a time when little to no new baseball content was being produced.

With the help of an AI-powered digital asset management tool, the Giants were able to automate the process of tagging recently digitized analog content from the past 60 years—a process the club estimated would take 15 interns an entire year to complete. This digital asset management tool enabled the team to quickly tag, manage and access all of its branded content in a fraction of the time, allowing the club’s production team to start creating content for fans during a time it was needed most.

Monetizing completed content

The rise of streaming platforms has breathed new life into old content, giving it a second chance to reach audiences. Streaming’s global reach and on-demand nature have provided a new revenue stream for content creators and distributors—one that has surged in popularity and need since 2020. But with that increased demand comes the need to better organize, discover, and monetize a greater cross-section of content archives than ever before.

Disruptions such as the COVID-19 pandemic and the Writers Guild of America strike have caused major setbacks in content production and distribution. However, AI technologies have offered innovative solutions to address these challenges, ensuring continuity in content delivery.

AI-driven automation and remote collaboration tools have allowed production teams to continue their work despite physical barriers, showcasing AI’s resilience in the face of unforeseen obstacles.

In an era in which content can have a prolonged life cycle, strategic planning becomes essential. Content creators must focus not only on old content and reruns but also explore their archives for unreleased or unseen material that can still captivate audiences while providing new monetization opportunities. By thoughtfully planning the release and licensing of content, creators can maximize revenue and audience engagement, transforming seemingly outdated content into valuable assets.

Unlocking hidden gems: leveraging archived footage

Content archives can hold hidden gems that have never been seen or released before. By tapping into these archives and curating fresh content, broadcasters can provide audiences with unique and exciting viewing experiences.

Some options include:

  • Legacy footage: As mentioned in the San Francisco Giants case study, archived footage can be accessed to create legacy projects, introducing past footage to new audiences.
  • Licensing opportunities: Once content goes live, licensing it for distribution offers additional revenue streams. AI-driven platforms can assist in identifying potential licensing opportunities and efficiently managing licensing agreements.
  • Creating a content marketplace: Establishing a marketplace for completed content, clips, and B-roll footage allows content creators to monetize assets that might not have been used in the primary production. AI can help manage and organize these resources, ensuring a smooth exchange between content buyers and sellers.

However, in order for content archives to work in an organization’s favor, the content must be tagged well and made easily searchable and accessible.

Ensuring efficient content retrieval

The success of content management relies on effective archiving systems. AI-driven tools enable intelligent content tagging and categorization, simplifying content retrieval and usage. In a fast-paced media landscape, quick content retrieval is crucial. AI’s ability to search and locate specific footage within vast archives reduces downtime and maximizes productivity.
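One common way such retrieval works under the hood is an inverted index: a map from each tag to the asset IDs carrying it, so "all clips tagged X and Y" becomes a set intersection. A minimal, hypothetical sketch:

```python
from collections import defaultdict

def build_index(assets):
    """Map each tag to the set of asset IDs carrying it."""
    index = defaultdict(set)
    for asset in assets:
        for tag in asset["tags"]:
            index[tag].add(asset["id"])
    return index

def search(index, *tags):
    """Assets carrying every requested tag."""
    sets = [index.get(t, set()) for t in tags]
    return sorted(set.intersection(*sets)) if sets else []

assets = [
    {"id": "clip-1", "tags": {"goal", "celebration", "2020"}},
    {"id": "clip-2", "tags": {"goal", "2020"}},
    {"id": "clip-3", "tags": {"interview"}},
]
index = build_index(assets)
print(search(index, "goal", "2020"))  # ['clip-1', 'clip-2']
```

The lookup cost no longer depends on how large the archive is, only on how many assets share the requested tags – which is why good tagging makes retrieval fast at scale.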

While AI presents immense possibilities, it is imperative to ensure that the solutions implemented are user-friendly and operable. Streamlining AI integration ensures that production teams can fully leverage AI’s potential without facing unnecessary complexities.

Closing thoughts

The evolution of media with the integration of AI has resulted in a transformative shift in content management and distribution. AI’s capabilities in optimizing resources, automating processes, and ensuring efficient content retrieval have reshaped the industry.

As AI continues to advance, it will play an increasingly critical role in the media industry. AI-powered systems will likely become more sophisticated, enhancing content discovery, audience engagement, and revenue generation.

The synergy between AI and media has unlocked endless possibilities, enabling content creators to reimagine their approach to production, distribution, and monetization. By leveraging AI technologies effectively, production teams can navigate the ever-evolving media landscape and deliver captivating content to audiences worldwide.

Varnish Software – Making streaming more sustainable: three effective methods for achieving more with less

Adrian Herrera, CMO, Varnish Software

Streaming might be our favorite pastime, but beneath the surface, it’s a colossal energy-guzzling process that’s taking a toll on our planet.

Today, the average consumer worldwide spends about 19 hours a week streaming video – and for some the figure is much, much higher. With a population of more than 742 million, Europeans could have streamed more than 733 billion hours – or around 83 million years – of content in 2022 alone!

To put this into perspective, every hour of video streamed emits roughly 55g of CO2e. That puts Europeans’ streaming habits at roughly 40.3 million metric tons of CO2e in just one year – the equivalent of driving 210 billion km, given that the average gas-powered car emits 192g of CO2e per km.
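The headline figures can be reproduced with back-of-envelope arithmetic from the per-hour and per-km factors quoted above; a quick sketch (results land within rounding of the article's numbers):

```python
# Back-of-envelope check of the figures quoted above.
population = 742_200_000          # Europeans
hours_per_week = 19               # average streaming time per consumer
hours_per_year = population * hours_per_week * 52

grams_per_hour = 55               # g CO2e per streamed hour
total_co2e_tonnes = hours_per_year * grams_per_hour / 1e6  # grams -> metric tons

grams_per_km = 192                # average gas-powered car, g CO2e per km
equivalent_km = hours_per_year * grams_per_hour / grams_per_km

print(f"{hours_per_year / 1e9:.0f} billion hours streamed per year")
print(f"{total_co2e_tonnes / 1e6:.1f} million metric tons CO2e")
print(f"{equivalent_km / 1e9:.0f} billion km driven equivalent")
```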

The truth is, every hour of video streamed pushes a vast amount of tech and hardware into overdrive – data centers, servers, networking gear – you name it. This means that, as our time spent streaming grows, so does our carbon footprint. What’s more, video files are becoming increasingly data-heavy, meaning the amount of energy – and additional hardware – demanded for streaming could grow sizably without corrective action.

While consumers want increasingly data-rich experiences and won’t tolerate subpar quality, they also want to support the environment. The industry is taking notice: virtually every big conference or major event, like IBC, is now setting its sights on the streaming sustainability crisis at hand.

Winning “the streaming wars” is no longer just about content – it’s about sustainability and conservation too. To address this, CSPs, video service providers and streamers must implement more energy-efficient methods of streaming, without sacrificing viewing quality. Here are three ways to do so.

Reduce hardware needs with caching

With the 4K market growing at a CAGR of 21.7%, the technology has finally become commonplace, and many are already strategizing how to support wider use cases of 8K and beyond. Yet streaming 4K alone requires a bitrate roughly five times that of HD.

With richer resolutions and experiences come greater challenges, especially when providers must scale content delivery at a moment’s notice to meet the needs of thousands, if not millions, of viewers tuning in from different locations and device types.

Even as demand for richer experiences increases, consumer expectations of zero downtime and near-zero latency will remain the same. Many therefore throw hardware at the problem of delivering higher-quality content, piling on additional servers to absorb rapidly escalating loads. This doesn’t work long term.

More hardware requires more energy – equating not only to more carbon emissions, but also a larger power bill that will increase exponentially if inefficiencies aren’t tackled at the root.

Rather than spending on unnecessary hardware, organizations should first focus on optimizing their existing equipment to get the most throughput possible out of every watt consumed. Caching is one way to do this.

Caching is essentially a form of digital recycling, which involves reusing previously fetched or computed data for future requests instead of sending the same requests back to an origin server miles away. This reduces unnecessary processing, energy usage, and strain on network resources.

Strategic approaches and technologies for content delivery can drastically enhance throughput and offer more sophisticated control over caching.

For instance, many organizations can use free, open-source or pre-packaged tools to specify an ideal time-to-live (TTL) for cached items, or to update and purge items from a cache rapidly as content is modified. Fine-tuning traffic routing and load balancing can also ensure content is always delivered from the best possible location.
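As an illustration of the TTL and purge mechanics described above, here is a minimal, hypothetical cache sketch – not any particular product's API, just the core idea of serving repeated requests locally instead of returning to origin:

```python
import time

class TTLCache:
    """Minimal cache sketch: serve repeated requests locally until the
    item's time-to-live expires, instead of re-fetching from origin."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}          # key -> (value, expiry timestamp)
        self.origin_fetches = 0  # how often we had to go back to origin

    def get(self, key, fetch_from_origin):
        value, expires = self.store.get(key, (None, 0.0))
        if time.monotonic() < expires:
            return value                       # cache hit: no origin trip
        self.origin_fetches += 1
        value = fetch_from_origin(key)         # cache miss: fetch and store
        self.store[key] = (value, time.monotonic() + self.ttl)
        return value

    def purge(self, key):
        """Drop an item immediately, e.g. when content is modified."""
        self.store.pop(key, None)

cache = TTLCache(ttl_seconds=60)
origin = lambda key: f"<video segment {key}>"   # stand-in for an origin server
for _ in range(1000):
    cache.get("segment-42", origin)
print(cache.origin_fetches)  # 1 -- the other 999 requests never left the cache
```

The energy argument is visible in the counter: 1,000 viewer requests cost one origin round trip, so processing and network strain scale with unique content, not with audience size.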

Migrate to the edge

As data travels farther distances, it consumes more energy and will be delivered with more latency. Therefore, content delivery at the edge is becoming more vital to optimize power usage, performance and costs.

By utilizing the edge, data can be stored and processed closer to consumers, reducing the distance data travels across networks, while adding scalability to content delivery. Utilizing the edge properly via caching also reduces dependence on core data centers and other energy-consuming infrastructure.
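As a toy illustration of edge selection, the sketch below routes a viewer to the geographically nearest of a few hypothetical edge locations. Real CDNs weigh measured latency, load and cost rather than pure distance, and the node names here are assumptions:

```python
import math

# Hypothetical edge locations (latitude, longitude) -- illustrative only.
edge_nodes = {
    "stockholm": (59.33, 18.06),
    "frankfurt": (50.11, 8.68),
    "madrid": (40.42, -3.70),
}

def distance_km(a, b):
    """Great-circle distance via the haversine formula."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_edge(viewer):
    """Pick the edge node with the shortest path to the viewer."""
    return min(edge_nodes, key=lambda n: distance_km(viewer, edge_nodes[n]))

print(nearest_edge((48.85, 2.35)))  # frankfurt -- a viewer in Paris
```

Shorter paths are the whole point: less distance means fewer network hops doing work per delivered byte, which is where the energy and latency savings come from.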

As 5G adoption increases, edge-based content delivery solutions will continue to evolve to meet complex requirements on IoT devices and other applications. For example, initiatives like the EBU 5G-EMERGE project are combining satellite communications with advanced web caching and content delivery at the edge. Unique approaches like this can provide greater capacity, scalability and reach, and have the potential to be cleaner than current distribution methods; however, measurement would be needed to prove any benefits.

Use software to optimize

Maximizing throughput per server is the key to delivering more with less.

Updating to the latest hardware can help reach this goal, but that is costly and may not perform as expected if it is matched with the wrong architecture. Moreover, replacing hardware may result in unnecessary e-waste from discarded equipment, and it is not a scalable approach.

Software-defined solutions are almost always worth exploring to optimize new and old systems for the best input/output (I/O) throughput possible. Software can also be implemented quickly with relatively low upfront investment, improving energy efficiency today and performance long term.

Organizations all have a different combination of infrastructure, which could consist of bare metal, containerized, virtualized or cloud environments that all require a different approach. Therefore, software stacks must be optimized and packaged for the exact needs of the infrastructure at hand, or the use cases involved, such as streaming video live or on-demand, web and API acceleration or private CDN deployments.

For example, when organizations stream video, they must usually process large amounts of data that runs through cloud, data centers and other multiprocessor systems. In these instances, utilizing non-uniform memory access (NUMA) can allow multiple processors to use the same I/O resources and memory locally. This can provide many benefits, including reduced data replication needs, faster data flow with less latency, and more scalable and responsive architecture overall.
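A simplified illustration of the NUMA-aware idea above: keep each worker's threads on a single node's cores so its memory allocations stay local instead of crossing the interconnect. The topology map below is an assumption (on Linux the real layout can be read from /sys/devices/system/node), and the pinning call is Linux-specific:

```python
import os

# Hypothetical NUMA topology: node -> CPU core IDs. On a real Linux host
# this map lives under /sys/devices/system/node/node*/cpulist.
numa_nodes = {0: [0, 1, 2, 3], 1: [4, 5, 6, 7]}

def cores_for_worker(worker_id, nodes=numa_nodes):
    """Assign each worker all the cores of exactly one node, round-robin,
    so its threads and their memory stay node-local."""
    node = worker_id % len(nodes)
    return set(nodes[node])

def pin_current_process(worker_id):
    # Linux-only: restrict this process to its assigned node's cores.
    os.sched_setaffinity(0, cores_for_worker(worker_id))

print(cores_for_worker(0))  # {0, 1, 2, 3}
print(cores_for_worker(1))  # {4, 5, 6, 7}
```

The partitioning is the interesting part: once workers never straddle nodes, each one reads and writes memory attached to its own processor, which is what delivers the reduced replication and lower latency mentioned above.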

NUMA is just one example of a plethora of ways in which software can help extend the lifespan and efficiency of any hardware.

Making streaming greener

At this stage, it’s still difficult to fully quantify the environmental impacts of streaming. But, as digital media continues to evolve, it will become more important than ever to ensure the technology powering and delivering our favorite content is more sustainable.

The industry recognizes this and is already taking action. Organizations such as Greening of Streaming are bringing global industry players of all sizes in media, government and technology together to develop common methods of measuring and addressing the impact of streaming on the environment.

Streaming businesses have a clear-cut mission at hand: review their existing systems and design a blueprint to accomplish greater efficiency with fewer resources. The future of streaming relies on their ability to turn the tide.

 

Three Media – Driving towards business agility

David Radoczy, COO, Three Media

I think it is safe to start from the assumption that every media business is moving from a smokestack approach – a production line of bespoke, application specific devices – to a software-defined, cloud smart architecture. This will include large elements of intelligent automation, eliminating the mundane to let people concentrate on where they generate real business value.

We are all more or less pleased by this.

But the move to IP and all it enables – seamless global connectivity, easy integration, virtualised platforms – is not the whole opportunity; it is an enabler. Business agility depends on fresh thinking and a change in the DNA of many of our operating assumptions. This is much more than just another technology refresh. It is about how to manage and continually evolve the critical operating parts of your business – technology, operations, vendor relations, financial structures and customer offerings – not as unique independent concerns but as five interdependent, highly connected levers.

The shift from capex to opex is frequently cited as one of the major changes the software revolution has brought about. The benefit here is not just in how CFOs treat the expense: consider the in-sourcing or outsourcing, hosting options and infrastructure, software maintenance and new levels of vendor engagement this can allow. Then consider how all of this is bundled up for a customer proposal, and the flexibility and service options that become available. You can get the functionality and capacity you need, when you need it, and only pay for what you use. You can scale up or down as required – a virtualised operating model.

A simple example: in the past, a post house might have standardised on one vendor’s edit software. Now, if three different edit applications are appropriate for the productions going through your facility, license those three applications. Turn them on and off as you need them. It is as easy as that if you have shifted your operating structures and have your core services in place, geared for an IP- and cloud-enabled operating model. Add the right orchestration service and data flow design and management – it is not traditional, but it is real operating agility.

Self service

The nature of legacy technology made agility impossible. You need an SDI router, so you identify the vendors with the right number of inputs and outputs and the right resilience. Then you forget about them for years, until you need a new router or need to expand. There is no opportunity to scale back and cut costs or commitment.

In this new world of picking applications as we need them, the vendor relationship is transformed. At home, we are perfectly comfortable with a self-service model. If we need an application, we find it, download it and license it.

We can do this in the professional environment, too. Plenty of vendors now offer flexible licensing, which is the foundation of business agility – but in a commercial and operating context, you have to think slightly differently.

Imagine winning the contract to post-produce a nature series, to be shot over two years in multiple locations across four seasons. You need a powerful production asset management system, so you identify suitable software, download it and pay for it. You will need technical expertise – but do you employ staff or take advantage of the vendor’s professional services? Consider this across multiple clients, each with a tailored solution. Consider scaling from five edit seats this week to 500 next. How do you redefine your business and key value proposition?

The relationship between vendor and media company has to become agile. Vendors have to shift from securing large contracts to making functionality available through self-service portals, and providing reassurance that software tools will be supported in new and innovative ways on terms that help us all grow.

This isn’t revolutionary. Vendors are already realising that the future lies in core platforms and flexible licensing, but it is an operating approach that needs to be pushed to the boundaries of what it can enable.

All these opportunities are out there, ready to be exploited, today. What is the right operational and commercial model for your business, for your productions, for your clients? Defining that means a lot of lateral thinking, and a conscious move away from “the way we’ve always done it”. It means continual reinvention in order to compete at the leading edge of our industry.

“The cloud”

Over the last few years there has been much talk about “the cloud”. The result is a mindset that every facility will hand its technology platform over to AWS or Azure, in exchange for a hefty subscription.

That is not the best way to think about it. The underlying tenet of a cloud-enabled operation is not about where it is hosted. An IP-connected, software-defined media architecture can exist on hardware in your machine room or the public cloud.

The point of this new connectivity and architecture is that you connect into the content and functionality wherever and whenever you need it. The technology, services and systems needed are being commoditised, and pricing structures and solutions are being transformed.

To be truly agile you need to define a technology stack where you invest in core, critical, business-defining technologies – the rest is treated as a commodity. If you are a cloud-enabled business, don’t allow how you have done it to define how you are going to do it. Don’t just migrate: blue-sky your optimal operating solution, transform your workflows and processes, and leverage the cloud opportunity.

Consider web3 (as we will web4 and web5) and the new service principles it introduces – decentralised architectures, data ownership and control, and ‘trustless’ web exchanges. This is another game-changer, but if you only see the one-off cost of a server against a monthly subscription, you may be missing something your competitors aren’t.

Consider a European linear and OTT broadcaster buying a US drama series. Rights negotiations are complex, settlement and payments unfold over time, content is delivered on tape, and facilities need to comply with onerous security protocols. The broadcaster stores its own copy, with all the added complexity and cost of media management.

With web3, the broadcaster would never need to host the content: subscribers access it from the content producer’s web3 storage cloud, as does the broadcaster. Access is via a token – an NFT – and the contract is retained on a blockchain to which all parties have access, and where all rights windows, runs and viewing figures are consolidated. Payments are tracked; access is denied once the terms of the licence expire and reopened on renewal. Scratch the surface a little deeper and new operating model opportunities and commercial benefits start to gain critical mass.
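As a purely conceptual sketch of the licence-window enforcement imagined here – every field name is illustrative, not any real on-chain standard – the access rule amounts to a simple check against the contract terms:

```python
from datetime import datetime, timezone

# Conceptual licence record of the kind the article imagines being held
# on-chain. Field names and values are illustrative assumptions.
licence = {
    "token_id": "drama-series-s01",
    "licensee": "broadcaster-eu",
    "window_start": datetime(2024, 1, 1, tzinfo=timezone.utc),
    "window_end": datetime(2024, 12, 31, tzinfo=timezone.utc),
    "runs_allowed": 4,
    "runs_used": 0,
}

def access_granted(licence, holder, now):
    """Grant playback only while the rights window is open, the caller
    holds the token, and contracted runs remain."""
    return (holder == licence["licensee"]
            and licence["window_start"] <= now <= licence["window_end"]
            and licence["runs_used"] < licence["runs_allowed"])

print(access_granted(licence, "broadcaster-eu",
                     datetime(2024, 6, 1, tzinfo=timezone.utc)))   # True: inside window
print(access_granted(licence, "broadcaster-eu",
                     datetime(2025, 2, 1, tzinfo=timezone.utc)))   # False: window expired
```

Because every party evaluates the same shared record, nobody needs to trust a counterparty's private rights database – which is the ‘trustless’ property the article points to.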

These are just some glimpses of the way that media businesses can be transformed by seizing on the inherent agility of the technology that is available today. The technology is just the enabler. Becoming, and staying, a successful media business depends upon agility, on finding the best, the most efficient way to serve your clients and audiences. The technology is ready: to seize the opportunities we just have to think different.