Planetcast International – Looking to the cloud for playout disaster recovery that makes sense

Venugopal Iyengar

COO Digital, Planetcast International


A number of headlines in the press over the last year have spotlighted the dangers of playing fast and loose with video playout Disaster Recovery (DR) protection. But providing the protection broadcasters and other media services need is not a simple – or inexpensive – task.

Video content today is delivered through playout platforms across DTV, cable, satellite and, increasingly, over-the-top (OTT) services to billions of viewers. The requirements, and increasingly the outright demands, of audiences using an array of devices and platforms mean that modern playout systems are rapidly growing more sophisticated. Today, playout is a multi-staged process linking a number of systems, including media asset management (MAM) for storing and preparing content, the playout core and distribution platforms.

These connected systems handle various stages, including compression, format conversion and layering of dynamic overlay content, to deliver engaging viewer experiences with the expectation of unparalleled reliability. They are the bedrock on which modern broadcast and linear OTT channels have built their ability to reach and maintain their audiences.

Disruption is not always avoidable

Despite high levels of resiliency, it is a fact of life that playout systems fail. Disasters such as fire, flood, or the accidental cutting of power or critical cabling can and do cause outages.

For instance, in October 2021, multiple UK channels suffered major disruptions in playout services due to a fire-related incident at a central data centre. The outages ranged from relatively minor issues – such as loss of voiceover and graphics on certain segments – to complete loss of programming, advertising and related digital content. The outages' duration depended on the various services' DR capabilities. This downtime ranged from a few minutes to around four hours in the worst cases.

However, even 10 days after the outage, several channels were unable to restore access services such as subtitling – and still experienced issues with video and audio quality. And outages are not just restricted to DTV. For example, in December 2021, one of the UK's largest pay-TV operators experienced a loss of all of its TV channels, impacting viewers for hours.

The problem is that even with the extreme levels of resiliency that broadcasters use for playout, disasters – whether natural, accidental or technical in origin – can simply overwhelm a playout facility. The reality is that even near-perfect 99.99% ("four nines") playout reliability means a statistical average of about an hour of downtime a year. Even where there is a disaster recovery plan in place, often DR services aren't tested regularly, so when needed, they sometimes don't work as expected – if at all.
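The "four nines" figure above can be sanity-checked with a quick back-of-envelope calculation (a minimal sketch; the function name is illustrative):

```python
# Back-of-envelope downtime implied by an availability figure.
HOURS_PER_YEAR = 365.25 * 24  # ~8766 hours

def annual_downtime_hours(availability: float) -> float:
    """Expected downtime per year for a given availability fraction."""
    return (1.0 - availability) * HOURS_PER_YEAR

print(f"99.99% available: {annual_downtime_hours(0.9999):.2f} h/year")  # ~0.88 h
print(f"99.9%  available: {annual_downtime_hours(0.999):.2f} h/year")   # ~8.77 h
```

Even the "four nines" that broadcasters aspire to still leaves the best part of an hour off-air in an average year, and a single bad incident can consume that allowance many times over.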

In addition, the sheer cost of creating a fully redundant playout infrastructure is prohibitive for many broadcasters, many of which operate in a highly cost-competitive environment. However, when the worst-case scenario occurs, the cost in direct revenue loss through advertising, severe reputational and brand damage, plus failure to meet public service obligations – required by government regulation in some cases – can be devastating.

The unfortunate truth is that no matter how skilled TV staff are, occasional TV broadcast outages are inevitable without truly independent, ready-to-activate disaster recovery in place. It is not a case of if but when…

The benefit of cloud-based DR

So how can media companies protect themselves in a way that makes business sense? On the surface, the solution is simple: deploy DR for playout and test it frequently across a number of likely failure scenarios to ensure it works.

Traditionally, one way to ensure resiliency has been a full active/active on-premise DR set-up that replicates the playout infrastructure, ideally running in another geographic location. If one site is lost, the entire workload moves to the other site.

As discussed, the drawback has always been cost – especially for smaller and Tier 2 and 3 broadcasters. But drill down to the details and it becomes clear that finding a DR solution is more critical than ever. In the past, a cost-benefit assessment could rationalise that losing a projected one hour a year of DTV to local TV audiences could be deemed acceptable. However, the increasing complexity of playout plus the growth of linked services such as OTT and international distribution agreements means that the impact and cost of such events has grown exponentially.

At the same time, over the last decade, advances in technology and virtualisation mean that, today, it is possible to deliver playout directly from the cloud. Media companies are increasingly moving parts of their playout to the cloud – non-core channels for national broadcasters, for instance – to give them more flexibility and rapid scalability for extended coverage of sports and entertainment events.

It is also feasible to create a complete playout solution in the cloud – with full content replication ready to spring into action in the event of a primary playout failure.  In other words, thanks to advanced cloud playout technologies, it is now possible to offer a broadcast-grade DR solution with minimal recurring costs when the service is not activated. In simple terms, a complete playout DR solution is deployed and managed for a negligible fee until it is activated.

How does cloud-based playout disaster recovery work?

As operated by Planetcast International, a cloud playout DR solution comprises three core elements. The first is an always-on, cloud-based MAM system that receives programme schedules and content. This element is linked to the second, a full-featured cloud-based playout platform. When activated, this platform sends content to the third element, a secure internet stream delivery service, as well as to a broadcaster's distribution provider – or redirects content to an alternate distribution provider if needed. The result is a complete and highly resilient service.

Perhaps most crucially, if a provider offers a "pay-as-you-use" cloud DR solution, it has virtually no running cost when not in playout mode. The full cost of cloud playout should only kick in when activated. The cloud DR system should be integrated as part of an 'out-of-line' deployment that requires no changes to existing playout systems or workflows. This approach streamlines deployment and means that the system is truly independent of the existing playout chain – making the backup feed more resilient to correlated failures.

What’s more, this cloud-based approach enables media companies to scale up DR as they grow and add new channels and services that also need to be protected. Lastly, the solution should be spun up monthly to do an active playout resiliency test to ensure it is able to deliver services as expected.

Taking action using a proven provider

It is essential to recognise that this approach to playout disaster recovery works best when it uses proven and established cloud-based platforms, with each deployment tailored to the client's unique requirements. The DR provider has to have the expertise, experience and depth of technology and staff to ensure it can do that – and keep doing it consistently and well, even as the client's systems and business models evolve.

It is vital to remember that a cloud DR programme needs to be separate, secure and independent, whether it is to protect an onsite playout environment or supplement an existing managed playout service. That way, it will keep broadcasters on-air if the worst-case scenario should occur.

It is also worth noting that the shift to IP infrastructure is fundamental to enabling effective disaster recovery today. If your infrastructure is SDI-based, you can't virtualise or design a cost-effective multi-facility architecture, let alone harness the cloud-based workflows required to achieve a best-in-class disaster recovery position.

Finally, media brands should view the adoption of cloud-based DR as part of a broader, ongoing journey of the whole business toward the cloud. For use cases such as pop-up and language-variant channels, cloud-based playout often makes more financial sense than heavy investment in on-premise alternatives. The flexibility and scalability of cloud-based playout not only helps enable cost-effective DR and new services but, as part of a broader hybrid architecture, it improves overall business agility.

To download the complete solution paper from Planetcast International on this topic, entitled 'Resetting the economics of media disaster recovery', click here.

Temple Bright LLP – AI and Intellectual Property – the Latest UK and EU Law

Jeremy Morton, 

Partner, Temple Bright LLP



Jeremy advises UK and international business clients on brand strategy, intellectual property rights and disputes, and data protection law, at UK law firm Temple Bright.

AI can write music, understand natural language, analyse vast data lakes and make inventions. Legislators and policy makers around the world are starting to grapple with what this means for copyright, patents and other intellectual property, and for developers and users of AI systems.

At the heart of these developments lies the adaptive and autonomous nature of AI. ‘Adaptive’ means that AI needs no pre-programmed instructions: it learns as it goes along, based on training data, and its subsequent methods are not always transparent. ‘Autonomous’ means that AI can execute decisions without human command and control.

Access to Data for Training

Data for AI training includes text, images and other content likely to be protected by copyright and (in Europe) database rights. Loading these into computer memory requires permission from underlying rights holders and may be a breach of access terms. That is so whether data have been scraped from web pages, or licensed to subscribers. In the latter case, the licence does not necessarily allow uploading for AI training purposes.

Currently, in the UK, there are limited exceptions to copyright where copying is for non-commercial research purposes, and no exception at all for database right (which protects non-copyright collections of data). Even the express copyright exception is unhelpful for AI training, because such training will usually be commercial.

EU law deals more generously with this kind of ‘text and data mining’. Commercial mining is allowed, as an exception to both copyright and database right, although the organisation must already have lawful access to the material (for example, under a subscription arrangement). Rights holders can exclude materials from access, except in the case of research organisations or cultural heritage institutions conducting scientific research.

A recently proposed EU Data Act also makes clear that data produced from operation of machinery of any kind will not be protected by database right, and must be made available to aftermarket service providers on fair, reasonable and non-discriminatory terms.

To redress the imbalance in the UK, the government recently finished a consultation exercise, concluding that a new exception for text and data mining should be introduced. The new exception will cover both copyright and database right, and apply for any purposes. Rights holders will not be able to opt out. Having said that, there will be a requirement for lawful access to the material in the first place, so rights holders can choose not to make their content available either at all, or unless a fee is paid.

Outputs of AI – Creative Content

In 2011, photographer David Slater set up a camera in the Indonesian jungle, a monkey took a great selfie that went viral, and an unusually exotic copyright dispute flared up. Fast forward to the age of AI and we see similar issues.

Under European law, necessary originality for copyright purposes can come from making creative choices (photographic subject-matter, angle, lighting, and so on) or just ‘being in the right place at the right time’, and it does not matter who (or what) pressed the button. On the other hand, mere ownership of the camera (or other equipment) does not confer ownership of copyright, unless an applicable contract says so.

UK copyright law has long recognised that computer-generated works with no human author can attract copyright protection. The law deems the 'author' or 'designer' to be the person who made the arrangements necessary for the creation of the work. 'Person' here includes a corporate entity, and is quite likely to be different from the person who designed the software or system.

Computer-generated works have to be distinguished from works made using a computer system as an aid, where the human is indeed the true author or designer. These distinctions will not always be straightforward when it comes to using AI systems for creative projects, however.

Outside the UK, most countries view computer-generated copyright works as contrary to the essential principle of 'originality'. As a result, they are much less likely to be protected by copyright. An example is Dr Stephen Thaler's failed attempt to register US copyright for a work authored solely by the 'Creativity Machine' AI system. (The relevant US Copyright Office guidelines also, incidentally, clarify that works supposedly created by divine or supernatural beings will be refused…)

The UK government has concluded that the current protection for computer-generated works incentivises investment in AI, and there are no plans to make changes.

Deepfakes and Marketplaces

AI that controls presentation of sales offers, via online marketplaces or IoT, could have an impact on trade mark law constructs such as the ‘average consumer’ and ‘likelihood of confusion’. These are complex points, and the government feels that AI is not yet developed enough to have a meaningful impact in this area.

AI also opens the possibility for simulated likenesses of deceased or retired performers, and false attribution of speech and actions to non-consenting individuals. The UK government has not drawn clear conclusions on how to deal with this, although it says it may not be best left to intellectual property laws to resolve.

Inventions by AI

Dr Thaler (see Outputs of AI, above) is also famous for attempting to obtain patent protection for AI-generated inventions, with the AI system named as sole inventor. DABUS (Device for the Autonomous Bootstrapping of Unified Sentience), an AI system, autonomously developed inventions that included a container lid and a warning light.

No patent office has denied that, if Dr Thaler had filed the patent application naming himself as inventor and applicant, all would have been fine. In other words, there is nothing preventing the patenting of inventions made by humans using AI tools. What they rejected was the suggestion that an AI system could properly be cited as sole inventor, and that Dr Thaler could then claim title merely through owning the system that, he said, made the invention without his involvement. Appeal courts in England, the EU, the US and Australia all took a similar view.

Whilst the outcome was perhaps not a surprise to many patent lawyers, there remain some difficulties in fitting inventions by AI systems into patent law. First, what kind of involvement in the activity of the AI system should entitle a business to file a patent application in its name? In practice, where multiple entities are involved, this should be dealt with by contract. Secondly, if a system would inevitably have produced that output, is it actually inventive at all?

The UK government recently published its conclusions on patents for AI-generated inventions, following public consultation. There was concern that a proliferation of AI-generated inventions, concentrated among a few dominant industry players, could disadvantage SMEs. The government saw no need to change the requirement for a human inventor to be named on patents. It also considered, but rejected, expansion of ‘inventors’ to include those who perform programming, input data or select outputs based on commercial value. Most respondents to the consultation felt that any changes in this area will need to be harmonised internationally, and that AI was not yet developed enough to make a real impact on the concept of inventorship. No doubt this will be reviewed in the future.

 

WorldCast Systems – Rethinking FM Broadcast to Reduce Carbon Footprint and Drive Cost Savings

David Houze,

Product Manager, WorldCast Systems


Global CO2 emissions from energy combustion and industrial processes rebounded in 2021 to reach their highest ever annual level, according to a March 2022 report from the International Energy Agency. In total, more than 36 gigatons of CO2 were emitted in 2021, a 6% increase compared to 2020.

Combined with worldwide increases in electricity costs, sustainable energy has become one of the greatest challenges facing radio broadcasters.

In the broadcasting chain, the transmitter is the most power-hungry piece of equipment, as it continuously delivers a fixed output power to the antenna. In the FM chain, transmitters range from a few watts to dozens of kilowatts depending on the coverage area, terrain and audience profile.

After years of innovation, the new generation of FM transmitters integrates state-of-the-art technologies, such as the most recent LDMOS generation with up to 85% efficiency and new high-efficiency PSUs. Combined with embedded features (RDS encoder, sound processor, stereo encoder, audio-over-IP decoder), overall efficiency has reached up to 76% for the best transmitters. But physical component optimization is approaching its limit; further efficiency gains from hardware alone will be marginal.

To reduce FM transmitters’ energy consumption, it is necessary to rethink the concept of FM broadcasting, building on the impressive performance of modern transmitters and receivers.

A common misconception is to think that the technical objective of an FM broadcast chain is to transmit at a specific output power in order to cover a service area, but the real objective is to deliver a high-quality and constant listening comfort for listeners over the entire service area.

Listening comfort can be summarized as the signal-to-noise ratio. Is the listener disturbed by the noise? Listeners’ audio perception of the same noise ratio varies depending on the type of audio content. For example, with speech programs, the slightest disturbance will have a direct, negative impact on listening comfort. However, with highly processed music (covering the full audio spectrum), the noise will be easily covered by the signal itself. It is then possible to slightly reduce the signal-to-noise ratio without impacting audio perception.


WorldCast developed its SmartFM algorithm based on this concept. A psychoacoustic algorithm assesses how robust the audio content is to perturbations. Then, when the signal is robust enough, the artificial intelligence adjusts the transmitter power accordingly.

This results in up to 40% electricity savings while maintaining listeners’ comfort and service area.
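To make the idea concrete, here is a hypothetical sketch of content-aware power reduction. This is not WorldCast's actual SmartFM algorithm; the scoring scale, back-off range and function names are invented purely for illustration.

```python
# Hypothetical illustration of content-aware FM power reduction.
# NOT WorldCast's actual SmartFM algorithm: the robustness score,
# threshold and back-off values here are invented for illustration.

def power_scale(robustness: float, max_backoff_db: float = 3.0) -> float:
    """Map a content-robustness score in [0, 1] to a transmit-power
    scale factor: robust (dense, highly processed) audio tolerates a
    larger power back-off than fragile speech content."""
    r = max(0.0, min(1.0, robustness))   # clamp the score
    backoff_db = max_backoff_db * r      # more robust -> more back-off
    return 10 ** (-backoff_db / 10)      # dB reduction -> power ratio

print(f"{power_scale(0.9):.2f}")  # dense music: ~0.54 of nominal power
print(f"{power_scale(0.1):.2f}")  # speech: ~0.93, near nominal power
```

The real system would derive the robustness score continuously from the audio itself; the point of the sketch is simply that even a few decibels of back-off, applied only when the content masks noise well, translates into a large fraction of transmit power saved.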

Another major impact of SmartFM is to reduce the average heat dissipation of the transmitter itself. Consequently, the cooling system’s electricity consumption is reduced proportionally to the heat reduction. 

For example, the efficiency of a typical 10kW FM transmitter on the market is about 74%, which means a continuous electrical draw of approximately 13.5kW, 24/7. Total electricity consumption to feed one 10kW FM transmitter for one year is then estimated at around 120MWh. With SmartFM, total consumption for the exact same system drops by 10% to 40%, a reduction of up to nearly 50MWh per year for a 10kW FM transmitter.
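The figures above can be reproduced with a short calculation (a sketch; exact numbers depend on the transmitter model and duty cycle):

```python
# Reproducing the article's consumption estimate for a 10 kW FM
# transmitter at ~74% overall efficiency, running 24/7.
output_kw = 10.0
efficiency = 0.74

draw_kw = output_kw / efficiency            # continuous mains draw
annual_mwh = draw_kw * 24 * 365 / 1000      # kWh over a year -> MWh

print(f"Continuous draw: {draw_kw:.1f} kW")         # ~13.5 kW
print(f"Annual consumption: {annual_mwh:.0f} MWh")  # ~118 MWh (~120 MWh)
```

A 40% saving on roughly 120MWh is what yields the "nearly 50MWh per year" headline figure.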

To help radio broadcasters reduce the electricity consumption of their FM transmission network and improve their carbon footprint, the industry must rethink the whole concept of radio broadcasting and continue to find new ways to innovate.


Blackbird – Making video more sustainable

Ian McDonough,

CEO, Blackbird



The TV industry’s effort to make production sustainable may be fatally undermined if the full cost of carbon from camera to consumer is not taken into account.

Watching online videos is not a passive activity when it comes to saving the planet. In fact, the total energy that goes into powering the internet’s data centers, servers and networks that stream video content generates 300 million tons of carbon dioxide a year — equivalent to 1% of global emissions, according to The Shift Project.

Another calculation, from supply chain consultancy RampRate, estimated that the carbon cost of viewing linear TV in the old-fashioned manner in 2018 was 62 million tons. Meanwhile, TV streaming accounted for just 19% of TV viewing yet was responsible for 31.6 million tons of CO2 in 2020, implying per-viewing-hour emissions roughly double those of linear TV.
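The per-viewing-share comparison implied by those figures can be checked in a few lines. Note the assumption (not stated in the source) that linear TV accounted for the remaining 81% of viewing, and that the two figures come from different years:

```python
# Rough per-viewing-share comparison from the quoted figures.
# Assumption: linear TV accounts for the remaining 81% of viewing.
linear_mt, linear_share = 62.0, 81    # Mt CO2, % of viewing (2018, linear)
stream_mt, stream_share = 31.6, 19    # Mt CO2, % of viewing (2020, streaming)

ratio = (stream_mt / stream_share) / (linear_mt / linear_share)
print(f"Streaming emits ~{ratio:.1f}x linear TV per viewing share")  # ~2.2x
```

On these numbers, streaming's carbon intensity per share of viewing works out at roughly twice that of linear delivery.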

The Cost Of Streaming TV

The carbon cost figures are for the U.S. only and are based on the 119 million TV households in the country identified by Nielsen.

Alarmingly, if that trend is extrapolated to the most affluent half of the world’s population (3.8 billion consumers), then this would equate to 3.6% of global emissions. That is nearly double the annual CO2 output of the global aviation industry.

The Shift Project arrived at similar conclusions. It found that the share of digital technologies (servers, networks, terminals) in global greenhouse gas emissions increased from 2.5% to 3.7% between 2013 and 2019, and this footprint is predicted to double again by 2025.

What can be done?

Consider the carbon cost of a single email. It could be as minuscule as 0.3g of CO2, but by one widely cited estimate, if every adult in the UK sent just one fewer email a day, it would save over 16,433 metric tons of carbon a year.

Imagine, then, the cost of transporting high bitrate video around. It is unnecessary, it is inefficient, and it is unsustainable. When we realize that every little bit counts, no matter how fractional, then awareness of the issue becomes part of the solution.

The industry is beginning to act. Amazon has committed to being net carbon zero by 2040. Google aims to be carbon-free by 2030, ensuring that its data centers are powered by renewable energy. Netflix says it purchases renewable energy certificates and carbon offsets to compensate for any energy that comes from fossil fuel sources. Content delivery network Akamai has pledged to power all of its global operations with renewable energy by 2030.

Another key is in production. Comcast-owned European pay-TV broadcast group Sky aims to achieve net carbon neutrality in all of its production activity by 2030. Dozens of broadcasters and production companies — including the BBC, ITV, Endemol Shine Group and Warner Bros. — are members of Albert, an initiative set up by BAFTA to help reduce CO2 emissions and raise awareness of the environmental impact of program-making.

Every Little Byte Counts

A primary carbon cost of making live programming lies in transporting kit and crew to a venue. Traditionally, this involves dozens to hundreds of technicians, producers and on-air talent — which, at the largest events, involves significant air travel, road freight and hotel accommodation.

The broadcast industry has been gradually pivoting to a model that enables more of the production to be done remotely and where crew remain in a single central location or even in their own homes.

It is a movement that has been accelerated by the necessity to keep live sports on the air during the pandemic. While remote production approaches make immediate savings in budget and carbon footprint, even more can be done. This involves taking steps to reduce the amount of data - the bits and bytes of the signal that travels from venue to production hub for a program to be made.

We collaborated with Green Element on a report on how new technologies can reduce the carbon impact of routine video editing and post-production activity, and we believe browser-based workflows that function at lower bandwidths come out on top.


Move Less Data And People Around

In conventional production, including most current remote productions, all raw video feeds are transferred to the production center to be processed (adding graphics, for example) before being transmitted. The vast majority of the video acquired from multiple cameras at the event is transported over networks to the production center but never forms part of the final program. This is clearly wasteful.

By contrast, being able to work on high-quality "proxy" copies of the original video means far less data is moved around. You move the high bitrate content only when it is needed for publishing the final product — and you only need to do that once.

There is no need to constantly upload and download video every time the program is manipulated prior to going to air. It is extremely carbon-efficient, so much so that the report suggests that for a live event lasting two weeks — such as the Olympics — a browser-based solution using lower power can be six times more carbon efficient than other methods.

The transition to a browser-based solution starts by planning to make content accessible from anywhere. When content is freely available to certified team members from any internet-connected device, workflows and processes can be transformed.


Workflows for the production of content for delivery to different outlets - social, digital and broadcast - will converge. Processes such as corporate branding and subtitling are automated in parallel. Greater efficiency reduces the transport of data and enables producers to create more content at less cost.

TV is responding. In its latest report, Albert shows that one hour of TV production now generates the equivalent of 9.2 tCO2e, a 10% drop from 10.2 tCO2e in 2017. While the impact of many production activities has reduced significantly, the report also says carbon emissions from travel and transport rose consistently between 2017 and 2019.

The media industry has a responsibility to communicate and to take a lead. We are already seeing a response to the challenge, but we can all do more.

Shure – Shure helped save more than 20 Million batteries from landfills in the past 5 years



With more focus from customers on sustainable products, Shure is outlining ways it has invested in the planet through environmental responsibility initiatives.


Green is not only in the logo, it’s an important part of the company’s mission. In fact, through the company’s innovation in the audio industry, these efforts have resulted in millions of batteries saved from landfills.

One of the most impactful things Shure has done for the environment is reducing the number of batteries disposed of in landfills around the world. Concerts, theater performances, meetings and live events have gone much greener lately thanks to the variety of Shure products that are now rechargeable.

In the past five years, Shure estimates that it has eliminated more than 20 million batteries from being used. By 2027, Shure expects its products to prevent 100 million more batteries from going to landfills as more customers convert to rechargeable Shure products.

Shure rechargeable packs and mics have changed the way concerts, theaters, meetings and other live event venues operate. In the past, disposable batteries were used to power microphones and transmitter packs during rehearsals, replaced with a fresh set before the performance, and replaced again afterward. This led to a significant volume of discarded batteries.

Shure’s rechargeable wireless audio technology, launched nearly a decade ago when the company became the first to offer intelligent lithium-ion batteries for wireless microphones, has transformed sound production into a much more environmentally friendly operation.

In fact, for David Byrne’s “American Utopia” alone, the monitor engineer estimated the production saved 21,000 AA batteries from being disposed of in landfills because it used Shure’s rechargeable system.

Other Product Initiatives

Shure works with suppliers who take action on sustainability, including suppliers who use solar power and recycled water in their operations. The Company is also working to approve water-based paints in the finishing process, which are friendlier to the environment.

Shure products are also engineered to last – from a durability and adaptive technology standpoint – avoiding costly replacements and unnecessary disposal of electronics, even as technology evolves.

The company has also engineered its products to help with overall power consumption, using less energy in “down” modes and allowing remote monitoring of power use with Wireless Workbench Software.

Packaging

Shure has joined the Sustainable Packaging Coalition as it improves product packaging to be more sustainable. With more than 1500 different packaging pieces for a variety of different products, this is a significant undertaking. Some of the highlights include:

•      Replacing thermoform trays with more sustainable, recyclable alternatives such as molded pulp.

•      Reducing overall plastic materials in packaging.

•      Reducing the amount of literature that accompanies our products.

•      Right-sizing packaging for greater efficiency and reduced CO2 footprint from shipping and storage.

New products will be packaged using 75 percent recyclable and/or renewable materials in 2023. The Company is also improving packaging sustainability by:

•      Committing to source a greater portion of paper and fiber-based packaging from suppliers certified by sustainable forestry organizations such as FSC, SFI and/or PEFC, with the intention of eliminating noncertified packaging by 2030.

•      Optimizing packaging for efficient distribution and logistics (creating packaging that better fits onto pallets and shipping containers to maximize space, which reduces transportation fuel and emissions from excess shipments).

•      Ensuring that existing product packaging uses greener methods. For example, replacing plastic inserts with molded pulp wherever possible.

Shure has continued to take several steps to increase its focus on sustainability in packaging, balancing the need to protect sensitive, high-performance electronic equipment being shipped worldwide with being more environmentally responsible.

As part of joining the Sustainable Packaging Coalition, the Company conducted an audit to assess the sustainability of more than 1500 different packaging components. Shure has also implemented software solutions to help improve packaging design and distribution efficiency, and environmental impact assessments have been added to the Company’s standard process alongside other environmental requirements.

But even before this, Shure was implementing greener packaging. In the 1980s, Shure changed the packaging for mixers by eliminating the use of Styrofoam, switched from white (bleached) cardboard to a natural brown color, and used a soy-based ink for the printing on the cardboard box. All the packaging could be recycled, except for the plastic bag that covered the mixer inside the cardboard container.

Facilities

Shure manufacturing plants feature robust recycling programs for cardboard boxes and wooden pallets, keeping literally “tons” of cardboard waste and nearly 5,000 wooden pallets out of landfills.

Across Shure facilities, energy savings programs with LED lighting, motion-sensor lighting, smart climate control systems and other initiatives help reduce the Company’s overall carbon footprint.

People/Partners

Shure supports a number of global and locally based organizations dedicated to sustainability efforts and ecology, including the Natural Resources Defense Council, whose global purpose is to “safeguard the earth — its people, its plants and animals, and the natural systems on which all life depends.” Their stated areas of work include: “climate change, communities, energy, food, health, oceans, water, the wild.”

In addition to those efforts, our Associates around the world have volunteered for environmental clean-up efforts in parks and rivers in Europe, Asia and North America.

Compliance

A cross-functional Shure team regularly reviews international regulations, directives, and standards to ensure environmental compliance with regulations like RoHS, REACH, and WEEE. The nature of these regulations promotes sustainable electronics and electronics manufacturing.

More information about Shure’s sustainability efforts is available at Shure’s Sustainability Site, which includes an overview on the Company’s approach to environmental responsibility through people, products, facilities, communities, and partners.  

Singular.live – Sustainability means cloud-native not cloud-based

Mike Ward,

Head of Marketing, Singular.live


In the summer of 2015, Hubert Oehm was surrounded by a gallery of hardware vision mixers in the middle of the intense heat of Qatar. He was working on a graphics project for a major broadcaster in a huge control room. As the wall of machines whirred and the air conditioning strained to keep them cool with the surrounding desert approaching 50°C, he knew there had to be a better way.

Was there an alternative that didn’t require all this expensive, landfill-destined hardware, significant power consumption, and his presence on-site? Oehm was simply adding a layer of graphics to live video - the process suddenly seemed extraordinarily wasteful. After an intense period of research, he envisioned a revolutionary, significantly more sustainable, solution. What if all this hardware, with its huge carbon footprint and price tag, along with its specialist operators, could be replaced by an entirely web-based platform harnessing HTML?

Singular.live was conceived the following year with Oehm as CTO. It has since replaced countless numbers of these machines, along with their transportation, the power required on-site for them to function, and their operators’ travel and accommodation. We’ve helped initiate the dramatic shift from traditional broadcast infrastructure, with its hardware-based vision mixers, toward the cloud.

Yet, the industry can and must do more to avoid using hardware and make production more sustainable.

Cloud-native vs Cloud-based

When assessing the environmental impact of graphics solutions, it’s important to distinguish between ‘remote’ or ‘virtualised’ production and a cloud-native approach. Remote production is often described as ‘cloud-based’ because it uses the internet, allowing production to happen away from the venue rather than through traditional on-site workflows. Yet it still relies on dedicated hardware, whereas cloud-native does not. Dedicated graphics rendering hardware is inescapably unsustainable.

As well as the emissions required to produce and power it, dedicated hardware is incredibly hard to recycle or safely dispose of. Hardware lifespans vary, but are typically only around three to five years. The industry sometimes attempts to re-use this hardware at the end of a rights cycle, but it has typically been amortised over that period.

Advances are needed to make hardware’s end-of-life more sustainable by recycling components and ensuring safe disposal of its hazardous material (which includes heavy metals and carcinogenic toxins) which can enter waterways and the atmosphere. The most sustainable approach is to avoid dedicated hardware altogether with a cloud-native platform.

Assessing Our Impact

Albert – the BAFTA-owned, industry-backed organization – recognized our positive impact by awarding its sustainability accreditation to Singular in 2019, and we remain the only live graphics platform to have achieved this recognition.

Building on our Albert accreditation, we were eager to use our platform and expertise to further analyse and increase the sustainability of live production with the ultimate goal being a carbon ‘net zero’ future.

That’s why Singular instigated a project that subsequently united competing broadcasters for the first time to collaborate on a proof of concept, with the long-term vision of creating a more sustainable industry future.

The challenge was part of the Accelerator Challenge organized by IBC and coordinated with BBC Sport, BT Sport, Sky Sports, English Premier League, Premier League Productions, SuperSport, NBCUniversal and albert. 

This project has allowed the industry, for the first time, to confidently say that cloud-native production is even more sustainable than remote or virtualized alternatives, as it dramatically reduces the need for hardware, thereby reducing emissions from its manufacture, power, and transportation.

Specifically, it demonstrated a reduction of up to 70% in the technical infrastructure required for gallery production versus a remote production. It further demonstrated that fuel usage can be more than halved versus on-site production.

Collective action on climate

Being cloud-native, we assume that we are a more environmentally friendly platform than, say, buying graphics hardware and shipping it around the world. But we don’t know categorically, because we don’t receive any measurement from our cloud providers. Other participants in the IBC Accelerator trial, including Sky, the BBC and BT Sport, echo this frustration.

This challenge is not confined to one vendor, nor to graphics solutions alone. The entire broadcast infrastructure is being re-engineered to take advantage of micro-compute services, which distribute workloads across servers that are shared with other companies and have spare capacity at the time. Undoubtedly, this is a more environmentally friendly alternative to building dedicated computers, since it enables an existing resource to be activated only when needed, as opposed to building and powering bespoke servers 24/7.

Unfortunately, using multiple shared servers does make accurate power calculations impossible at this stage. Added to that, the servers are powered using a mixed power supply infrastructure which includes both fossil fuels and non-fossil energy such as solar, wind, and hydro.

As a result of initiatives like the Accelerators, light has been shone into this black hole. Cloud providers are fully cognisant of the demands being laid down by broadcasters and are now actively engaging with the industry to collaborate on this. Singular.live are committed to working with AWS and others to develop a carbon emissions calculation methodology.

Sustainable Live Production

Sustainability is one of our core values; it was at the heart of our conception as a platform that dramatically lowers emissions by eliminating the need for dedicated hardware and transportation.

We’re proud that this aligns us with the UN’s Sustainable Development Goal 13 on Climate Action, and are aiming to embed our social and environmental mission and impact by working towards B Corp certification.

Cloud-native solutions are essential to achieving effective remote working and global collaboration, and to driving more sustainable working practices. If a solution is not cloud-native, it is just a stop-gap on the way to the inevitable future.

KitPlus Auctions – Sustainability and the environmental benefits of the circular economy

Dan Main

Kitplus Auctions


There is now widespread recognition in boardrooms and investment companies worldwide that environmental sustainability is aligned with business sustainability. Auctions in the broadcast industry have always been a great way for users to get a significant discount on the price of new and used equipment, or to generate income from assets they no longer need. Now, the environmental benefits of enabling the reuse of these assets as part of the circular economy are becoming clearer to equipment distributors and manufacturers alike, and the ability to measure this impact and report on the carbon avoided is a way for both buyers and sellers to celebrate their positive contribution.

The Focus on Sustainability

Investors are raising the priority of ESG (Environmental, Social and Governance), and there are multiple examples (Boohoo and Exxon, to name but two) where the impact of this has been felt in the boardroom. The view is perhaps best summed up by the words of investment group Engine No.1 (not itself a green fund): "We strongly believe that climate risk is business risk". Since September 2021, suppliers bidding for UK government contracts must commit to net zero carbon emissions by 2050 in order to be considered for new contracts. So sustainability requirements are not some distant prospect; they are very much in the present.

There are many different ways that businesses can work towards achieving net zero, but one of the oldest sales methods in the world, the auction, offers a compelling set of reasons to incorporate it into your business model.

Traditional benefits of auctions

The idea of selling your surplus broadcast assets isn't a new one - we have been helping our clients do this for years. The two traditional benefits of selling your surplus assets are:

•      Generating revenue from the sale of idle equipment or surplus stock.

•      Reducing the costs associated with those idle assets (costs to store, maintain and insure).

For a seller, one of the key attractions of an auction is that it is a date-certain, time-specific method of closing a sale at the best value in a competitive buying market. Manufacturers, distributors, resellers, and broadcasters can have a large stock of surplus items; an auction is a method of working through that stock quickly, rather than waiting for buyers to approach you as they would in a 'store' or e-commerce model. Our average auction, for instance, has between 300 and 600 lots (and some lots contain multiple items), with a sell-through rate of more than 90%.

As a buyer, you face longer lead times in a rapidly changing technological landscape with evolving client demands. While auctions are not generally a suitable procurement route when you have very specific requirements, if you can be flexible, you can find equipment that in many cases would take several months to source. There is also no protracted negotiation process: the winning bid is the final price. Invoices are issued within hours of the auction closing, and often, once payment is received, the item can be collected as little as 24 hours after the auction closes.

The environmental benefits of auctions

For the environment and society, there are now multiple benefits being recognised. Firstly, by buying used over new, or enabling your surplus assets to be bought, you are avoiding the raw materials needed to produce a new item. Equipment from an auction often goes to developing countries (recent high-profile auctions had buyers from 51 countries participating), and can enable the creation of entire industries in these countries that otherwise could not have afforded the equipment new. There is also a measurable carbon impact avoided, and we can produce figures from an auction that can be incorporated within annual ESG reports.
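A "carbon avoided" figure of the kind mentioned above can be reasoned about roughly as follows. This is a minimal sketch with hypothetical numbers; the actual methodology used for auction ESG reports is not described here, and a real calculation would draw on manufacturer lifecycle data or published emissions factors.

```python
# Rough sketch of a "carbon avoided" estimate for buying used equipment.
# Both figures below are hypothetical placeholders, not real product data.

embodied_kg_co2e = 300.0   # hypothetical embodied carbon of manufacturing a new unit
shipping_kg_co2e = 20.0    # hypothetical emissions from shipping the used unit to the buyer

# Buying used avoids manufacturing a new item; the footprint of delivering
# the used unit is subtracted to keep the estimate conservative.
carbon_avoided = embodied_kg_co2e - shipping_kg_co2e
print(f"Estimated carbon avoided: {carbon_avoided:.0f} kg CO2e")
```

Summed across the hundreds of lots in a typical auction, even conservative per-item estimates like this produce a figure meaningful enough to include in an annual ESG report.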

In addition, we have found that buyers can take ingenious approaches to the reuse of assets. It doesn't always have to be a valuable item to be sold at auction, and waste items can even be incorporated into a sale and/or donated to charities. During one auction in early 2022, an enterprising buyer even took the chemical toilet!

We're proud to announce that we're also introducing a carbon measurement on the site, showing the positive impact of buying used in terms of the carbon avoided from a new purchase.

Reducing costs, creating revenue and having a positive impact on the environment, all without spending money up front, makes auctioning your surplus assets an attractive opportunity. Even before the additional benefits of circulating products and materials were added to these factors, for many companies this was already a highly profitable and strategic activity. Data from asset resale can also be fed back to procurement, ensuring that information on end-of-life value is incorporated into Total Cost of Ownership (TCO) analysis. It can be used by accounting, ensuring that asset depreciation is realistic. It can also allow you to ensure that you keep abreast of current technology, if you know that you have a route to market for assets after several years of use.

It has also proven to be a popular, low risk method of raising funds, quickly and easily contributing to investment in more current and future technologies.

KitPlus Auctions is a partnership between KitPlus, established in 2005, a global online marketplace for advertising new and used equipment for sale in the Broadcast, Film and Video industries, and CA Global Partners, an auction company established in 1997 with worldwide experience in selling surplus broadcast equipment and assets from company closures.

CA Global Partners is also part of the Ellen MacArthur Foundation community, a charity working with business, government and academia to accelerate the transition to a circular economy.


For more information e:contact@kitplusauctions.com t:+44(0)1635 237 237 (KitPlus) or +44 (0) 345 163 0580 (CA Global Partners) w: www.kitplusauctions.com

Sources:

https://www.gov.uk/government/news/firms-must-commit-to-net-zero-to-win-major-government-contracts

https://www.thearmchairtrader.com/exxon-vote-big-oil/

https://www.thearmchairtrader.com/boohoo-shareholder-revolt-over-esg/

https://www.theguardian.com/business/2021/jun/04/exxon-coup-ideological-reluctant-activists-engine-no-1

Deltatre – The planet is under threat; what can our industry do to protect it?

Andrea Marini,

CEO, Deltatre


From intensifying wildfires in the US, to floods in Australia, and what is likely the worst drought in Europe for centuries, 2022 is a stark reminder of the real-world impact of climate change globally.

What we are all experiencing is unlike anything we've seen in our lifetime, and it is becoming clear that the window for action is closing. As a business with well over a thousand full-time team members across 14 countries, we have a huge responsibility, and opportunity, to effect real change, and this is true for business leaders across our industry. Working proactively on environmental reporting and performance-related initiatives to reduce our environmental impact has never been more vital. I want to know that the business decisions we make at Deltatre are working to protect the planet for future generations. Now is the time for action that will make a difference. However challenging implementation can be, by acting collectively our industry can make a significant difference.

Many organizations are already on the road to sustainability, having committed to goals set under the Paris Agreement. This UN initiative calls for businesses to pledge tangible changes to their carbon footprint that would limit global temperature rises to 1.5°C above pre-industrial levels. This involves reducing emissions by 45% by 2030, to reach net zero by 2050. But what does that mean in real terms for the media industry?

The importance of net zero by 2030

Despite the ambition of the Paris Agreement, both the effects felt this summer and the data on global warming confirm that action to mitigate the impact is falling short. The current Climate Action Tracker thermometer puts us on track for 2.7°C of warming, based on real-world action and current policies. We are currently at around 1.2°C, and significant action is needed to stem rising temperatures and limit the effects that are now being keenly felt across the globe.

To achieve this, the 2050 timescale for net zero needs to be revised with a goal to reach net zero by 2030. It is a big ask, but leaving the 2050 targets in place will almost certainly mean we are guilty of doing too little, too late. As we experience the extent of the impact of global warming on the planet, as an industry we have a responsibility to acknowledge there is more we can do and implement it. At Deltatre, we are firmly focused on achieving net zero by 2030. We are taking stock and already have some secured action plans. Like the whole industry, we’re working hard to complete climate action plans associated with every aspect of our vast business on the accelerated timescale needed for a 2030 deadline.

Offsetting as a last option

Having established that the net zero aim needs to come much sooner, the next step is to look at how. There is of course a place for carbon offsetting, where businesses invest in environmental projects to balance out their own carbon emissions. However, this should only be done after most of the emissions in business operations have been reduced or avoided; after that, an acceptable 5-10% of unavoidable emissions may be offset. Trying to carry on with ‘business as usual’ and simply paying to offset emissions is unhelpful and of little benefit to the planet. There are not enough projects to make mass offsetting a viable option – we must reduce business emissions across multiple areas as soon as possible.

Businesses therefore need to analyse their existing practices, identify the emissions, and find a way to introduce changes which allow for reductions. There isn’t a magic wand or silver bullet when it comes to making the changes needed to slow the rate of global warming. It requires commitment from senior leaders in businesses and buy-in from all stakeholders. That way when changes are challenging to implement, everyone can understand the reason for it and transformation is more likely.

The benefits of a business-led approach

So, following the initial phase of analysis, what should action look like? Businesses have a huge role to play. In the UK, businesses account for almost a fifth of carbon emissions, meaning that if they take action to move to net zero, we will see significant benefits. Businesses also have huge influence: they have the opportunity to educate their employees and customers, which means their potential positive impact is even greater.

The reality of climate change and pollution is stark – take as an example recent research suggesting that rainwater is no longer safe to drink anywhere on earth. Despite this research hitting the headlines, for many it has flown under the radar. This demonstrates how critical it is that we as businesses take responsibility for educating our teams.

This process can take many forms, from educational seminars for employees to on-the-ground environmental work such as cleaning up a local river. Regardless of format, it is a foundational part of ensuring businesses make long-term changes that benefit the environment. It not only makes a positive impact in its own right, but creates behaviour shifts in employees’ personal lives that take the potential impact beyond the roughly 18% of emissions attributable to business that we mentioned. Furthermore, by shifting the mindset of the workforce, we change the overall business ethos, putting sustainability at the centre. The impact is that business decisions organically keep environmental goals in mind. This makes taking positive action on climate change part of conducting business as usual, which is a big step in the right direction when it comes to achieving net zero by 2030.

Getting our heads into the cloud

With that environment-centric approach in mind, how can our industry specifically have a positive impact on climate change? The media industry has changed drastically as a result of Covid as the pandemic forced more remote working. It meant teams that would normally travel to work on-site shifted to a cloud-based strategy in order to deliver business as usual when the world turned upside down. Those changes may have been forced by pandemic practicalities, but they should be embraced as part of our sustainability strategies.

Take for example the broadcast graphics for the Indian Super League. Previously, broadcast graphics would have been produced by a team on-site, involving significant carbon emissions from long-haul flights. The pandemic made this impossible and made a shift to a remote, distributed operations model essential. We saw that this change did not impact the ISL’s ability to successfully deliver the 2020-2021 season. Given that business travel is one of the highest sources of measurable carbon emissions, exploring cloud-based models that have proven capable of delivering the same level of service is a big step in the right direction for our industry.

 

Moving forward

Making significant behavioural changes and adapting how we as an industry work will undoubtedly have a positive impact on our carbon footprint. However, the data couldn’t be clearer in showing why we must take on the challenge of meeting adjusted 2030 targets to limit the damage we are inflicting on our planet. We must continually reflect on how we can improve, from encouraging behavioural changes to adapting how we develop solutions and deliver services to minimize environmental impact. Meeting climate ambition targets must be seen as an ongoing journey, not a box ticked; only by taking this approach can our industry play its part in putting the earth on the trajectory it needs to be on to protect it, both for us and for generations to come.

Cerberus Tech – The Convergence of Self-Service and Sustainability in Broadcast

Chris Perkiss, 

Head of Operations, Cerberus Tech


The transition to Software as a Service (SaaS) and Infrastructure as a Service (IaaS) models was already underway prior to the pandemic, but it had yet to truly transform the broadcast industry. As referenced in an article by IABM in the summer of 2019, despite the known advantages of leveraging cloud infrastructure, fears of workflows breaking down, interoperability concerns, and the complexities of migrating legacy systems and data were all real barriers to transition. But the pandemic helped to change this, because suddenly, cloud-based broadcast operations were the only viable option for any kind of business continuity.


Out of necessity, many media organisations quickly moved their workflows to the cloud. This change proved that next-gen methods were both technically and logistically possible, and it became apparent that on-demand, cloud-based infrastructure offered efficiency, control, and cost savings. But what about sustainability? That’s where the industry can go beyond technical future-proofing, and start thinking about future-proofing the planet.

The transition to a self-serve culture

The interplay between spin-up/spin-down infrastructure and a self-serve approach to workflows offers lots of potential for environmentally conscious broadcasting. Despite the significant changes to the industry of late, broadcasters’ overall preference is for familiar and repeatable contribution and distribution workflows, and this needs to carry through into an IP environment. Using a consistent approach within broadcast-grade IP means that while the implementation of resources can vary, the underlying broadcast environment stays consistent. Spinning this environment up and down, within an easy-to-manage interface, means that familiarity is maintained and resource utilisation is optimised.

We’re seeing swathes of media organisations transitioning their workflows and content to the cloud in order to benefit from the advantages that brings. However, certain elements of the broadcast workflow have transitioned to a self-serve mindset more readily than others. While self-serve adoption continues in areas such as cloud storage, where teams search for and locate their own content in the archive, the outlook for IP contribution and distribution is more mixed. Overall, we seem to be heading towards a self-serve culture, but one size definitely doesn’t fit all.

Familiar and repeatable workflows

Media companies are seeking solutions that best work for the requirements of their operation, and that suit their technical capabilities and expertise. These factors play a big part in the decision about whether to opt for a self-service approach or a more managed service. While the benefits of cloud-based infrastructure are well-documented, the sense of familiarity that media professionals have with traditional workflows is often lacking when it comes to operating in the cloud. With traditional models, workflows follow a known format and the lines of responsibility are clear; in contrast, with the cloud there is a concern that engineering teams will be unable to respond to challenges.

To deliver IP feeds to multiple locations, it is important to decouple ownership of the outputs from responsibility for input provision and rights assignment. This shares the responsibility for the provision and switching requirements, in the same way that traditional satellite and video switches operate. The content owner can create their own broadcast-grade IP environment, to deliver and monitor feeds, as well as assign rights to receive to other organisations. Affiliates can also create their own destinations, if necessary, and this ensures control over the delivery of the feeds. Due to the separation of source and destination responsibilities, it’s possible for a broadcast operator to ‘manage’ the distribution of feeds to multiple locations through a central platform. This is how broadcast-grade IP optimises workflows at scale, even with different delivery formats. Once the framework is there, these workflows can become just as familiar and repeatable as traditional ones.

Managing resources and optimising for sustainability

It is important to look holistically at the content supply chain. Each organisation must take responsibility for its own impacts and reduce wherever possible, both on the vendor and the broadcaster’s side. From a vendor perspective, there is a responsibility to recommend the infrastructure that will work optimally for multiple use cases, rather than selling proprietary solutions.

Using a self-serve approach to contribution and distribution allows content owners to become architects of their own content delivery, and paves the way for more sustainable ways of working. Once the infrastructure is set up, it is the flexibility of the model that offers the most benefits. With IaaS there is the option to match infrastructure usage to requirements: broadcasters can significantly reduce wasted bandwidth and stop cloud infrastructure sitting idle. As technology evolves, it will facilitate a more sustainable cloud-based model for the industry. It stands to reason that on-demand cloud solutions are more sustainable than traditional on-premise hardware, because organisations run less hardware and only utilise resources when needed, making operations more efficient.
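The energy case for matching infrastructure to requirements can be sketched with simple arithmetic. The power draw and usage hours below are hypothetical figures chosen purely for illustration, not measurements of any real deployment:

```python
# Sketch: energy saved by spinning infrastructure up only when needed,
# versus running dedicated hardware 24/7. All figures are hypothetical.

server_power_kw = 0.5          # hypothetical average draw of one encoding host
hours_per_week_needed = 20     # e.g. live events and scheduled playout only
hours_per_week = 24 * 7        # an always-on host runs the full 168 hours

always_on_kwh = server_power_kw * hours_per_week
on_demand_kwh = server_power_kw * hours_per_week_needed

savings_pct = 100 * (1 - on_demand_kwh / always_on_kwh)
print(f"Weekly energy: always-on {always_on_kwh:.0f} kWh, "
      f"on-demand {on_demand_kwh:.0f} kWh ({savings_pct:.0f}% less)")
```

The point of the sketch is the shape of the saving, not the exact numbers: any workload that is only needed for a fraction of the week wastes energy in rough proportion to its idle hours when run on dedicated, always-on hardware.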

There is, of course, a lot less physical movement of people and hardware around the globe when using the cloud. This made cloud broadcast workflows an optimal solution during the pandemic, and it also demonstrated how a huge reduction in the industry’s carbon footprint could be achieved. However, at the moment, reporting and quantifying the exact carbon footprint of those workflows is challenging.

The major cloud service providers have developed some high-level tools for measuring carbon use and impact. Unfortunately, the data isn’t presented in a consistent way, so it needs to be interpreted, and interpretations can obviously differ. In time, with further granularity of usage data, companies can start to get into the detail of fully optimising cloud resources for broadcast.
 

An ongoing journey

Where sustainability is concerned, the industry is currently on a journey which the cloud can help to facilitate. Initially, a company's actions should be to identify impact and make some ‘quick win’ changes, but as data consistency improves we can push the expectations of what the cloud can facilitate much further.

It would be idealistic to say that sustainability is at the forefront of every media organisation’s business plan. Decisions on which suppliers to work with are primarily made on the basis of cost and technology, so it makes sense to get those things right first, and then integrate sustainability calculation features. But the huge benefit of the cloud for broadcast is the agile model, which responds to the varying requirements of different organisations. As long as a cloud- and protocol-agnostic framework underpins the workflows, the technology can evolve with the industry. This concept of making incremental improvements, without ripping out the hardware and starting again each time, is certainly going to have more impact than making sustainability the primary selling point above capability and cost.

Like the transition towards cloud-based operations, sustainability is an ongoing project that many media businesses are only just starting out on. In both cases, it is crucial to make sensible choices about infrastructure, so that it is adaptable and sustainable for the long term. In fact, the two things go hand in hand: broadcast organisations originally looking to reduce operational costs arrived at IP infrastructure to distribute and contribute content globally. It was a natural progression. Therefore, it seems likely that companies looking to maximise efficiency in the cloud will ultimately arrive at more sustainable workflows.

There are obvious benefits to a cloud-based model over traditional workflows, but cloud-native workflows will develop this even further. The industry is heading towards a future where organisations needn’t be tethered to hardware; instead, broadcast workflows can run on virtual machines in the cloud. Cloud-native workflows still use hardware somewhere, and a machine is still generating data and consuming power somewhere. But by optimising the approach, we have the ability to utilise and consume resources only when they’re needed. Then, as organisations consume, we should work to make the right choices for the environment. This echoes the domestic sentiment around energy consumption: it’s great to get your power from an eco-friendly source – but it’s even better to turn the light off when you’re not using it.

Broadpeak – Why Collaboration Between Content Providers and ISPs is the Path to Sustainable Streaming


Damien Sterkers,

Video Solutions Marketing Director, Broadpeak


Streaming is progressively replacing broadcast as the primary form of video content distribution, driven in large part by the success of global over-the-top (OTT) video platforms such as Netflix, Disney Plus, and Amazon Prime. Grand View Research predicts that the value of the global video streaming market will grow from $59.14 billion in 2021 to $330.51 billion in 2030, at a compound annual growth rate of 21.3% over the forecast period.

A significant technical challenge with streaming is that it implies one-to-one connections from each client to the network servers, whereas broadcast distributes the same video signal to all users (see Figure 1). In other words, 1 million viewers watching a hugely popular sports event requires 1 million physical replicas of the same content on the network, compared with only one for broadcast.

Figure 1. A comparison of broadcast and streaming distribution
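The scale of this duplication effect can be shown with some back-of-the-envelope arithmetic. The sketch below is illustrative only; the viewer count comes from the example above, while the 8 Mbps bitrate is an assumed figure for a typical HD ABR top profile, not a measurement.

```python
# Back-of-the-envelope comparison of unicast (one-to-one) streaming
# versus broadcast (one-to-many) delivery for a popular live event.
# The bitrate figure is an illustrative assumption.

def network_streams(viewers: int, unicast: bool) -> int:
    """Number of distinct stream replicas the network must carry."""
    return viewers if unicast else 1

def aggregate_bandwidth_gbps(viewers: int, bitrate_mbps: float,
                             unicast: bool) -> float:
    """Total distribution bandwidth across the network, in Gbps."""
    return network_streams(viewers, unicast) * bitrate_mbps / 1000

viewers = 1_000_000
bitrate = 8.0  # Mbps per stream (assumed HD profile)

print(network_streams(viewers, unicast=True))    # 1000000 replicas
print(network_streams(viewers, unicast=False))   # 1 replica
print(aggregate_bandwidth_gbps(viewers, bitrate, unicast=True))   # 8000.0
print(aggregate_bandwidth_gbps(viewers, bitrate, unicast=False))  # 0.008
```

Under these assumptions the network carries six orders of magnitude more traffic for unicast streaming than for broadcast, which is the gap that edge caching and multicast ABR (discussed below) aim to close.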

As a consequence, video traffic has been growing exponentially. The popularity of streaming creates additional load on networks and requires the deployment of new network infrastructure, increasing energy consumption.

Strategies for Mitigating the Energy Expenditure of Video Traffic Growth

Some will argue that increasing hardware performance will mitigate the carbon footprint of video delivery despite growing traffic volumes. But this assumption is risky, and in any case, improving hardware performance alone falls far short of the environmental goals set by the IPCC and the majority of governments. Regulations have been put in place not merely to stabilize the carbon footprint but to decrease it dramatically in the coming years. (See Figure 2.)

Figure 2. Global CO2 and CH4 emission trends, courtesy of Carbon Monitor

The good news is that the amount of energy used to deliver streaming services can be reduced. There are three key ways to make video delivery more energy efficient while leveraging existing infrastructure:

  • First, third parties, including content providers, can use the ISPs’ networks to decrease the need for new infrastructure and move delivery closer to end users (which also improves QoE). Ultimately, collaboration between ISPs and content providers can result in more revenues for ISPs, allowing them to maintain their network and improve their energy efficiency.
  • Second, content providers can stream video via multicast ABR. With this approach, only one stream is delivered over the network to address millions of viewers, compared with one stream per viewer in a traditional ABR delivery scenario.
  • Third, operators can continuously optimize the integration of the CDN software on hardware, with the objective to reduce power consumption for the same streaming throughput.

Ultimately, relying on incremental optimization of network components won’t be sufficient for what is at stake. Industry professionals need more data and standardized practices to be able to speak the same language, work together, and develop best practices. Clearly, this requires coordination across the ecosystem. If streaming platforms, telcos, and their technology providers are committed to reducing their environmental impact, they need to work together to find new ways to deliver video.

How Content Providers and ISPs Can Collaborate Via Edge Caching

A good starting point of collaboration between content providers and ISPs is to minimize the duplication effect mentioned previously and ensure that the same program is never replicated unnecessarily across networks. This is a problem that ISPs have already addressed with a simple solution that could easily be extended to third-party content: edge caching.

The principle of edge caching is to send content only once and to cache popular content deep in the ISP network so that duplication and streaming are done as close as possible to the end user. This allows for dramatic savings in network infrastructure (see Figure 3), which consequently reduces the environmental impact. Ultimately, it’s a win-win situation for everyone: end users can enjoy better streaming quality, which benefits both the content provider and the ISP. 

Figure 3. Video content delivery pathways with and without edge caching
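The effect of edge caching on upstream traffic can be illustrated with a toy simulation. Everything here is an assumption for illustration: the request pattern, the catalogue of 50 popular titles, and the simplifying premise of an unbounded cache (real edge caches have eviction policies such as LRU).

```python
import random

# Toy simulation of edge caching: with a cache at the ISP edge, each
# distinct piece of content crosses the upstream network only once;
# every subsequent request is served locally at the edge.

def origin_fetches(requests, cache_enabled):
    """Count how many requests must travel upstream to the origin."""
    cache = set()
    fetches = 0
    for content_id in requests:
        if cache_enabled and content_id in cache:
            continue  # cache hit: served from the edge, no upstream traffic
        fetches += 1  # cache miss (or no cache): fetch from origin
        if cache_enabled:
            cache.add(content_id)
    return fetches

random.seed(42)
# 10,000 viewer requests concentrated on a catalogue of 50 popular titles
requests = [random.randint(1, 50) for _ in range(10_000)]

print(origin_fetches(requests, cache_enabled=False))  # 10000 upstream fetches
print(origin_fetches(requests, cache_enabled=True))   # one per distinct title
```

Even this crude model shows why caching popular content deep in the ISP network pays off: the more concentrated the demand, the closer upstream traffic falls towards one fetch per title.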

Moreover, external streaming content making use of ISP caches can, at the same time, take advantage of all the software optimization that ISPs have been implementing to distribute their own content. ISPs were offering video services long before OTT platforms gained prominence. Content providers could leverage ISPs’ delivery experience to their own advantage, improving environmental sustainability.

One good example of such an optimization is for content providers to leverage the IP multicast capability of ISPs whenever available. Thanks to multicast ABR (MABR), OTT ABR streams can transit through the network in the same one-to-many mode as broadcast and be delivered as ABR directly in the home network via a dedicated conversion process. The conversion typically takes place in an IP gateway or a set-top box. (See Figure 4.)

Figure 4. Video content delivery pathway with MABR caching

Another example is caching elasticity — namely, the ability for ISPs to progressively push the edge of their network further and dynamically adapt cache instances on the actual streaming demand from end users. This evolution is particularly relevant for mobile networks and has been popularized by the Multi-Access Edge Computing (MEC) initiative developed in the 5G standard.
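The idea of caching elasticity can be sketched as a simple scaling rule: size the cache fleet to current demand rather than keeping a peak-sized fleet running around the clock. The per-instance throughput figure and the demand curve below are illustrative assumptions, not vendor specifications.

```python
import math

# Sketch of caching elasticity: scale the number of edge cache
# instances with observed streaming demand, instead of provisioning
# for peak demand at all times.

INSTANCE_CAPACITY_GBPS = 40  # throughput one cache instance serves (assumed)

def instances_needed(demand_gbps: float) -> int:
    """Minimum instances for the current demand (at least one kept warm)."""
    return max(1, math.ceil(demand_gbps / INSTANCE_CAPACITY_GBPS))

# Demand samples over a day, peaking in prime time (illustrative, Gbps)
hourly_demand = [20, 10, 5, 5, 10, 40, 80, 120, 200, 320, 260, 90]

elastic = [instances_needed(d) for d in hourly_demand]
static_peak = max(elastic) * len(hourly_demand)  # always-on, peak-sized fleet

print(elastic)                    # instances running each hour
print(sum(elastic), static_peak)  # instance-hours: elastic vs static
```

In this sketch the elastic fleet consumes roughly a third of the instance-hours of a statically peak-provisioned one; the same demand-following principle underlies the MEC approach in mobile networks mentioned above.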

The concept of ISPs sharing their video distribution with third-party streaming content providers — often referred to as “Open CDN” — may seem technically straightforward, but it can be complex in terms of defining the relationship between the two entities:

  • If a content provider has an international offering, it will likely want to work with a few different ISPs to establish a relevant footprint. Conversely, ISPs with such capability will want to onboard many content providers to rationalize their investment. The technical interface between them must be as simple to implement as possible to ensure scalability. Several technical tools are being developed with that purpose in mind. The SVTA Open Caching standard is expected to play an important role in setting up a common technical framework.
  • Content providers and ISPs have only recently started discussing business agreements, and it will take time before the topic matures enough to reach a consensus on who provides which service and at what price. That said, content providers today have well-established commercial models for content delivery with public CDNs, and Open CDN could use the same model as a starting point of reference, potentially accelerating its adoption.

Conclusion

Looking at the current situation, the potential for improvement is huge, and caching all streaming content deep in ISP networks is one of the most obvious approaches to start with. If video streaming stakeholders want to meet even the minimal environmental targets, it is inevitable that they will need to start collaborating.

Of course, collaboration will require content providers and ISPs to develop tools and practices to make their interactions as simple and scalable as possible. The outlook for that happening is optimistic, given that everyone would benefit: the content provider will realize better service quality, the ISP will optimize infrastructure costs, and the planet will be greener since less network equipment will need to be deployed.