cloudSwXtch – NAB Show 2023 BaM Award Winner – Connect

cloudSwXtch is a virtual overlay network that brings on-premises features, including multicast distribution, to the cloud. Deployable as a software solution within a physical broadcast network or a cloud tenant, cloudSwXtch helps broadcasters and video service providers merge on-premises and cloud networks, migrate demanding media workloads, and establish mesh configurations to create global networks. cloudSwXtch unlocks missing network features that are required for demanding, high-throughput workflows in broadcast and media applications, including multicast, broadcast, packet monitoring, network path redundancy, and protocol conversion and fanout.

cloudSwXtch is the first solution for merging on-premises and cloud workflows around the world, empowering customers with previously unavailable cloud features to migrate demanding media workflows. As a result, cloudSwXtch addresses inherent hurdles in conventional cloud architectures with problem-solving features. With its multicast support, cloudSwXtch eliminates the cumbersome network reconfigurations traditionally required and allows users to move network packets very efficiently. swXtch.io has also added PTP clock synchronization to cloudSwXtch to improve cloud networking features. PTP enables synchronization of video and audio sources in cloud networks, and ensures that data is received and processed in the correct order. Neither multicast nor PTP was previously available on public cloud networks.

cloudSwXtch also makes it easy to interconnect a wide range of interface protocols through its protocol conversion and fanout capability. For example, cloudSwXtch can seamlessly translate between UDP Multicast, UDP Unicast, SRT, and other protocols, allowing endpoints with different protocols to interact on the same network with no configuration or management. Along with SMPTE 2022-7 network path redundancy (hitless merge) for high availability, full support for uncompressed SMPTE ST 2110 workflows, and dynamic ground-to-cloud and cloud-to-cloud bridging, cloudSwXtch genuinely breaks new ground by making these important and valuable features available on public cloud networks.
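The fanout half of that capability amounts to duplicating each inbound packet to every subscribed endpoint. A minimal in-process sketch of the pattern, with queues standing in for real sockets and protocols – none of the names here are taken from cloudSwXtch's API:

```python
from queue import Queue

def fanout(packet: bytes, subscribers: list[Queue]) -> None:
    """Duplicate one inbound packet to every subscriber, multicast-style."""
    for q in subscribers:
        q.put(packet)

# Three hypothetical endpoints, each of which would normally speak its own
# protocol (UDP unicast, SRT, ...); all receive the same payload.
subs = [Queue() for _ in range(3)]
fanout(b"video-chunk-0001", subs)
received = [q.get() for q in subs]
print(received)
```

In a real deployment each queue would be a network egress in the endpoint's native protocol; the duplication pattern, not the transport, is the point of the sketch.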

cloudSwXtch breaks new ground in cloud innovation within an industry that remains hesitant or uncertain about its cloud migration strategies due to both business and technology barriers. cloudSwXtch opens doors for broadcast and media organizations that were previously prevented from migrating media workflows to the cloud, as cloud engineers generally favor developing solutions for more general-purpose problems than those the broadcast and media industry needs solved.

cloudSwXtch’s virtual architecture solves this problem by bringing bare-metal parity to cloud networks that broadcasters can build upon, beginning with the features that matter most and adding new features as required. cloudSwXtch’s virtual architecture also establishes a long-term development platform for broadcasters, video producers and cloud networks such as Amazon Web Services, GCP, OCI, and Microsoft Azure – and can even create workflows across multiple disparate clouds for higher availability. cloudSwXtch is available in several variations based on initial customer needs and can be scaled and built upon for years to come.

qibb – NAB 2023 BaM Award winner – Manage

qibb – the integration platform

qibb is the leading media integration platform for creating and maintaining low-code media workflows. By consolidating numerous tools and services into a unified environment, it empowers users to build and maintain integrations more cost-effectively, faster, and independently of vendors – making media workflow automation and integration easier than ever before.

At the core of qibb lies the flow editor, which serves as the central hub for designing and managing media workflows. Flows can be created simply by linking nodes (connectors) using drag and drop. In each node, a user configures the operation to be performed by the integrated application, making it quick to build automated workflows. Over 60 integrations, from media asset management and transcoding to playout and streaming, enable the automation of workflows along the entire media supply chain. qibb’s comprehensive catalog of pre-built flows (templates) further provides a wide range of workflows for common use cases and allows users to get started right away. The modular dashboard system of qibb enables fast configuration of interactive user interfaces to visualize data, input metadata, or trigger workflows manually.

One of the major USPs of qibb lies in its harmonized low-code API standard. The continuously growing catalog of integrations can thus be seamlessly displayed and customized in a single environment. Thanks to the drag-and-drop philosophy, even users without a programming background can be quickly onboarded. Not only can they build their own flows, but they can also flexibly make necessary adjustments in the future. This reduces dependencies on both external service providers and internal departments and increases agility and efficiency, keeping control where it needs to be.

The unified standard provides another major advantage: independence from vendors and products. Tools can be easily replaced as needed without interrupting or having to fundamentally reconfigure workflows. This flexibility prevents silos from forming and allows integrators to use the most appropriate services for each workflow component.

qibb’s ambition to constantly drive development and stay one step ahead is exemplified by integrations of cutting-edge technologies such as Generative AI in the form of OpenAI’s ChatGPT. One of the most popular qibb workflows combines ChatGPT with a media asset management system, a transcription tool, and social media platforms such as LinkedIn or Twitter: the moment a new video is uploaded to the MAM, the workflow is automatically triggered to generate a transcript of the audio stream. This is processed by ChatGPT into a summary of any length, from which texts and comments are then generated for the connected social media platforms and, if desired, posted automatically. No matter which new technology emerges in the future, qibb can provide the ideal platform to benefit from it right away – fully integrated into each user’s existing system landscape.
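That trigger-and-chain pattern can be sketched with plain functions. The stubs below stand in for the MAM webhook, the transcription tool, ChatGPT and the social platforms; none of the names are qibb's actual API:

```python
# Hypothetical stubs, for illustration only.
def transcribe(video_id: str) -> str:
    return f"transcript of {video_id}"          # transcription tool stand-in

def summarize(transcript: str, max_words: int) -> str:
    return " ".join(transcript.split()[:max_words])  # ChatGPT stand-in

def post(platform: str, text: str) -> str:
    return f"[{platform}] {text}"               # social platform stand-in

def on_video_uploaded(video_id: str) -> list[str]:
    """Triggered when a new asset lands in the MAM."""
    summary = summarize(transcribe(video_id), max_words=3)
    return [post(p, summary) for p in ("LinkedIn", "Twitter")]

posts = on_video_uploaded("clip-42")
print(posts)
```

The value of the low-code approach is that each stub above corresponds to a configurable node rather than hand-written code.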

The qibb media integration platform sets new standards for low-code workflow creation and management. With its user-friendly interface, comprehensive integrations, and flexibility, qibb empowers users to streamline their media workflows, increase efficiency, and achieve greater control over their processes. By embracing cutting-edge technologies and offering vendor independence, qibb paves the way for the future of media integration platforms.

Marquis – NAB 2023 BaM Award winner – Project, Collaboration or Event

Sinclair transforms news gathering through new production partnership
with Avid, Sony and Marquis

New and innovative cloud-based workflow uses Avid MediaCentral, Sony Electronics’ C3 Portal gateway service and Marquis integration to get news to air – with all associated metadata – faster than ever

The Marquis team at NAB Show 2023

Sinclair is deploying and refining an innovative new cloud-based news and content gathering and production workflow that combines technologies from Avid, Sony and Marquis Broadcast. This novel method of news sequence relinking – now in operation at six Sinclair stations – has accelerated 5G news acquisition by reducing the volume of high-resolution video that field journalists transfer to news production centres. By leveraging 5G networks, Sinclair is limiting acquisition cost increases, creating a continuously enriched stream of news metadata and accelerating the speed of its news operations.

Sinclair’s strategy was to democratise news acquisition, with every journalist or photographer having fast content delivery to news production centres. This transformation required a collaborative approach to streamline the workflow, and Sinclair engaged with Avid on its MediaCentral news production platform, Sony on cameras and C3 Portal, and Marquis to enable the complex workflow integrations. Since the workflow was implemented earlier this year, Sinclair has realised tangible business and operational benefits, with fast story turnaround and structured metadata automatically delivered to newsrooms. Its newsrooms can cut several versions of the same story to deliver to digital platforms faster and with improved relevance.

Fundamental to the news workflow is the automatic preservation and enrichment of metadata from the point of origination. The Sony C3 Portal supports pushing metadata to the camera before shooting commences – for example, iNews slug information and assignment data – preserving this metadata throughout the production process when the story checks into Avid MediaCentral. Once a shot is complete, the proxy video is automatically transferred via 5G to the Sony C3 Portal. Marquis polls the C3 Portal for new video and automatically transfers the new proxy, rewrapping the file into Avid OP-Atom. The file is written into the Avid NEXIS and the asset is imported into MediaCentral, placing the file into the correct story location.
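The polling step described above can be sketched as a simple loop. The portal is modelled as an in-memory dict, and every name here is hypothetical rather than the real C3 Portal or Marquis API:

```python
# Toy model: the portal holds proxy files; Marquis polls for new arrivals,
# rewraps each one and hands it on for ingest.
portal = {"story-101.mp4": b"proxy-bytes"}
seen: set[str] = set()
ingested: list[str] = []

def rewrap_to_op_atom(name: str) -> str:
    return name.rsplit(".", 1)[0] + ".mxf"   # stand-in for the OP-Atom rewrap

def poll_once() -> None:
    for name in portal:
        if name not in seen:
            seen.add(name)
            # In the real workflow: write to Avid NEXIS, check into MediaCentral.
            ingested.append(rewrap_to_op_atom(name))

poll_once()
print(ingested)
```

The `seen` set is what makes repeated polling safe: a second `poll_once()` transfers nothing new.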

Once the edit is complete, the editor can request the high-res content via the Marquis sequence relinking process. This automatically extracts partial segments from the high-res file in the Sony C3 Portal, ensuring only the required content in the sequence is brought into Avid. Assuming a 10:1 shooting ratio, this partial extraction process results in a 90% reduction in 5G data transferred and network bandwidth required. Using the dynamic relink feature in Avid, the high-res content simply drops back into the timeline and automatically substitutes the low-res content. The Marquis integration enables the ‘edit while record’ feature in Avid Media Composer, so stories can be edited as soon as the proxy file starts ingesting. And for IBC 2023, the process will also support restoring partials directly from the camera via 5G or from the camera card.
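The arithmetic behind the 90% figure is straightforward; a hypothetical helper (not a Marquis API) makes it explicit:

```python
def transfer_saving(shot_seconds: float, used_seconds: float) -> float:
    """Fraction of high-res transfer avoided by pulling only the used segments."""
    return 1.0 - used_seconds / shot_seconds

# A 10:1 shooting ratio: 10 minutes shot, 1 minute used in the final sequence.
print(f"{transfer_saving(600, 60):.0%}")
```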

This ‘sequence relinking’ enables a fundamental change in news acquisition by seamlessly combining Sony news acquisition with Avid news production technology. The ‘as-shot’ content and metadata is automatically pushed to the news production centre in a carefully structured way, enabling stories to be edited and versioned for more markets, hours faster than would otherwise be possible. This workflow was achieved through Sinclair’s vision and Marquis’ close co-operation with Avid and Sony.

Aximmetry Technologies 3D cleanplate feature – NAB 2023 BaM Award winner – Create

The essence of virtual production is to present a perfect illusion of reality. The Aximmetry virtual production platform has to create a close-to-perfect virtual environment in real time to immerse viewers in the virtual world. To preserve this illusion of reality for the audience, the 3D background has to move in sync with the real camera. This is why most advanced production facilities use camera tracking today, and why the Aximmetry software is compatible with all camera tracking solutions.

To further perfect the illusion of reality, Aximmetry’s software-based chroma keyer achieves seamless integration of real and virtual environments with the interaction of virtual lights and real-world objects. Aximmetry can cast virtual shadows on the talent and combine these seamlessly with the life-like shadows generated by the talent itself.

A clean plate is an image that contains only the green screen, not the subject(s) of the production. Traditional clean plate technology only works with fixed cameras. Evenly lighting a large-scale green screen often poses a challenge, and keying quality suffers when the camera moves. Until now, productions either had to keep the camera fixed or simply compromise on keying quality.

There are no longer any obstacles to creating a complete illusion for the audience. Aximmetry’s new 3D clean plate feature is designed to achieve perfect keying results even with a moving camera. It works in two phases:

Preparation phase: Users create a model of their green screen and set it up in Aximmetry.

Production phase: After setting the lights, users sweep the camera over the green screen. From this process, Aximmetry generates a virtual map by simply and quickly recording multiple images of the green screen, session by session.

Using this model and the virtual map during production, Aximmetry creates a perfect clean plate for every angle.
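The two phases can be pictured as building, then querying, a lookup keyed by camera angle. This is a conceptual toy with made-up angles and plate names, not Aximmetry's actual data model:

```python
# Preparation phase: sweep the camera, recording a plate per pan angle.
virtual_map: dict[int, str] = {}

def record_sweep(samples: list[tuple[float, str]]) -> None:
    for pan_deg, plate in samples:
        virtual_map[round(pan_deg)] = plate

# Production phase: for the current camera angle, fetch the nearest plate.
def clean_plate(pan_deg: float) -> str:
    nearest = min(virtual_map, key=lambda a: abs(a - pan_deg))
    return virtual_map[nearest]

record_sweep([(0.0, "plate-centre"), (15.0, "plate-left"), (-15.0, "plate-right")])
print(clean_plate(13.2))
```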

Aximmetry is committed to keeping its software and services accessible for all creators and remains at the forefront of virtual production development. Aximmetry Chroma Keyer is available in all software editions (prices starting from EUR 3 / month). The new 3D clean plate feature will be included in all Aximmetry Broadcast editions after the next release (prices starting from EUR 490 / month or EUR 49 / day), expected in Q2 2023.

 

About Aximmetry:

Aximmetry DE software editions with complete Unreal Engine integration are tried and tested by tens of thousands of users around the world, from indie content creators to large production houses using multiple licenses for complex productions.

Aximmetry is the studio glue that holds these productions together, easily interfacing with the multitude of hardware used in the broadcast environment. There are various input and output possibilities: LED screens, tracking systems, numerous control devices and APIs, to name a few. Its superior-quality live compositing, flexible scripting tools and user-friendly features for real-time pre- and post-processing of the rendered image make Aximmetry a great facilitator of all virtual productions.

Accedo Xtend – NAB 2023 BaM Award winner – Consume

The metaverse has the potential to change how we approach everything, from gaming to fitness, sports, socializing, education, training, shopping, travel and even healthcare. XR experiences hold massive appeal, especially for live events and sports. Fans will no longer simply view the concert or game on a screen, but instead will be virtually present at the event, interacting with other fans. At Accedo, we believe video will have a huge role to play in driving adoption and delivering improved experiences to consumers.

XR is making the metaverse more accessible thanks to the growing availability and decreasing cost of XR devices. Accedo’s recent acquisition of eyecandylab will help bring XR to the mass market through the Accedo Xtend suite, designed for sports organisations and operators and compatible with any smartphone on the market.

Building on Accedo’s extensive immersive experience, the Accedo Xtend suite opens up new opportunities for broadcasters to engage consumers with a branded, immersive XR experience without needing to completely redesign their offering. With this solution, Accedo is making XR video experiences accessible, helping content providers to deliver a whole new level of interaction, open up monetization opportunities, and experiment with what appeals most to their audiences. No-one yet knows what will or won’t work in the metaverse. The only way to discover that is through experimentation with existing content and audiences.

The Accedo Xtend suite brings video apps to virtual reality (VR) and augmented reality (AR) devices, enabling media organizations to create new metaverse experiences simply by embedding existing web-based applications into a metaverse environment.

The solution is modular, so rights holders and broadcasters can choose only the add-ons they require, such as multicamera, live stats, or co-watching using extended reality (XR) powered avatars. A pay-as-you-grow model allows organizations to get started in the metaverse easily and cost-efficiently, with a solution that enables them to test the waters while building an optimal strategy for monetization and end-user engagement.

The solution is customizable, meaning sports organizations and operators can create a unique experience and strengthen their relationship with audiences, partners, and sponsors; something especially valuable in the sports space.

The Xtend suite can be easily integrated with any existing vendors and creates new ways for consumers to engage with brands beyond video, monetizable through the creation of sponsored interactive experiences or ad placements. The multiplatform solution works with the most common VR and AR headsets on the market, while ensuring a consistent and seamless user experience. The SDK developed by eyecandylab allows synchronization of TVs with AR glasses or smartphones, creating a layer of seamless interaction between the content and the video fan.

For more information about Accedo Xtend, get in touch and book a demo.

Standards – What’s happening with Technology Standards?

Paul Treleaven, IABM Technology Specialist

In these update articles, I often launch directly into topics related to the SMPTE and AES meetings that the IABM attends. However, numerous Standards bodies have some impact on our industry, so I have put together a table showing how the picture fits together.

Later in this article we also describe some other organizations that contribute valuable standards-like work to our industry.

Standards bodies, acronyms, and their main relevance to IABM members:

  • AES – Audio Engineering Society – Audio: interfaces, streaming, metadata, file format
  • IEC TC 100 – International Electrotechnical Commission – AV systems, though tending to be domestic rather than professional
  • IETF – Internet Engineering Task Force – IPv4, IPv6 and multiple documents on IP transport of media (inc. SDP, SIP)
  • ITU-R – International Telecommunication Union, Radiocommunication Sector – Multiple video formats (SD, HD, UHD); often parallel SMPTE docs
  • ITU-T – International Telecommunication Union, Telecommunication Standardization Sector – Compression standards (jointly with ISO-IEC); fundamental data networks
  • ISO-IEC – Joint body of the International Organization for Standardization and the IEC – Compression standards (jointly with ITU-T), including MPEG and JPEG
  • SMPTE – Society of Motion Picture and Television Engineers – Video formats, compression, metadata, file formats, network architecture, systems, mastering

Are Standards Relevant to Software-based Media Products?

This question is prompted by the software landscape, where individual developers get an idea for an app and design it to work exactly as they want it, with no constraints. Those developers expect to find everything they need through online searches – even when their app requires a Standard, they are used to the Internet Engineering Task Force (IETF) paradigm of free specifications. The paywall to obtain Standards is therefore increasingly acknowledged as a problem by Standards Development Organizations, and some are looking for ways to make their documents freely available.

However, returning to the question: for the media industry, Standards are every bit as relevant to software implementations as they were to hardware. They are just different – no longer custom interfaces over BNCs, but network interfaces over IP, filesystems and workflow products. Standards have encouraged customers to embrace new technologies – e.g. IP with ST 2110, streamlined mastering with IMF, and new content formats and their mapping into MXF.

Have Standards Meetings Recovered from Covid yet?

Standards Development Organizations adapted quickly to online meetings via Zoom or Teams. Mostly these worked well, though as these are international organizations, meeting times in some timezones were painful. Online meetings were encouraged to be brief to reduce screen-time fatigue, but that also discouraged discussion of issues that needed attention, and I noticed that people put less effort into preparing documents and attending to detail on status reports for “just another Zoom call”.

SMPTE, MPEG and IEC have returned to in-person meetings with some online sessions; a good balance. AES has stayed with online meetings – at least for the time being.

Standards Update – AES

Much AES standards work concerns audio fields that are only marginally relevant to Broadcast and Media, such as acoustics and forensics. Some activities on interfaces have been very relevant in the past (AES3, MADI, connectors), but there is no important new work. The following network-based work items are relevant:

Streaming Audio Metadata over IP networks

This project defines a standardized method for transporting metadata associated with an AES67 stream’s audio content. The audio metadata is transported in a separate stream that is sent in parallel to, and synchronized with, the AES67 stream. This Standard will use SMPTE Fast Metadata (ST 2110-41, well-advanced through standardization) for transport and this AES project is expected to be completed this year.
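The parallel-stream arrangement can be pictured as two timestamped flows that the receiver re-associates. A toy sketch with illustrative integer timestamps standing in for the real RTP timestamps (the payloads are invented for illustration):

```python
# Two parallel flows, keyed by a shared timestamp.
audio = {1000: "audio-frame-A", 2000: "audio-frame-B"}
metadata = {1000: "loudness=-23LUFS", 2000: "loudness=-22LUFS"}

# The receiver pairs each audio frame with the metadata carrying the
# matching timestamp.
paired = {ts: (audio[ts], metadata[ts]) for ts in audio if ts in metadata}
print(paired[2000])
```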

Revision: AES67 – High-performance streaming audio-over-IP interoperability

This standard was introduced in 2013, revised in 2018, and is now undergoing further revision, with publication expected this year. A big motivation for the revision was restructuring to make it easier to specify tests in the Protocol Implementation and Conformance annex. Changes are mostly extensions or clarifications so that backwards compatibility can be maintained, though a few changes to overcome network operation issues have been agreed.

Revision: AES70 – Open Control Architecture suite (and related documents)

The last published version of this three-part suite (2018) is undergoing substantial revision. The biggest change is a much-improved connection management process, though many other improvements are being introduced to support new “adaptation documents” that enable AES70 to control AES67, Avnu Alliance Milan streams and (very new) Audinate Dante® streams. An additional Part 4 of the suite will define a JSON protocol.

Standards Update – SMPTE

SMPTE’s Public CD process

This process has quickly become the norm for new standards. It exposes the document to the public at the Committee Draft (CD) stage before it has been balloted. The public becomes aware of the work much earlier than it would with the full publication process and it permits implementers to “test” the provisions and propose improvements on GitHub that will be considered for the document as it proceeds to publication.

At the time of writing there are 20 documents on the public CD page including 12 whose review period is closed.

SMPTE’s Rapid Industry Solutions (RIS)

This initiative recognizes the need for agile solutions to technology challenges. More details. The first RIS topic is On-Set Virtual Production, with an output document just released on Camera and Lens Metadata. At NAB 2023, a second RIS project was launched by taking over the work of the Open Services Forum on Media Microservices.

Media over Managed IP Networks

The ST 2110 transport suite is approaching completion, with 10 parts published and another on the point of publication. Details on published parts. New parts in development are:

  • Fast Metadata (FMX)
  • Timing Planes (using timestamp features to automate computation of chain latency)
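The Timing Planes idea can be pictured with a toy example: if each device stamps the media on the way through, chain latency falls out by subtraction. The timestamps here are illustrative seconds, not real RTP/PTP values:

```python
def chain_latency(stamps: list[tuple[str, float]]) -> float:
    """Latency from the first to the last hop of a processing chain."""
    return stamps[-1][1] - stamps[0][1]

# Hypothetical chain: each hop records when the frame passed through.
hops = [("camera", 0.000), ("switcher", 0.045), ("encoder", 0.120)]
print(f"{chain_latency(hops) * 1000:.0f} ms")
```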

Media Microservices

Two Microservices draft Standards are on the public CD page – “IMF Registration Service API” and “Microservice Status Reporting and Logging”. Development is underway on “Job Processing Architecture”. See the note above about this work moving into Rapid Industry Solutions.

Interoperable Mastering Format (IMF)

This suite of documents defines a method for simplifying mastering for generating multiple distribution formats. Much of this suite has long been published. Details on published parts.

New IMF development projects are underway on an IMF Application for VC-6 compression and an IMF Application for VC-3 compression, as well as new documents defining macros for Output Profile Lists.

UTC-Aligned ST 12 Timecode

This is a very new project that we have started to monitor, though some doubt has been expressed about the need for it.

Media Standards Work in Other Organizations

The Video Services Forum has published Technical Recommendations here on Reliable Internet Stream Transport (RIST), JPEG XS transport, ST 2110 over WAN, and a Pro-AV version of ST 2110 called IPMX.

The Advanced Media Workflow Association (AMWA) has an important – and growing – family of Networked Media Open Specifications (NMOS) that complement the SMPTE ST 2110 suite with these interoperability specifications:

  • Discovery & Registration
  • Device Connection Management
  • Event & Tally
  • Audio Channel Mapping
  • System Parameters
  • Authorization
  • Work-in-progress on: Stream Compatibility Management, Control Protocol, Annotation

Documentation here together with data models and Best Common Practices.

The Joint Task Force for Networked Media (JT-NM) has published “System Environment and Device Behaviors For SMPTE ST 2110 Media Nodes in Engineered Networks – Networks, Registration and Connection Management” – document TR 1001-1. It sets further requirements beyond ST 2110 and NMOS to achieve interoperability in media networks.
There is also a “JT-NM tested” program for validating products against ST 2110, NMOS and TR 1001-1. The last testing session was August 2022 – see explanations and catalogs here.

IABM Standards Resources

We welcome enquiries from members about Standards.

The IABM supports the development of standards that underpin the technologies used in Broadcast and Media. Much of that effort occurs “behind the scenes” – in SMPTE and AES online drafting groups where the standards take shape.

Standards Meeting Reports

We also attend AES and SMPTE meetings and produce reports covering all the projects that we think members may be interested in.

You can find all quarterly SMPTE reports here and all twice-yearly AES reports here – or click below for the latest (login required).

The IABM Standards Monitoring Group (SMG)

For this activity, we have selected SMPTE and AES project groups (often with members’ recommendations) and we participate in their teleconferences – typically held weekly or fortnightly. This provides an up-to-date picture of the state of their documents and development work.

Regulations – What’s all the PFAS about?

Nigel Burtt, IABM Technology Specialist

Twenty-three years ago, in June 2000, the European Commission published proposals for their WEEE and RoHS Directives; three years later they were agreed and entered into force, having a major impact on all industries. The RoHS Directive originally restricted six substances and effectively banned the use of lead, mercury, cadmium, hexavalent chromium and two types of flame retardants in most new electrical and electronic equipment sold after 1 July 2006. This had a major impact on our industry, mainly due to the need to use lead-free solder in the manufacture of electronic equipment. The Directive has been amended several times since; as of 2015, RoHS restricts ten substances, as does the equivalent post-Brexit UK law that replaces it. Very similar laws and regulations now also exist for other countries and their markets, such as Japan, South Korea, China, and India.

Alongside the RoHS Directive, there is Europe’s REACH regulation for the control of chemicals, which came into force in 2007. This maintains a ‘Candidate List’ of Substances of Very High Concern (SVHC) that may be subject to future restriction within the European Union and, at the time of writing, includes 233 different entries. Of these, 59 are already restricted, each with a final ‘sunset date’, and require special authorisation for any usage beyond that date. With updates made to the REACH lists every six months, this presents a difficult moving target for manufacturers and distributors who sell products in Europe, requiring continual re-assessment of the substances used in their manufacturing processes and within the components of those products.

This situation will soon become even more difficult, with restrictions being proposed on substances known as Perfluoroalkyl and Polyfluoroalkyl Substances (PFAS). PFAS are sometimes called “forever chemicals” because they do not naturally break down in the environment, so there is wide concern over the longevity of their polluting potential.

Some types of PFAS are already restricted by global and national laws and regulations, but the total number of different PFAS that are likely to become restricted is huge. The global Organisation for Economic Co-operation and Development (OECD) maintains a free MS-Excel database of PFAS which already includes nearly 5,000 different entries, but a new European Union proposal for future control of all PFAS usage within the EU under the REACH regulation, published in February this year, suggests as many as 10,000 types will be affected. Hence, the complexity of managing electrical and electronic equipment (EEE) supplier obligations relating to controlled substances, both reporting their usage and removing them from the supply chain, will likely increase by around an order of magnitude in the very near future.

As with RoHS, there are post-Brexit UK regulations that mirror the EU REACH rules, so the UK government has also been consulting stakeholders about its own possible response to PFAS usage and very recently published a report from the Health & Safety Executive (HSE) into the regulatory management options it may consider as part of the wider post-Brexit UK version of REACH. That said, the UK government is currently somewhat mired in its efforts to repeal EU legislation, such as REACH, for the completion of Brexit and the ‘Retained EU Law (Revocation and Reform) Bill’ (REUL) is not progressing as fast as it had hoped.

And it’s not just the EU and UK that are considering restrictions on PFAS – the HSE report referred to above identifies actions proposed or already being taken by the USA, Canada, Australia, Japan, and China. In the USA, the Environmental Protection Agency (EPA) has already taken action and there are proposals in play to go further. Various States already have rules in place without yet applying these to EEE products. However, most recently, at the beginning of 2023 the State of Maine introduced a law that includes a requirement to report any product sold, offered for sale, or distributed for sale in Maine which contains intentionally added PFAS. The requirement states that ‘…all products and product components sold in Maine for personal, residential, commercial, or industrial use are subject to this program. If a product is offered for sale in Maine for one of those purposes, the Manufacturer of the product must report the amount of PFAS in their product.’

The likelihood of these restrictions is already moving chemical suppliers to phase out PFAS and, where possible, to offer PFAS-free alternatives. For example, 3M have a dedicated website for PFAS and have announced that they will stop manufacturing substances containing PFAS, and remove any products affected from sale, by the end of 2025.

The problem with restricting and removing PFAS from the market is the wide range of end-use applications across many industries. The Royal Society of Chemistry published a paper in 2020, ‘An overview of the uses of per- and polyfluoroalkyl substances (PFAS)’, which summarises many of the main applications in its Appendix and highlights several which are vital to EEE manufacturers, for example:

  • fluxing agents in solder pastes;
  • printed circuit board (PCB) layer construction;
  • lithium batteries;
  • cable and wire insulation;
  • capacitor dielectric materials;
  • liquid crystal displays (LCD);
  • process chemistry for semiconductor device manufacturing;
  • flame retardants in plastic materials.

So, how do we prepare to meet these challenges to our business? Consultancy organisation RINA reminds us that many PFAS exist in mixtures and are not yet classified as hazardous by the Globally Harmonised System (GHS), so they are not covered by the national Classification, Labelling and Packaging (CLP) laws that implement it, and as such may well not appear on the Material Safety Data Sheets that we rely upon to identify problematic substances. RINA recommend the following actions:

  • Contact all your material and component part suppliers and ask them to identify any items in your supply chain which are known to contain PFAS;
  • Identify all uses of PFAS in your manufacturing process and in component items within your products, and start a programme for qualifying suitable alternatives wherever possible;
  • If you identify any PFAS usage that will still be required beyond 2026, engage directly, or via your trade association, with the European Chemicals Agency (ECHA) and prepare detailed information on the usage, risks, existing and future proposed control measures, and the availability, or lack thereof, of suitable substitutes;
  • Submit detailed socio-economic reasons for PFAS usage before September 2023 to the ECHA consultation process to request a possible future exemption or derogation;
  • Review and respond to ECHA’s Risk Assessment Committee (RAC) and Socio-Economic Analysis Committee (SEAC) when they publish their response to the consultation process, expected in Summer 2024.

Some industry trade bodies are already encouraging their membership to engage: for example, IPC have published an ‘Urgent Call for Action’ suggesting collaboration across the industry and its supply chain partners to understand what information on PFAS usage can reasonably be provided to the ECHA consultation before the September deadline. If this is something you would also like IABM to organise, please let us know as soon as possible. IPC do make an important point: if your usage of PFAS amounts to business-confidential information in some respects, you may not want to disclose this to a collaborative forum, and you should instead submit the required information directly to ECHA yourself.

RIST Forum – IP Standardisation and Real-World Restrictions

Suzana Brady, Chair, RIST Forum

The ongoing surge in content demand has forced the media industry to adapt its workflows and find efficient ways to deliver high-quality content at lower costs. The utilisation of IP for contribution and distribution offers evident advantages to users, and the technology is evolving all the time. IP has proven to be cost-effective, adaptable, and quick to set up. Many broadcasters are finding that resources are easier to manage, as broadcast environments can be spun up or down to meet capacity requirements.

Technology vendors have risen to the challenges faced by broadcasters and content owners and continue to deliver new IP features that streamline content delivery. These innovations have been implemented across a wide range of hardware and software solutions, offering more choice and flexibility. We are now moving into an interesting era for IP, where innovation and insights can be consolidated and leveraged to benefit the whole media industry.

Expectations and Challenges

In addition to the obvious advantages of affordability and flexibility, the integration of IP has brought several workflow benefits to media organisations. Many media companies have migrated their post-production and playout platforms to the cloud. This means workflows are becoming better connected throughout the chain, so there is less risk of content being siloed or arriving in an incorrect format. But transporting video over IP is not without its challenges, particularly when there are so many moving parts at play.

The consistent delivery of high-quality, broadcast-grade video can be challenging when using IP, mainly due to the variability of Internet performance. This becomes particularly problematic for contribution purposes when there is a lack of guaranteed packet delivery. Video streams, especially compressed ones, can be severely disrupted by lost data packets, and this causes significant problems in a broadcasting environment. Latency and packet loss were big concerns for early adopters of IP transport, but technical experts have made huge strides in resolving these issues.

Technical Leaps Forward

Visual and audio signal impairments, signal interruptions, and even the complete disruption of a stream can be caused by the loss of a single packet. But recent developments in packet-loss recovery have changed the way that the industry views this technology. The use of Automatic Repeat Request (ARQ) in multi-vendor scenarios provides selective retransmission of lost packets, so that any information lost in transport between sender and receiver can be recovered and latency kept to an absolute minimum.
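To make the mechanism concrete, here is a minimal sketch of NACK-based selective retransmission, the core idea behind ARQ as used by protocols such as RIST. The class and method names are illustrative assumptions, not any vendor’s implementation; real systems also bound the retransmission window by a configurable latency budget.

```python
# Sketch of selective retransmission (ARQ): the receiver NACKs only the
# specific sequence numbers it is missing, and the sender resends only those.

class ArqReceiver:
    def __init__(self):
        self.next_seq = 0          # next sequence number expected in order
        self.buffer = {}           # out-of-order packets awaiting delivery
        self.nacked = set()        # gaps we have already asked for

    def on_packet(self, seq, payload):
        """Accept one packet; return (in-order payloads, seqs to NACK)."""
        nacks = [s for s in range(self.next_seq, seq)
                 if s not in self.buffer and s not in self.nacked]
        self.nacked.update(nacks)
        self.buffer[seq] = payload
        delivered = []
        while self.next_seq in self.buffer:   # drain the contiguous run
            delivered.append(self.buffer.pop(self.next_seq))
            self.next_seq += 1
        return delivered, nacks


class ArqSender:
    def __init__(self):
        self.history = {}          # seq -> payload, kept for retransmission

    def send(self, seq, payload):
        self.history[seq] = payload
        return seq, payload        # stand-in for putting it on the wire

    def on_nack(self, seqs):
        """Retransmit only the packets the receiver reported missing."""
        return [(s, self.history[s]) for s in seqs if s in self.history]


# Example: packet 1 is lost in transit; the receiver NACKs exactly that
# sequence number and the sender retransmits just that one packet.
sender, receiver = ArqSender(), ArqReceiver()
packets = [sender.send(i, f"pkt{i}") for i in range(4)]
del packets[1]                              # simulate loss of seq 1

nack_list = []
for seq, payload in packets:
    delivered, nacks = receiver.on_packet(seq, payload)
    nack_list += nacks
for seq, payload in sender.on_nack(nack_list):  # selective retransmission
    receiver.on_packet(seq, payload)

assert receiver.next_seq == 4               # all four packets recovered
```

Because only the missing packets are requested and resent, the bandwidth cost of recovery stays proportional to the actual loss, which is why ARQ keeps latency so much lower than blanket retransmission.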

Point-to-multipoint and IP multicast enable broadcasters to efficiently deliver transmissions to multiple places. Bonding allows the user to combine multiple links in parallel to protect content delivery. When used in conjunction with seamless switching, users can send two or more copies of the stream over redundant IP links. This is particularly important for high-value content. It means that an interruption to one link doesn’t impact media delivery, because the content is mirrored on another stream.

The overall reliability of IP video transmission can also be improved using networking support for multiple ISPs. Retransmissions can then be sent on both ISPs’ connections to safeguard the reliability of a stream. The receiver combines both streams to produce a single stream, dynamically removing duplicates as they arrive. This is vital for the transport of gold-star content, such as tier-one live sports, and gives broadcasters additional peace of mind.
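The duplicate-removal step described above can be sketched as follows. This is a simplified batch illustration, not a real-time implementation: packets carry sequence numbers, whichever copy arrives first on either link is kept, and the other is discarded, so loss on one link is masked by the redundant link (the same principle as seamless switching in SMPTE ST 2022-7).

```python
# Sketch of merging two redundant links into one de-duplicated stream:
# first copy of each sequence number wins, duplicates are dropped.

def merge_streams(link_a, link_b):
    """Merge (seq, payload) packets from two redundant links, ordered by
    sequence number, keeping a single copy of each packet."""
    seen = set()
    merged = {}
    for seq, payload in link_a + link_b:      # arrival order per link
        if seq not in seen:                   # first copy wins
            seen.add(seq)
            merged[seq] = payload
    return [merged[s] for s in sorted(merged)]

# Link A loses packet 2 and link B loses packet 0, yet the merged
# output stream is complete.
link_a = [(0, "p0"), (1, "p1"), (3, "p3")]
link_b = [(1, "p1"), (2, "p2"), (3, "p3")]
assert merge_streams(link_a, link_b) == ["p0", "p1", "p2", "p3"]
```

A production receiver does this continuously with a small reordering buffer sized to the differential delay between the two paths, but the deduplication logic is essentially the same.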

Maintaining High Standards

So how has IP transport development made such significant progress in such a relatively short space of time? Standardisation has a huge part to play. By creating clear, open-source standards in the first instance, a bar for technical quality is set – and it is set very high. Vendors are then able to innovate with their own implementations and unique solutions, learning from each other during the process. This standards-based approach provides a platform for collaborative development work that a diverse group of technical experts can contribute to.

By joining forces to work on an open-source project, engineers collectively contribute hundreds of years of real-world experience to the specifications. Moreover, if the initial project is built on a foundation of established RFC standards, then the development will leverage a wealth of knowledge and lessons from previous sources. This collective approach to defining standards and innovating on them draws on more real-world experience than any single vendor could hope to provide.

Collaboration is the Key

The biggest challenges in IP broadcast occur when equipment and protocols from different vendors are incompatible. Unknowingly, some broadcasters have found themselves locked into restrictive technical frameworks, but this needn’t be the case. IP needs to offer a solution that is both secure and technically robust, yet adaptable to any application or use case and deployable in any environment. Standardisation allows broadcasters to mix and match solutions confidently. It enhances efficiency without compromising quality, and ensures minimal technical requirements with easier modifications.

Ensuring seamless integration and pre-emptively resolving potential issues are of paramount importance for broadcasters when working alongside their technology partners. The use of IP as a transport mechanism has become increasingly appealing, due to its efficiency and cost-effectiveness in handling large volumes of high-quality content. However, for reliable and broadcast-standard content transportation, it is essential that vendor solutions for video transmission and reception are interoperable. Encoders and decoders must communicate effectively by speaking the same language. Even if both ends claim compatibility, the absence of the appropriate broadcast-grade protocol can compromise the video quality.

During this process, it’s important to be realistic about the challenges associated with transporting video over IP. By starting with a foundational protocol and innovating collaboratively, everyone is on the same page. Users see the benefit in being able to seamlessly integrate solutions from various vendors, selecting and combining them as needed. With products covering every aspect of the contribution and distribution workflow, broadcasters have a more streamlined approach to transporting content over IP. The journey to interoperability becomes just as important as the results, as different perspectives can help to enhance and inform the priorities of the project. What has proved crucial is the establishment of a robust set of IP standards – and the only way to get there is by working together.

Red Hat – Empowering Media Companies with Interoperable Containerized Platforms

Jan-Pieter Laauwen, Sr. Global Partner Manager for Telco, Media, and Entertainment, Red Hat

Fabrizio Bosi, Sr. Business Development for EMEA Telco, Media, and Entertainment, Red Hat

Evolution in the Media Industry

In today’s fast-paced digital landscape, media companies face the dual challenge of delivering content efficiently while addressing environmental concerns. As the media industry evolves, it is crucial to adopt innovative approaches that transform existing media workflows, security, scalability, and carbon efficiency. To future-proof media infrastructure and realise these benefits, the industry should focus on developing interoperable Software Media Functions (SMFs) as software-based solutions, adopt a dynamic media factory layered model, and implement a common control framework to better anticipate the changing media functions landscape. When moving into a world of software-defined applications, a good foundation is essential. It should be agnostic of clouds (private or public), since interoperability needs to be guaranteed in case a business decision requires a migration to a new environment. Some content providers are already taking advantage of machine learning to reduce costs and increase efficiency. Traditionally this has not been the domain of broadcast or media specialists, since it was handled by vendors using an embedded OS and Kubernetes in their hardware.

Enterprise IT in Media facilities

Now a new world is opening up, and the software industry has rapidly evolved into a service industry focussing on agile development, automated updates, and security patches. Apart from requiring people with scarce skills, doing this yourself, or pushing the responsibility for maintaining infrastructure back onto the traditional media vendors, introduces new risks. No one vendor will provide a complete media solution as a company-wide horizontal platform for all media functions, simply because it is not within their scope of services. So each media company has to embrace interoperability across public and private clouds, and take ownership of securing and managing the integration of its media workflows; this will significantly reduce the complexity of achieving its goals.

Container Platforms as Foundation

Red Hat OpenShift is a powerful containerization platform, emerging as a transformative solution that enhances digital operations and enables media organizations to build secure, sustainable, and adaptable systems. OpenShift provides the infrastructure for containerized media workflows on-premises, in the public cloud, or at the network edge, allowing all media workflows, from content production, management, and preparation through to distribution, to run seamlessly on one horizontal platform, making the entire network more efficient. Such a container platform serves as the foundation for these SMFs, enabling media organizations to design and deploy containerized media functions that can seamlessly integrate and communicate with each other. Whether it’s video production, post, encoding, transcoding, metadata extraction, or content delivery, Red Hat OpenShift provides the necessary infrastructure and tools to build a comprehensive ecosystem of media functions that work harmoniously together and allow for scalability.

Reduce complexity

Further, adopting a dynamic media factory layered model empowers media organizations to leverage well-established standard IT tools and frameworks, such as Kubernetes & Ansible. This approach allows the assembly of open platforms, enabling interoperability and scalability across different media functions agnostic of virtualized or physical location. By utilizing these standard tools, media companies can efficiently manage workflows, reduce complexity, increase security, and promote collaboration among teams, all while taking advantage of Red Hat OpenShift’s robust container orchestration capabilities.

Optimize and streamline

Media companies have a responsibility to minimize their environmental impact while scaling and improving efficiency. OpenShift provides media organizations with the flexibility to design customized workflows and user interfaces to implement a common control framework. The result is consistent governance and security across the entire media workflow, regardless of the deployment environment. By having a unified control system, teams can optimize operations, streamline processes, lower their carbon footprint, and enhance productivity, while maintaining compliance and data integrity. OpenShift also provides tools and capabilities to measure resource usage, helping organizations monitor and manage their carbon footprint, optimize container deployments, and make efficient use of the network infrastructure.

Building Ecosystem with Partners

Red Hat plays a vital role in assisting clients with their transition to software-defined platforms, as it has already done for telco, finance, industrial, and governmental organizations. Similar trends are being seen in the media industry, so Red Hat is building an ecosystem of partners that support the vision of an Open Media Ecosystem with an enterprise-grade solution. Red Hat’s ecosystem of media partners is growing and brings together a diverse range of expertise and solutions tailored to the specific needs of media organizations. These partners have undergone rigorous validation and certification processes, ensuring their compatibility with OpenShift and guaranteeing the highest standards of performance, security, and reliability. By leveraging this ecosystem, clients can select and integrate solutions from a trusted network of media partners, enhancing their media workflows.

How can we help?

Red Hat’s container platform transforms the media industry by providing the infrastructure and tools necessary to build secure, scalable, and efficient media workflows across various deployment environments. Supported by a robust ecosystem of media partners, Red Hat empowers clients to transform their media workflows, enhance efficiency, and remain competitive in a dynamic and ever-changing media landscape. Clients can confidently transition to open-source-based, enterprise-grade, software-defined platforms, unlocking new levels of flexibility, security, and efficiency in their media workflows. Red Hat helps media organizations navigate the evolving digital landscape while addressing environmental concerns, driving carbon efficiency, and contributing to a sustainable future.

What can be done?

Create a company-wide vision for software deployments, and stimulate integral discussions between video teams, corporate IT, and your network teams to find the best route to the future. Ask and challenge your vendors on open-source, enterprise-grade container and automation platforms. Our teams are open to facilitating the move to software-defined platforms without creating dependency on the application layer, and can assist in taking the first steps.

Learn more about Red Hat OpenShift and Red Hat in the media and entertainment industry.