IABM Technology and Trends Roadmap™

NEWLY UPDATED APRIL 2024

Technology and Trends Roadmap content copyright IABM

The IABM Technology and Trends Roadmap isn’t just for industry technologists to use as a reference. IABM has discovered industry execs using it as a starting point for their keynote speeches; product line managers are using it to plot their own products; and corporate board members get a better understanding of where their companies’ products sit on the adoption curve, hence a better grasp of risks vs gross margins. It also assists marketing activities by giving an indication of how best to promote products within M&E and also adjacent/vertical market areas.

This year’s update has seen several major changes to the technology and trends groupings, which the IABM Roadmap working group felt best portray the condition of the various aspects of the industry. As always, this activity draws on strong industry collaboration between end-users, vendors and competitors alike; it created a lot of discussion, debate and controversy, yet the final outcome is a remarkable example of teamwork.

Getting into the details, I’d like to start with security, as it is critically important and still far too often neglected, not due to technology but mainly on the implementation and budget sides. This year we moved from Security Workflows to Security Architectures as content security is well understood.

Starting with Cloud, last year we discovered that important trends such as microservices could become overlooked, and additionally the advantages/disadvantages of public/private off-prem cloud operations would not be highlighted if we had a single Cloud grouping; hence we broke it down into:

Cloud Services – off-prem (public or private)

Cloud Infrastructure – Virtualization (public, private, hybrid), Microservices

In the CTO breakout discussion at IABM’s annual conference, it was clear that Edge Computing is of primary importance, as customers won’t be moving huge media content into and between clouds, having it processed and then moving it back again. The one qualification is when a sudden burst of compute power is required. This still fits well within Cloud Infrastructure. This being said, one of the debates the group had was: do we keep Cloud, or discretely move the various functions of Cloud into each group individually? For example, playout in the cloud would go into the Delivery grouping. But where to put microservices? These are all areas that need serious consideration for both the business and the technical sides of operations.

Since RFPs don’t specify remote production anymore, it is just expected and assumed to be the norm. The main point with remote productions is connectivity, whether Public Internet, 5G, etc., which is detailed in Transport & Networking. Whether productions are remote or local, there are many aspects that do need attention; hence we generated a new group called Production.

Compute and Storage continues to have new areas such as more advanced GPUs, carbon nanotubes, Thunderbolt 3/4, computational storage and quantum computing, much of which is really all about infrastructure, which could be on-prem or off-prem. The conclusion was to not focus on super bleeding edge areas such as quantum, as that won’t be used within our industry for a while.

Within our multiple Roadmap calls and emails, it became clear what the best way to deal with the term “cloud” was, and the conclusion was actually quite simple. Because cloud itself takes on so many forms such as on-prem, off-prem, private, public etc., we typically don’t need to focus on the type of cloud; the areas of interest are Infrastructure (i.e. storage, edge, computing and networking) along with Services (i.e. microservices and cloud).

Artificial Intelligence and Machine Learning are definitely taking center stage beyond the basics like re-use of archives, closed captions/subtitles, and sports visual recognition that are constantly improving. The group decided it best to focus on GenAI/ML, with areas like responsible AI and Machine-to-Machine along with some of the newer uses of AI within the industry. Understanding training models, assuring appropriate licensing and understanding what is real all come under the grouping of Provenance.

With sustainability there is so much greenwashing going on that we decided to move towards Tangible Sustainability; this way we can cover specifics (either happening now or planned) within each area of Create, Produce, Manage, Publish and Monetize.

With technology becoming readily available and less specialized for each industry, more market areas are cross-sharing products and services, so we opened up a new category called Vertical/Adjacent Markets in the hope of harmonizing and understanding the different markets.

Production – (Remote/Hybrid/Local)

Remote productions continue to improve and are now typically hybrid productions. Both public and dedicated 5G networks have growing usage for backhauls, IFBs, intercoms, etc. In-studio volumetric production is growing, yet still often requires a learning curve to understand how to match cameras with LED walls. Hybrid productions can play out directly, avoiding many latency issues. Newer single-unit multi-cams with AI lead the way for more automated sports productions. Camera direct-to-cloud capabilities are making production turnaround faster.

Lower tier sports, as well as initial broadcasts from higher tier sports, are early adopters of live Cloud Production (LCP). Some are going directly to air.

When using IP, some newer facilities consider it early adopter territory, yet others consider it mature. The same can be said of managing production.


Services – (Microservices/Cloud)

Services are no longer considered cloud-only, as microservices can be on-prem and off-prem. The move towards hybrid is clearly taking advantage of being able to scale up quickly as demand changes, as well as keeping “golden” content local due to off-site concerns. Newer services such as AI learning, powered by public cloud providers, are gathering momentum.

Customers are recognizing the value of moving their content to the cloud. It becomes easier to process huge archives to enhance metadata and improve search using AI. Cloud also enables remote workflows and global distribution of content, transcending geographical boundaries. It removes the need for tape system maintenance, upgrade, tape version migration, and physical expansion. It also eases utilization of archives to generate personalized FAST channels.

The cost of getting data out of the cloud, and of moving between clouds to get the data closer to the service, may shift CFOs’ opinions about wanting to use public cloud services more. Financial management of “pay as you go” cloud costs is a rising area as organizations start to tackle it.
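The egress-cost trade-off can be sketched with a few lines of arithmetic. The prices below are illustrative placeholders, not any provider’s actual rates:

```python
# Minimal sketch of a "pay as you go" egress cost comparison.
# price_per_gb is an illustrative placeholder, not a real provider rate.
def egress_cost_usd(gb_moved: float, price_per_gb: float = 0.09) -> float:
    """Estimate the cost of moving data out of a cloud region."""
    return round(gb_moved * price_per_gb, 2)

def compare_workflows(archive_gb: float, price_per_gb: float = 0.09) -> dict:
    """Compare processing in-cloud (no egress) vs pulling content back on-prem."""
    return {
        "process_in_cloud": 0.0,  # data never leaves the region
        "pull_back_on_prem": egress_cost_usd(archive_gb, price_per_gb),
    }
```

Even at a few cents per gigabyte, a petabyte-scale archive moved out of the cloud runs into six figures, which is exactly why processing in place, or choosing providers with lower egress charges, matters.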

Cloud playout services are mature. Ad insertion for traditional linear is mature; however, Dynamic Ad Substitution (DAS) is still early adopter and quite complex.


Infrastructure – (Storage/Edge Computing/Networks)

Storage speeds have dramatically increased through full saturation of multiple 100GbE front-end network links, maximizing the number of high-bandwidth, low-latency video streams within a scale-out NAS. Networks are becoming smarter with DPUs and CPUs built into the NICs. Compute nodes, physical or virtual machines, are incorporating XDP, RDMA, and DPDK kernel bypass implementations, greatly increasing their bandwidth capability and enabling servers and VMs to receive/transmit live high-bandwidth flows such as JPEG XS and 2110. The ability to nearly saturate 100 Gb/s NICs creates great scaling opportunities in the cloud. Over time, cloud networks and on-prem networks will blur together. Kernel bypass implementations reduce latency, improve throughput, reduce compute requirements, and increase scale.

Infrastructure-as-code along with hyperscalers and efficient cloud-native code continues to replace dedicated hardware. Where network and conversion hardware is required, real-time functions such as mixing or standards conversion are becoming directly integrated into the system.

With AI taking a lead role, dedicated AI chips are dramatically improving the speed of learning models. GPU edge computing brings the compute closer to the source data, enhancing AI/ML models and lowering latency, which in turn enhances augmented reality applications. Reduced latency and, more importantly, consistent latency (or fairness) are required for interactive gaming and gambling. 5G and edge computing together will enhance immersive tech, virtual reality and gaming applications; JPEG XS Low Latency is a key improvement for GCCG. The trend toward lower and lower egress charges from the hyperscalers makes JPEG XS and higher bandwidth flows more attractive.

Although there is still “lift and shift” happening, re-architecture benefits such as moving from Windows to Linux and into containers are moving forward. Time stamping rather than frame syncing to minimize latency is a prime example. Another example is code using a traditional file store versus native code that writes to object storage. Native code optimizes both performance and cost, but the re-tooling required to go to cloud is slowing adoption of cloud for various tasks. Re-tooling for cloud also requires security changes.
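The file-store vs object-store contrast above can be sketched as follows. The in-memory `ObjectStore` class is purely illustrative, standing in for a real S3-style put/get API rather than any actual cloud SDK:

```python
# Illustrative contrast between file-style I/O and object-native I/O.
# ObjectStore is an in-memory stand-in for an S3-style API, not a real SDK.
import io

class ObjectStore:
    def __init__(self):
        self._objects = {}

    def put(self, key: str, body: bytes) -> None:
        # Objects are written whole and are immutable: no seek, no append.
        self._objects[key] = body

    def get(self, key: str) -> bytes:
        return self._objects[key]

def file_style_write(handle: io.BytesIO, chunks) -> None:
    # Traditional file code assumes a seekable handle and incremental writes,
    # which object storage does not natively offer.
    for chunk in chunks:
        handle.write(chunk)

def object_native_write(store: ObjectStore, key: str, chunks) -> None:
    # Object-native code buffers (or multipart-uploads) and puts once,
    # which is what lets providers optimize performance and cost.
    store.put(key, b"".join(chunks))
```

The point of the sketch is the shape of the API, not the storage itself: porting incremental file writes onto whole-object puts is precisely the re-tooling effort the paragraph describes.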


Gen AI/ML – (Responsible AI/Machine2Machine)

AI models are quite mature for applications such as speech-to-text, storage analysis/de-duplication, etc. Beyond this, natural language processing (NLP) is taking aim at more voice-driven applications to tease out analytic data, since NLP gives a computer program the ability to understand human language as it is spoken and written.

AI subscription management strategies are giving clearer insights into customer retention and loyalty, and will hopefully improve churn rates. AI is simplifying news workflows by providing a camera-ready first draft and repurposed pieces for different outlets, which journalists can then edit, confirm sources for, and complete.

Newer custom AI chips are out. These chips are designed for a high volume of low-precision computations, hence requiring less power per cycle, which is exactly what AI inferencing requires.

With AI, machines can learn how to optimize workflows by selecting the various microservices based on speed, cost, quality and efficiency.
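Selecting a microservice on speed, cost and quality can be sketched as a simple weighted score. The service names, metrics and weights below are entirely hypothetical, a stand-in for what a learned model would estimate:

```python
# Hypothetical sketch: pick a transcode microservice by weighted score.
# Service metrics (0..1, higher is better) and weights are invented
# for illustration; a real system would learn these from telemetry.
SERVICES = {
    "fast_gpu":  {"speed": 0.9, "cost": 0.3, "quality": 0.7},
    "cheap_cpu": {"speed": 0.4, "cost": 0.9, "quality": 0.6},
    "archival":  {"speed": 0.2, "cost": 0.8, "quality": 0.95},
}

def pick_service(weights: dict) -> str:
    """Return the service whose weighted score is highest."""
    def score(metrics: dict) -> float:
        return sum(weights[k] * metrics[k] for k in weights)
    return max(SERVICES, key=lambda name: score(SERVICES[name]))
```

A live sports clip might weight speed heavily, while an archive restoration job weights quality; the same pool of services serves both without manual routing.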

Since the trend of defining AI copyright drastically varies region-to-region globally, this will affect technology roll-outs much more than issues such as out-of-country cloud storage, which was much cleaner to control. Note that Gen AI hallucinates and makes errors due to questionable training data; hence the responsibilities of using Gen AI cannot be taken lightly.

Tools such as Sora (text-to-video) are at the bleeding edge of generating videos using AI. The effectiveness of AI assistance tools is catching on quickly; a trend example is saving time selling ads and quickly filling unused spots. Also within the ad tech space, AI ad generation and moderation, contextual advertising, and deriving audience insights are bleeding edge to early adopter.

Machine learning assistant and support engines that are used within broadcast facilities are quite mature.

AI creates artificial super-slo-mo versions of standard frame rate video, provides auto-clipping of sports events, enhances archive search, and can automatically translate for content localization. Media examples of AI assists are color correction and synchronizing lip movement for dubs. Expect to see a huge amount of localization via Gen AI over the next year. Tests are underway with avatar news anchors; there is, however, a trust issue associated with this.


Provenance – (LLM/Blockchain/C2PA/Hallucinations)

Generative AI algorithms sort through existing data to create new content; hence the importance of knowing precisely what data the Large Language Model (LLM) was trained on cannot be overstated. The Coalition for Content Provenance and Authenticity (C2PA.org) is using cryptographically signed manifests to address the prevalence of misleading media online. This promises the ability to track and protect content provenance for consumers, businesses and AI training.

Repurposing specific archive material (such as news clips) for reuse or resale is a bleeding edge use of AI, not because of the technology but because of the rights: the provenance is often unknown.

In-camera digital signature technology signs images in real time for use with C2PA. Without a reliable way to check where Gen AI content comes from, there is a risk of ethical and legal problems.
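The principle behind in-camera signing can be illustrated with a few lines of standard cryptography. This is not the actual C2PA manifest format; it is a simplified sketch of binding a digest of the pixels to a key held by the capture device, so later tampering is detectable:

```python
# Simplified illustration of in-camera content signing and verification.
# NOT the real C2PA manifest format; it only demonstrates the principle
# of binding an image digest to a device-held key.
import hashlib, hmac, json

def sign_capture(image_bytes: bytes, device_key: bytes, device_id: str) -> dict:
    digest = hashlib.sha256(image_bytes).hexdigest()
    claim = json.dumps({"device": device_id, "sha256": digest}, sort_keys=True)
    tag = hmac.new(device_key, claim.encode(), hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": tag}

def verify_capture(image_bytes: bytes, record: dict, device_key: bytes) -> bool:
    expected = hmac.new(device_key, record["claim"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, record["signature"]):
        return False  # the claim itself was tampered with
    claim = json.loads(record["claim"])
    return claim["sha256"] == hashlib.sha256(image_bytes).hexdigest()
```

Real C2PA uses public-key certificates rather than a shared key, so anyone can verify without holding the camera’s secret, but the chain of trust works on the same digest-and-sign idea.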

Gen AI can present misleading or false information as fact: this is known as AI hallucination. Several analysts estimated that in 2023 chatbots hallucinated as much as 27% of the time, with factual errors present in 46% of their responses.

Hallucinations in image-generated content can be considered “AI artifacts”. Using AI for content ideation keeps a human in the loop, with AI serving as a brainstorming tool.


Immersive & Imaging – (8K/Audio/XR/QD-OLED)

Venues such as Cosm and Sphere take both immersive audio and video imaging to a completely new level for entertainment venues, including personalized audio per seat. HDR productions are becoming more common. XR studios for news are rolling out quickly. There is hope that Apple Vision Pro will generate enough interest to come down in cost, generating new business models.

Display technology is beyond bleeding edge with Quantum-Dot OLED (QD-OLED) as well as transparent displays. The metameric effect of narrow-bandwidth laser projectors is causing debate because older and younger viewers see different white points.


Secure Architectures – (Cyber, Zero Trust, Workflows)

Security Frameworks and Standards (such as Zero Trust) exist. However, they are still not being broadly adopted in the Broadcast domain despite the fact this technology is mature and well used on the IT side of media operations. We see both NMOS IS-10 and OAuth2 becoming common elements of RFP processes.

Going forward, the crucial foundation is that security should be built around each workflow, rather than focusing only on content and data.


C2PA, a standard produced by the Coalition for Content Provenance and Authenticity to certify the source and history of media content, is rapidly gaining traction both as a tool to fight AI fakes and misinformation, and to internally track the sources of delivery content in production and archive workflows. C2PA’s tamper-evident manifests are often likened to blockchain, which continues to make advances into other areas of media and business management.


Migration to cloud at scale depends on architectures built using infrastructure as code (IaC), which describes the functional parameters of storage, compute and networking. Patterns that are established, tested and have passed security review as IaC can then be duplicated to run securely at scale, avoiding the human errors that can be introduced when these highly complex and large systems are constructed manually.

The security of cloud architectures that dynamically pull containers and Infrastructure as Code (IaC) from third-party repositories must be considered and appropriately mitigated. For example, dependencies pulled from open source repositories (GitHub/Bitbucket or similar) can contain undetected flaws, open test ports that were never removed, and so on, resulting in possible breaches.

CDN leaching continues to be a streaming content security issue. Some newer CDN architectures yield more effective and secure delivery but may not be scalable.


Contribution/Delivery – (Transport/5G/CDN/Public Data/BPS)

Large-scale public and dedicated 5G networks have growing usage for backhauls, IFBs, intercoms, etc. 5G usage in large venues like stadiums, serving attendees, has proven effective. 5G Standalone (SA) services with network slicing are starting to be rolled out, which streamlines mobile TV production. Using the Internet for contribution feeds with various protected protocols is mature. JPEG XS is leading the way.

Provisioning distribution of multiple camera feeds from the same event to streamers in an automated fashion is incredibly challenging.

Streaming is quite mature, especially when archives are moved into the cloud; live streaming works well but is still in the early adopter stage. The same can be said for managing streams. Technology “best practices” can help the streaming industry improve profitability. Last mile to consumer, CDN leaching and scalability are still issues.

ATSC 3.0 transmitters have “distribution of data as a service” to distribute public data and files. Broadcast Positioning System (BPS) is using ATSC 3.0 for geolocation services as an alternative (back-up) to GPS. Both are in early adopter stage.

Monetization efforts on the delivery side include converged TV ad sales in local and national markets; unified measurement solutions to enable alternate currencies; next-gen ad formats including virtual product placement and shoppable video; and self-service privacy-enhanced data collaboration.


Orchestration & Scalability – (Automation, Provisioning, QC, Observability)

As operations and virtual infrastructure become irreversibly dependent on highly complex automation, it has become clear that robust orchestration mechanisms are necessary to attain scale and efficiency and to maintain robust security. A bleeding edge example is calling Infrastructure as Code for architecture set-up, as well as Network as a Service. Multi-cloud operations present an even larger challenge for automation, QC, and observability. Customers want cloud options, but without orchestration and scalability across hyperscalers, multi-cloud operations and disaster recovery are problematic.

Orchestration can also call QC services based on monitoring by exception. Technicians commonly complain of not being able to probe or observe inside functions to solve QC issues. Cloud networks are more opaque than on-prem networks, and the lack of precise control of data and signal path flow in clouds exacerbates the observability problem. Standards around metrics, metadata, and logging are needed to optimize QC and observability.
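Monitoring by exception can be sketched as thresholds that trigger a QC call only when a metric drifts out of range. The metric names and thresholds below are hypothetical placeholders:

```python
# Hypothetical sketch of "monitoring by exception": the orchestrator only
# dispatches a QC job when a metric falls outside its allowed range.
# Metric names and ranges are illustrative, not real standards.
THRESHOLDS = {
    "audio_loudness_lufs": (-25.0, -21.0),  # illustrative target range
    "packet_loss_pct":     (0.0, 0.1),
}

def exceptions(metrics: dict) -> list:
    """Return the names of metrics that are out of range."""
    out = []
    for name, value in metrics.items():
        lo, hi = THRESHOLDS[name]
        if not (lo <= value <= hi):
            out.append(name)
    return out

def maybe_dispatch_qc(metrics: dict, dispatch) -> bool:
    """Call the QC service only when there is something to check."""
    bad = exceptions(metrics)
    if bad:
        dispatch(bad)  # e.g. enqueue a deep QC probe for these metrics
        return True
    return False
```

The design point is that the (expensive) deep QC probe is only spun up on exception, rather than running continuously against every flow.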

Resource management ties into infrastructure as code: timing the deployment and management of resources to guarantee scalability.

Orchestration tools can help with sustainability by using more power efficient resources when applicable, even powering down systems when not in use.


Tangible Sustainability – (Power/Cloud/Remotes)

Even though it is much talked about, sustainability is the least important factor driving investment in 2024 according to IABM research. Sustainability is much more important to specific types of buyers such as public broadcasters where it is the 3rd most important driver. There is still a misconception that moving to public clouds means there is an improvement in sustainability. Cost improvement and public pressure are the main drivers of sustainability. Carbon footprints are typically part of the RFP process.

The move to digital transmitters shows specific maturity in carbon reduction.

Red flags are popping up, as they did with cryptocurrency mining: building massive AI-focused supercomputer data centers will require millions of chips and huge amounts of electricity. No doubt these will be used within media, which again points to the fact that being in the cloud via an “AI Factory” doesn’t mean you are sustainable.

“Right-to-repair” laws are encouraging repair of electronic goods instead of replacement, which leads to a reduction in electronic waste.


Vertical/Adjacent Markets – (Corp/Education/Government/Social)

Many of the businesses that use A/V equipment outside of M&E were typically only doing teleconferencing in the past. Corporations now have full-blown local and remote studios. Many are totally IP, which is exactly where IPMX is a great bridge between M&E and other market areas. From a vertical market point of view, Create products are mature, with merely a few that are early adopter. Similarly with Produce; the early adopters have higher-resolution productions. With Publish, there is a wide range: social is commodity, as is straightforward playout, while some unique aspects are happening with newer live implementations. New monetization methods are using various AI undertakings from M&E, whereas promotion endeavors are undeniably mature, as expected. This being said, new concepts constantly emerge to outstrip the competition.


Emerging Trends – (Digital Twins/Metaverse/Web3.0/Spatial)

Making a complete digital replica (digital twin) of a studio or movie set using 3D laser scanners (LiDAR) improves both the speed of production changes and safety. Although accessibility isn’t new, the cost to implement is much lower, pushing more accessibility options.

Spatial computing combines the physical world with a virtual one. We see this with the newer headsets that create immersive environments or overlays on physical ones. Technology like the Apple Vision Pro isn’t bleeding edge; however, its cost to the general public and limited apps are holding it back from becoming mature. Massive layoffs at the tech giants are making the near future of the metaverse and Web 3.0 uncertain.

New techniques in personalized storytelling and video highlights on the first screen are catching up with social media, transforming TV into a digital-first experience, which is a key way to engage Gen Z audiences.


“One of the constantly moving challenges in our industry is not only keeping up with all the new technology trends, but also having a deep enough dive to understand what is truly relevant and not simply just a ‘fad’. Our industry pumps 25% of hard-earned revenue back into R&D - so every penny spent needs to be as close to a sure bet as possible. End-users also have a similar issue understanding which technology to bet on, what they can’t live without – the ‘must haves’ - and which will capture more market share for them, as well as providing a path of profitable growth. The Technology and Trends Roadmap will help address these fundamental business questions for everyone involved with broadcast and media technology.”

Stan Moote, IABM CTO

Usage

IABM sees two main uses for creating and distributing a common industry roadmap:

  • Company Internal – A reference internally so both board members and executives have a common view of the industry. This gives the CTO a strong reference or ‘springboard’ to further add in their own company’s unique expertise to produce a company-specific roadmap which will typically map into company quarters/years. 
  • Company External – When presenting to customers and end-users, there is no starting point, no reference to begin. There is nothing worse than being in front of a customer having to waste half the meeting trying to explain your first graphic. Having a common industry roadmap may not provide 100% buy-in by every end-user, however it does give a common industry consensus that sets the stage for each vendor’s distinctive roadmap. This in turn becomes a key driver of strategic decision making for high-level plans that need to be articulated to end-users. IABM research shows how partnering is very important to growth and sales strategy.
