Imagine Communications – The long road to Paris: the rise of remote production in broadcasts of the Games

John Mailhot, Senior Vice President, Product Management at Imagine Communications

With Paris 2024 just around the corner, I can’t help looking back on the 2010 Winter Games in Vancouver, where I had the privilege of working with one of Imagine Communications’ broadcast customers. In all the excitement, one thing that struck me the most was the sheer size and complexity of the International Broadcast Centre (IBC). Broadcasters from all around the world had packed the space with a massive amount of equipment — including multiple control rooms and dozens of full editing suites — and thousands of people were working tirelessly to produce their programs. It was a spectacle on par with the games themselves.

As in all televised Games before it, show production in Vancouver was performed on-site, with the finished product being sent back to each broadcaster’s home country. The reason for this was simple: telecom (or even satellite) links at the time were more expensive than sending personnel and equipment to the host city. Over the last 14 years, however, there’s been a gradual shift towards a mix of local and remote production, with camera operators and commentators working on-site to cover the Games, and some editing, graphics, and other finishing touches being applied in the studio at home.

This trend has become more pronounced with each biennial event and was accelerated dramatically by the pandemic. During the 2021 Summer Games in Tokyo, there was a significant push to increase remote production, with far more staff and equipment than ever before remaining at home. The following year, the Winter Games in Beijing provided a clear indication of how much production could be performed remotely and which aspects must still be conducted on-site.

Today, as the world gets ready to watch the 2024 Summer Games, most broadcasters have applied these remote production technologies to all manner of sports content and figured out how to achieve the right immediacy and event feel by balancing local and remote production. So, while the IBC in Paris will no doubt remain an impressive, bustling hub of activity, it will be more about handing off signals from the pool to each broadcaster’s on-site team, and less of a site production compound than in years past.

Higher bandwidth at a lower cost

From a technological standpoint, there are three key factors that have brought the broadcast industry to this point where remote production is suitable even for such high-profile events. The first is the availability of bandwidth for transmitting signals. Compared to years ago, the amount of carrier bandwidth accessible to broadcasters has increased significantly, while the associated costs have become far more reasonable. Both of these factors are vital, because the bandwidth demands for remote/split production are significantly higher than simply forwarding home a finished show.

With on-site production, broadcasters really only need one link back to their home country for the finished channel — at most, two or three paths for redundancy. With a remote or split production process, however, they might be sending back 20 or 30 signals from various cameras, at production quality levels, which requires dozens of links between the sites. The increase in availability and the reduction in the price of bandwidth are key enablers of remote/split production.

Well-documented HDR workflows

The 2024 Paris Summer Games are expected to mark a significant milestone in live HDR broadcasting, with several world broadcasters planning to incorporate HDR productions to varying extents. This brings the second factor — well-documented HDR workflows — into play.

Until recently, broadcasters were limited to locally producing live events in HDR, as workflows required careful visual coordination between camera shaders and producers looking at the same monitor. That has changed over the last few years as broadcasters have developed and documented their HDR workflows across major events, including standardized LUTs for conversion and shader checks of the standard dynamic range (SDR) output. Today, these standardized workflows are capable of supporting local and mixed/remote production, including creating SDR work products of very high quality — a requirement for the all-important legacy distributions.

Reducing latency

Finally, the relatively new JPEG XS codec tackles the issue of latency, which has traditionally been a stumbling block in remote production, especially when it comes to communication between on-site camera operators and technical directors in the studio. With traditional codecs, it may take the director a few seconds or longer to see the result after they’ve asked the camera operator to adjust something, such as zooming in or panning left. This can lead to a frustrating and disjointed process that hinders cohesive team interaction.

By reducing the latency of the signals being transmitted between the sites, the entire team feels like they are working more naturally together. JPEG XS dramatically reduces latency to the bare minimum while maintaining production picture quality.

At Imagine Communications, many of our customers have found that the JPEG XS codec offers the ideal combination of high picture quality and ultra-low latency, with an 8:1 bandwidth saving over uncompressed — allowing them to achieve the look they want while enjoying the benefits of remote/split production. So, with its support for JPEG XS alongside its complement of UHD and HDR conversion capabilities, our Selenio Network Processor (SNP) has become an integral part of their remote production workflows.
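
To put the 8:1 figure in perspective, here is a back-of-envelope link budget sketch in TypeScript. The ~3 Gbps per-camera uncompressed rate is an assumed nominal figure for an HD contribution feed, not a number from this article.

```typescript
// Back-of-envelope contribution link budget for remote/split production,
// using the 8:1 JPEG XS compression ratio cited above. The per-camera
// uncompressed rate (~3 Gbps for an HD feed) is an assumed figure for
// illustration only.

const UNCOMPRESSED_GBPS = 3.0; // assumed per-camera uncompressed rate
const JPEG_XS_RATIO = 8;       // 8:1 bandwidth saving

function remoteLinkBudgetGbps(cameraFeeds: number): number {
  // Total contribution bandwidth after compression.
  return (cameraFeeds * UNCOMPRESSED_GBPS) / JPEG_XS_RATIO;
}

// A 20-camera remote production: 60 Gbps uncompressed vs. 7.5 Gbps compressed.
console.log(remoteLinkBudgetGbps(20)); // 7.5
```

A finished-show workflow, by contrast, needs only one or two such links, which is why bandwidth economics drive the local/remote balance.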

There are more than 3,500 SNP units actively deployed around the globe — for more than 100,000 video channels worth of video processing — and many of them will be on the ground this summer in Paris. It’s going to be a watershed event for remote production, and we are thrilled to be a part of it.

Skyline Communications – dataminer.MediaOps

BaM Award® – Support

The revolutionary dataminer.MediaOps product seamlessly blends media technology and workflows with information and communication technology (ICT), introducing a new era of data-driven, automated, and simplified media operations. Central to its architecture is the concept of the digital twin of the media operation—housing all network statistics, metrics, counters and configurations, coupled with vital business information like event schedules, asset inventory, playlists, electronic program guide (EPG) data, and more. This digital twin opens up unprecedented opportunities for:

  1. Resource Planning and Scheduling
  2. Live Media Orchestration
  3. Automation of File and Asset Workflows

While dataminer.MediaOps encompasses the entire media operation, it allows each tenant—internal teams or external stakeholders like customers, contractors, network providers, rental companies, and reporters—to work independently within their designated area, while maintaining seamless harmony with other teams. Many users rely on dataminer.MediaOps for their specific needs:

  • Booking teams schedule resources, spanning staffing, satellite transponder slots, IP network capacity, and technical resources.
  • MCR and Tx room operators perform ad hoc and scheduled connection management, media processing controls, smart monitoring and redundancy switching.
  • Engineering teams design, automate, and test media workflows running on-premises, in the cloud, or hybrid.
  • Media asset teams automate asset and file workflows, from ingest to distribution, publication, and archiving.
  • IT and SecOps teams manage ICT infrastructure, automate security workflows, and track IP multicast flows.
  • Media and ICT cloud teams dynamically deploy and undeploy workloads on demand or according to the master event schedule.
  • Finance and procurement teams analyze resource utilization and costs, and generate billing records.
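
As a rough illustration of the conflict-free resource booking described above, the sketch below checks a requested reservation against existing bookings with a simple interval-overlap test. The types and names are hypothetical, not the dataminer.MediaOps API.

```typescript
// Minimal sketch of the kind of check a booking layer performs before
// confirming a reservation (e.g. a satellite transponder slot).
// Illustrative only -- not the actual dataminer.MediaOps data model.

interface Booking {
  resource: string; // e.g. "transponder-12A" (hypothetical name)
  start: number;    // epoch seconds
  end: number;
}

function conflicts(existing: Booking[], request: Booking): Booking[] {
  // Two half-open intervals [start, end) overlap iff each starts
  // before the other ends.
  return existing.filter(
    (b) =>
      b.resource === request.resource &&
      b.start < request.end &&
      request.start < b.end
  );
}

const booked: Booking[] = [
  { resource: "transponder-12A", start: 1000, end: 2000 },
];

// Overlapping request is rejected; back-to-back request is fine.
console.log(conflicts(booked, { resource: "transponder-12A", start: 1500, end: 2500 }).length); // 1
console.log(conflicts(booked, { resource: "transponder-12A", start: 2000, end: 3000 }).length); // 0
```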

Key features that set dataminer.MediaOps apart include:

  • Seamlessly integrating over 30 years of expertise in managing, orchestrating, and monitoring media networks with state-of-the-art ICT technology, recognizing the convergence of media and ICT domains.
  • Acknowledging the inherent connection between various processes by merging workflows for event planning, resource scheduling, live media operations, and file workflows.
  • Designed from the ground up to handle the complexity and scale of SMPTE ST 2110 while accommodating a variety of media transport protocols such as SRT, SDI, SMPTE ST 2022, ASI, and L-band, in contrast with traditional solutions limited to SDI-only bespoke products.
  • Interoperating with diverse media and ICT technology products, including media processors, gateways, IP fabrics, multi-viewers, intercoms, probes, SDI routers, and even cloud workloads, facilitating infrastructure-agnostic workflows.
  • Supporting both industry standards (NMOS) and bespoke products, ensuring seamless integration across diverse systems.
  • Integrating seamlessly with installed broadcast controllers, SDN controllers, IPAMs, CMDBs, ERP systems, and other systems, catering to both brownfield and greenfield operations.
  • Providing resource awareness that enables teams to request resources for ad hoc operations, planned media events, or maintenance without conflicts, while offering comprehensive insight into resource utilization and costs.
  • Including SecOps automation to customize security workflows to specific policies and integrating with SIEM systems.
  • Promoting collaboration across various teams—from business planning to finance—through collaboration tools like Teams, Slack, ChatOps, and cloud sharing.
  • Embracing continuous change in media operations by leveraging platform innovations such as low-code applications, workflow designers, user-defined control surfaces, DataMiner Object Models, open scripting engines, CI/CD, user-defined APIs, etc.

In essence, dataminer.MediaOps revolutionizes media operations by combining modern ICT practices with deep media domain knowledge. With dataminer.MediaOps, M&E companies deliver better service quality and user experience, respond faster to business needs and technology innovations while increasing productivity and cost-effectiveness.

In choosing dataminer.MediaOps as the Support BaM Award® winner, the judges said: “Groundbreaking integration of media and ICT, offering a comprehensive inventory management system that spans technology, operational resources, and more. Its ability to automate and orchestrate workflows represents a leap forward in efficiency and cost-effectiveness, challenging traditional operational silos…it helps ‘automate the automation’ by aggregating various disparate workflows from different sources.”

Perifery – Intelligent Content Engine

BaM Award® – Manage

Perifery’s Intelligent Content Engine (ICE) is a software platform that leverages AI agents and advanced AI models to manage, organize, and curate media content such as images, videos, audio files, documents, and other multimedia assets. Acting as an AI Media Content Librarian, ICE examines, understands, and catalogs every file within its view. It automatically categorizes, organizes, and understands media assets based on the content itself regardless of the existence of any traditional metadata.

ICE’s conversational interface enables users to find and retrieve media files through simple natural language requests. Results are based on an understanding of the content itself, together with a record of all of the objects, scenes, people, and text found within each image or video. This intelligent content recognition technique helps users quickly find relevant assets based on content rather than relying on metadata. It improves searchability and enhances the accuracy of content recommendations. Users can also perform complex searches using natural language queries, filters, and contextual information. Clear justifications accompany search results, ensuring users understand why each piece of content matches their query.
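
A minimal sketch of content-based retrieval: assets and queries are compared in a shared embedding space instead of by keyword match. The tiny hand-made vectors below stand in for the embeddings a system like ICE would derive from the media itself; the asset names are invented.

```typescript
// Toy content-based search: rank assets by cosine similarity between
// a query embedding and per-asset embeddings. The 3-d vectors are
// hand-made stand-ins for real model embeddings.

function cosine(a: number[], b: number[]): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

const assets = [
  { name: "beach_sunset.mp4", vec: [0.9, 0.1, 0.0] },
  { name: "press_conference.mov", vec: [0.1, 0.9, 0.2] },
];

// A query embedding close to the "beach" asset ranks it first,
// even though no keyword was matched.
const query = [0.8, 0.2, 0.1];
const ranked = [...assets].sort(
  (x, y) => cosine(y.vec, query) - cosine(x.vec, query)
);
console.log(ranked[0].name); // beach_sunset.mp4
```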

Because ICE understands content, it facilitates content lifecycle management and efficient resource utilization. It works autonomously, going beyond traditional media asset management to address the challenges of the increasing volume and complexity of digital media assets. It also empowers organizations to unlock the full potential of their media assets, enhance collaboration, and accelerate content creation and distribution.

ICE is transcending media asset management, making the process of finding content more intelligent, efficient, and adaptive to the evolving needs of organizations managing vast amounts of digital content.

In choosing ICE as the Manage BaM Award® winner, the judges said: “This entry stands out as the most pioneering and all-encompassing of the year. Offering flawless integration with both S3-compatible cloud services and on-site content, it streamlines workflow and automates the generation of metadata. This revolutionizes the way files are searched, shifting the focus to the content itself, and skilfully tackles key obstacles in media content management through its sophisticated, user-focused design.”

Quickplay – Media Companion

BaM Award® – Consume

Traditional Media & Entertainment content searches often disappoint users due to inaccurate results, limited personalization, keyword reliance, time wastage, and navigation complexity. We’ve all experienced it – spending an hour or more just trying to figure out what to watch next. The challenge of content discovery for viewers has become greater in the age of abundant streaming platforms and vast content libraries. As the number of available options has grown, users often find themselves overwhelmed and may abandon the search before finding something suitable to watch.

Quickplay is enhancing the OTT industry by simplifying the complexities of content discovery. Our Media Companion uses generative AI to revolutionize the content discovery experience. Partnering with Google Cloud, Quickplay has integrated Large Language Models (LLMs) with our award-winning CMS, creating a conversational interface that simplifies finding content. Quickplay Media Companion delivers personalized recommendations, detailed movie insights, and engaging interactive quizzes, ultimately speeding up the connection between viewers and the right content.

The Quickplay Media Companion, powered by Google Generative AI, is an intelligent software assistant designed to enhance your entertainment experience.

Media Companion excels at tasks like:

  • Finding movies
  • Exploring genres
  • Surfacing detailed information about films and actors
  • Serving up recommendations through:
    • Situational information (e.g. location, weather)
    • Personality quizzes (e.g. pre-defined content personas)

The industry needs advanced content discovery search tools and AI to expedite search and discovery and drive viewer satisfaction. Large Language Models (LLMs) emerge as a solution, offering natural language interpretation, contextual understanding, semantic grasp, conversational interfaces, reduced search time, personalized recommendations, and cross-platform discovery.

However, standalone LLMs face challenges such as lack of personalization and difficulty in adapting to streaming platform constraints. To overcome these, Quickplay and Google Cloud created a hybrid approach combining LLMs with Quickplay’s cloud-native Content Management System (CMS). This integrated solution leverages LLMs’ language capabilities and CMS rule management for a tailored, compliant, and efficient search and discovery experience.

Quickplay’s Media Companion enhances the user’s streaming interaction. It excels in locating content, exploring genres, providing detailed insights, and generating recommendations. It learns and adapts from user interactions, incorporates situational analysis for suggestions, and offers interactive quizzes for personalized content curation.

Within the Media Companion ecosystem, the CMS acts as a central content repository, managing metadata, business rules, and content restrictions. The companion API layer and workflow agent enhance real-time, conversational interactions by orchestrating API calls and maintaining session continuity.
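
The division of labor described above can be sketched as follows: a language-model step turns a free-text request into a structured intent, and the CMS layer applies business rules before returning results. The `parseIntent` stub, the catalog, and the region rule are all hypothetical examples, not Quickplay's actual logic.

```typescript
// Hybrid LLM + CMS flow, sketched with invented names and data.
// The LLM step is stubbed out; the CMS step enforces metadata and
// restriction rules on the extracted intent.

interface Title {
  name: string;
  genre: string;
  allowedRegions: string[]; // hypothetical business rule
}

const catalog: Title[] = [
  { name: "Midnight Heist", genre: "thriller", allowedRegions: ["US", "CA"] },
  { name: "Coastal Noir", genre: "thriller", allowedRegions: ["UK"] },
];

// Stand-in for the LLM: extract a genre from the user's utterance.
function parseIntent(utterance: string): { genre: string } {
  return {
    genre: utterance.toLowerCase().includes("thriller") ? "thriller" : "any",
  };
}

// CMS step: filter the catalog by intent, then by region restriction.
function search(utterance: string, userRegion: string): string[] {
  const intent = parseIntent(utterance);
  return catalog
    .filter((t) => intent.genre === "any" || t.genre === intent.genre)
    .filter((t) => t.allowedRegions.includes(userRegion))
    .map((t) => t.name);
}

console.log(search("find me a thriller for tonight", "US")); // Midnight Heist only
```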

Quickplay Media Companion addresses the limitations of traditional search mechanisms, presenting a practical and powerful method for personalized, and seamless content discovery in the evolving streaming media landscape.

In choosing Quickplay Media Companion as the Consume BaM Award® winner, the judges said: “With the plethora of content available, an intelligent, AI-driven search capability will be very much welcomed by consumers.”

BBC and Neutral Wireless

BaM Award® – Project, Collaboration or Event

At this year’s NAB Show in Las Vegas, the IABM BaM Award for “Best Project, collaboration, event or other innovation” was presented to an ambitious project led by the BBC and Scottish software-defined radio company Neutral Wireless. The project deployed the “world’s largest pop-up multi-cell private 5G network” outside Buckingham Palace and along The Mall to Admiralty Arch to support international live media contributions during the Coronation of Their Majesties King Charles III and Queen Camilla.

Wireless camera feeds are an integral source of video content for programme making. Remote contributions regularly use bonded-cellular video encoders that encode and distribute a high-definition video feed across multiple public mobile networks, reducing the required resources and adapting to network conditions on each path. In high demand density environments with many devices all competing for resources, the public networks can become saturated and unable to sustain the required bitrates. This was experienced by The BBC and other broadcasters during the funeral of H.M. Queen Elizabeth II, preventing them from broadcasting their planned content, and was anticipated to be a problem for the Coronation.

BBC News and BBC R&D approached Neutral Wireless to explore the possibility of deploying a network to support remote video and radio contribution workflows during the Coronation. Initial viability testing was performed with LiveU on the banks of Loch Lomond in Scotland, before onsite tests and RF surveys outside Buckingham Palace. The BBC recognised that other broadcasters would face the same connectivity issues and opened the network to a coalition of domestic and international broadcasters.

When Coronation Day arrived and the public networks became congested by hundreds of thousands of spectators, the 7-cell private network deployed by Neutral Wireless supported over 60 devices from 20 international broadcast outlets with over 1 Gbps of uplink connectivity. News teams were able to go live when they otherwise could not, and with no change to their workflow.

In addition, a second single-cell network configured for low latency was deployed outside the Palace to support UHD testing by BBC R&D and Sony.

At the heart of the project was a Neutral Wireless Private 5G Standalone Network operating in band n77 under a shared access licence from the UK regulator, Ofcom. The Lomond “Network-in-a-box” (NIB) packs an entire single-cell pop-up private 5G network into a portable 4U wheeled case, and has been deployed at sporting events and at Edinburgh Airport for the final departure of H.M. Queen Elizabeth II from Scotland.

For larger, more flexible networks, the company offers disaggregated systems with multiple cells and configurations. The system deployed for the Coronation featured a 5G core and 4 gNBs (including redundancy) driving 7 radio-heads with high-gain sector antennas. The flexible software-defined system and intuitive user interface allowed for rapid reconfiguration, including an uplink bias that provided the increased uplink capacity broadcasters required to support multiple (U)HD video feeds.
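
Uplink bias in a TDD band such as n77 comes down to how transmission slots are split between uplink and downlink. The sketch below illustrates the effect with assumed numbers; the cell throughput and slot shares are examples for arithmetic, not the actual Coronation configuration.

```typescript
// Effect of TDD slot allocation on usable uplink capacity.
// All figures are assumed for illustration.

function uplinkCapacityMbps(
  cellThroughputMbps: number,
  uplinkSlotShare: number // fraction of slots assigned to uplink
): number {
  return cellThroughputMbps * uplinkSlotShare;
}

// A consumer-style pattern dedicates most slots to downlink...
const consumer = uplinkCapacityMbps(1000, 0.25);
// ...while an uplink-biased pattern flips the ratio for contribution.
const biased = uplinkCapacityMbps(1000, 0.7);

console.log(consumer, biased); // 250 700
```

The same radio resource thus yields roughly three times the contribution capacity when the slot pattern is biased toward uplink, which is what matters for dozens of simultaneous camera feeds.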

The ambitious and innovative project was aided and supported by various manufacturers including LiveU, Haivision, Sony, Amarisoft, AW2S, and Alpha Wireless, and has now been recognised by numerous broadcast awards, including an International Broadcasting Convention (IBC) Innovation Award for Content Creation, a Technical Paper Award, and, also at NAB 2024, the BEIT Conference “Best Paper” Award.

In choosing the BBC and Neutral Wireless as the Project, Collaboration or Event BaM Award® winner, the judges said: “This project demonstrated the very best in technical collaboration and problem solving to meet the needs of broadcasters to solve a very real problem around access to reliable spectrum and bandwidth for important events. The detailed planning and proof of concept work was a great foundation for the live use of the technology in a real environment…a future pathway to a very agile approach in deploying non-public networks for events such as this.”

Norsk low-code SDK – making live easy

BaM Award® – Produce

If you want to explain a live video workflow to another human being, it’s easy: “We need three SRT cameras that I can control with a video switcher, with an automated fallback to slate if all are offline. We’ll run a news ticker in the lower third and a logo in the upper right. We’ll need English and Spanish subtitles and audio tracks, and we’ll send a WebRTC output to one CDN and a CMAF output to another.”

Now try to create that same workflow on a computer. Go ahead, we’ll wait.

Chances are, it would take seasoned video engineers thousands upon thousands of lines of code and many months to build something that we believe should be straightforward. And that’s where Norsk Media Server comes in. It’s built on many thousands of lines of complex code, but comes with a TypeScript SDK (though it works with any gRPC-compliant language) that empowers developers to build that kind of workflow in hundreds, rather than hundreds of thousands, of lines of code.
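
As a flavor of what such an SDK-driven workflow might look like, here is a hypothetical, heavily simplified sketch. It is not the real Norsk API — every type and method below is invented — just an illustration of how a declarative SDK can collapse the description above into a handful of calls.

```typescript
// Hypothetical fluent-API sketch of the workflow described in the
// intro: three SRT cameras, a switcher with slate fallback, overlays,
// and two outputs. NOT the actual Norsk SDK.

type Source = { kind: "srt"; url: string } | { kind: "slate"; file: string };

class Workflow {
  private nodes: string[] = [];
  input(s: Source): this { this.nodes.push(`input:${s.kind}`); return this; }
  switcher(_fallback: Source): this { this.nodes.push("switcher"); return this; }
  overlay(name: string): this { this.nodes.push(`overlay:${name}`); return this; }
  output(kind: "webrtc" | "cmaf", _cdn: string): this {
    this.nodes.push(`out:${kind}`);
    return this;
  }
  describe(): string { return this.nodes.join(" -> "); }
}

const wf = new Workflow()
  .input({ kind: "srt", url: "srt://cam1" })
  .input({ kind: "srt", url: "srt://cam2" })
  .input({ kind: "srt", url: "srt://cam3" })
  .switcher({ kind: "slate", file: "offline.png" })
  .overlay("ticker")
  .overlay("logo")
  .output("webrtc", "cdn-a")
  .output("cmaf", "cdn-b");

console.log(wf.describe());
```

The point is the shape, not the names: the workflow reads like the English description it came from, in tens of lines rather than thousands.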

And that’s just one example of what you can build. Norsk is suitable for any and all live streaming workflows, from simple “source in, ABR ladder out” to highly complex systems such as live sports featuring picture-in-picture and player statistics, Zoom-like video conferencing applications, or live concerts featuring multiple audio experiences that can be customized by the viewer. Unlike off-the-shelf tools that are limited by what’s available out of the box, Norsk is limited only by your vision, freeing you to build the exact viewer experience your audience wants and deserves.

In addition to flexibility and efficiency, Norsk also offers unparalleled reliability. id3as, the company behind Norsk, built its reputation creating custom solutions that were “good on a bad day,” and we’ve built that dependability into Norsk. Rule #1 of live events is that things do go wrong, but Norsk is built with failsafes to ensure that if one process or element of the workflow fails, the rest of the workflow stays running. For instance, it’s easy to build in custom logic so that if a camera feed drops out, the system will automatically switch to another camera or even a slate. Norsk also features an optional Infrastructure Manager component so that if, say, the cloud region or server on which an event is running fails, Norsk seamlessly moves the entire workflow to another location.

In addition to the low-code SDK that won the IABM BaM Produce award, we now have Norsk Studio, a no-code, drag-and-drop interface that empowers programmers and non-programmers alike to build workflows using predefined components (such as “SRT Input,” “Camera Switcher,” “CMAF Output” etc.) that can be extended by developers to suit customers’ precise requirements. Further still, developers can build their own Studio components using the SDK, with the end result being a completely custom version of Studio tailored to their exact needs.

In choosing Norsk low-code SDK as the Produce BaM Award® winner, the judges said: “This is a very compelling approach to handling video in an IP/SDK environment…this product will make a huge difference to those facilities where ideas for a streaming channel would be difficult to afford or to explain to a vendor to create. This toolkit is well thought through and uses a common approach and terminology – such that operational people would be able to come up with and design viable workflows.”

Broadpeak Click2 – interactive advertising brought to Streaming

BaM Award® – Monetize

In the evolving video streaming market, targeted advertising has become a critical component for monetization of OTT streaming services. As the demand for more engaging and effective advertising solutions grows, Click2, Broadpeak’s new interactive advertising feature, offers a unique proposition that caters to the needs of advertisers, video service providers, and viewers alike. Click2 opens a new type of ad inventory for video streaming service providers, helping them increase viewer engagement, boost monetization opportunities, and stay ahead of competitors.

The feature enables viewers to interact with ads as they consume video content via streaming, boosting their engagement with the content they watch — and their overall experience with that particular service provider. Click2 allows viewers to click on banner ads within the video ad stream, which then sends them a notification on their phone or other mobile device with information about the product featured in the ad and an opportunity to purchase it. This provides a premium user experience, as viewers can continue to watch video content while simultaneously looking at branded products on their companion device. This is crucial for customer retention, as irrelevant or disruptive ads can take away from the content end users are trying to watch, causing them to cancel subscriptions or turn to a platform’s competitor.

The solution is a win-win for end users, advertisers, and video service providers, as it opens up a way to measure conversion in a connected TV world. The technology is compatible with streaming TV devices, including CTV and smart TVs, and is based on the IAB SIMID standard. This ensures broad compatibility and adherence to industry standards, facilitating easier adoption.

By creating clickable video streaming ads, Click2 guarantees that advertisers receive engagement from end users, without any work needed on their side; video service providers can simply leverage existing standard in-stream ads to generate interactive ads. This also allows video service providers to start charging for ads per click, instead of or on top of per impression. In addition, Click2 provides advertisers with performance indicators about ads, not just impressions, thus helping them increase the value of targeted advertising; they can review the ads they are putting out and make sure they’re targeting audiences with the best, most relevant options.
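
A quick, illustrative comparison of the two pricing models mentioned above; the CPM rate, CPC rate, and click-through rate below are assumed purely for the sake of arithmetic, not Broadpeak figures.

```typescript
// Per-impression (CPM) vs. per-click (CPC) revenue for the same ad
// slot. All rates are assumed example values.

function cpmRevenue(impressions: number, cpmUsd: number): number {
  // CPM is priced per thousand impressions.
  return (impressions / 1000) * cpmUsd;
}

function cpcRevenue(
  impressions: number,
  clickRate: number, // fraction of impressions that are clicked
  cpcUsd: number
): number {
  return impressions * clickRate * cpcUsd;
}

const impressions = 100_000;
const perImpression = cpmRevenue(impressions, 20);   // $20 CPM  -> $2000
const perClick = cpcRevenue(impressions, 0.02, 1.5); // 2% CTR at $1.50 -> $3000
console.log(perImpression, perClick); // 2000 3000
```

Whether per-click pricing beats per-impression pricing depends entirely on the achieved click-through rate, which is exactly the performance indicator Click2 makes measurable.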

Click2 is a feature of Broadpeak’s Dynamic Ad Insertion solution, available as a service on the company’s SaaS platform broadpeak.io.

In choosing Click2 as the Monetize BaM Award® winner, the judges said: “An innovative application that drives viewers into a more shoppable experience…there is huge room for growth in this area… more intuitive, faster and easier to interact with and incorporating not only advertising products but potentially recognizing product placement. This is an interesting evolution in the interactive advertising arena.”

Elevating the game: Star Media Production Services’ cutting-edge solutions in global golf broadcasting

During the period from October to November 2023, the international technical team of Star Media Production Services provided professional technical services for four of the world’s top golf events in just five weeks. These four events included the PGA Japan ZOZO Golf Open with a total prize pool exceeding $300 million, the ASIA TOUR Macau Golf Open with a total prize pool exceeding $40 million, the Shenzhen Volvo China Open, and the Hong Kong Open. These overseas golf event services spanned four different locations: Japan, Macau, Shenzhen, and Hong Kong. Utilizing three sets of advanced broadcasting equipment, Star Media’s technical team delivered comprehensive technical support, showcasing its robust technical capabilities, logistical expertise, and equipment supply prowess. We provided top-notch broadcasting technical services and production experiences for clients both domestically and internationally, earning unanimous praise from industry peers.

Efficient and high-quality international event solutions

While we’ve had excellent signal production experience in major domestic sports events like the Hangzhou Asian Games and the Chengdu Universiade, production services for home events are relatively straightforward, with fewer logistical challenges compared to overseas events. Overseas sports event technical services, however, involve more complex international logistics and venue equipment scheduling, requiring flexible and innovative production plans and processes. In recent years, Star Media’s sports event technical team has developed efficient and flexible overseas event solutions through continuous refinement, team collaboration, and the accumulation of mature experience.

01 Overseas event equipment configuration and signal production plan

Star Media’s technical team meticulously planned and executed the entire process, ensuring the smooth and secure conduct of each overseas competition and earning high recognition from the PGA and Asian Tour production companies.

For instance, for the PGA Japan ZOZO Championship, the event was meticulously planned by Star Media’s technical team, covering equipment deployment and professional signal production technology services for the entire process. We equipped 15 wired channels and 4 sets of wireless channels, including the setup of three groups of receiving points and the laying of 46 kilometers of camera fiber optic cables. In addition, special camera positions were added, utilizing Toptracer ball-trajectory tracking camera technology to enhance the viewing experience. In terms of signal production, we provided over 30 unilateral signals to Asahi TV, offering a one-stop service for live program signal production.

In the two events of the Asian Tour in Macau and Hong Kong, we respectively equipped 10 wired channels and 3 wireless channels, laying nearly 35 kilometers of camera fiber optic cables and setting up two wireless microwave reception points to ensure the integrity of video signal transmission and control production throughout the event. Furthermore, to ensure smooth collaboration and a good experience for the international production team, we purchased a video matrix for golf events to ensure flexible and reliable signal scheduling. Communication matrices, intercoms, and intercom base stations were also set up to facilitate smooth communication inside and outside the event venue.

02 Adopting high-definition EFP onboard systems

Considering the limitations of international venue equipment scheduling, Star Media’s sports event technical team used a typical cost-effective “EFP series product” at the PGA Japan ZOZO and Asian Tour Macau and Hong Kong stations. This product features TALLY and wireless transmission, matching the characteristics of international golf event broadcast production systems. For the Asian Tour Shenzhen Volvo Golf Open, which is held in China, we deployed our 4K/8K IP OB6 broadcast vehicle as the main director, main audio area, engineering area, and VT edit area. Additional equipment from the Macau event was used to build a second production area outside the vehicle, ensuring the integrity of the production.

03 High-quality audio production technology plan

Star Media Production Services tailored a visual and auditory feast for each event. For the ZOZO Championship in Japan, 416 microphones with high-quality windshields and cabling were fitted across all wired and wireless channels to ensure pure sound quality. For the wireless on-course commentary microphones, high-power amplifiers were added to the original equipment and a second receiving point was set up, paired with a talkback system, to ensure timely and accurate commentary. Combined with audio backpacks on 4G and 5G networks, efficient communication between the venue and the OB area was achieved.

For the Volvo China Open, a half-IP, half-baseband format was adopted for production and transmission, with a Dante system used for audio distribution, allowing the mixing consoles in the main and second production areas to be used to full effect. In addition, ENG cameras with different lenses were deployed, ready at any moment to capture a hole-in-one for VT recording and editing, making the production more exciting.

04 Ensuring the safety and efficiency of international equipment transportation

Star Media's technical team shipped the signal production equipment to Japan a month in advance, making full preparations for the ZOZO Championship stop of the PGA Tour. During the Asian Tour, we coordinated in an orderly fashion, fully arranging equipment and facilities for the Macau Golf Open, Hong Kong Golf Open, and Shenzhen Volvo Golf Open. Throughout transportation, we maintained close communication and coordination with local partners to ensure the equipment arrived at each event site safely, on time, and intact.

05 Perfect logistics system

Star Media's technical team arranged comfortable accommodation, dining, and transportation for the technical staff, creating a good working environment. When emergencies or challenges arose, the team responded quickly and flexibly, ensuring the smooth progress of each event.

In summary, Star Media has successfully completed technical services for multiple top-level golf events in a short period of time on the strength of its technical capability and service level. We will continue to adhere to our philosophy of "professionalism, innovation, and service," aiming to provide high-quality, efficient, and secure technical experiences for more clients with international event service needs, creating exciting, high-quality event coverage for global audiences and contributing to the vigorous development of international sports events.


IABM Technology and Trends Roadmap


Stan Moote, CTO, IABM

The IABM Technology and Trends Roadmap isn't just for industry technologists to use as a reference. IABM has discovered industry execs using it as a starting point for their keynote speeches; product line managers are using it to plot their own products; and corporate board members get a better understanding of where their companies' products sit on the adoption curve, and hence a better grasp of risk vs gross margin. It also assists marketing activities by giving an indication of how best to promote products within M&E as well as adjacent/vertical market areas.

This year's update has seen a few major changes to the technology and trends groupings, which the IABM Roadmap working group felt best portray the condition of the various aspects of the industry. As always, this activity draws on strong industry collaboration between end-users, vendors and competitors alike; it created a lot of discussion, debate and controversy, yet the final outcome is a remarkable example of teamwork.

You can check out the 2024 Roadmap in detail here.

Getting into the details, I like to start with security, as it is super important and still far too often neglected, not due to technology but mainly on the implementation and budget sides. This year we moved from Security Workflows to Security Architectures, as content security is well understood.

Starting with Cloud: last year we discovered that important trends such as microservices could be overlooked, and that the advantages and disadvantages of public and private off-prem cloud operations would not be highlighted, if we kept a single Cloud grouping; hence we broke it down into:

Cloud Services – off-prem (public or private)

Cloud Infrastructure – Virtualization (public, private, hybrid), Microservices

In the CTO breakout discussion at IABM's annual conference, it was clear that Edge Computing is of primary importance, as customers won't be moving huge media content into and between clouds, having it processed, and then moving it back again. The one exception is when a sudden burst of compute power is required. This still fits well within Cloud Infrastructure. That said, one debate the group had was whether to keep Cloud as a grouping or to discretely move the various functions of cloud into each group individually; for example, playout in the cloud would go into the Delivery grouping. But where to put microservices? These are all areas that need serious consideration on both the business and technical sides of operations.

Since RFPs no longer specify remote production, it is simply expected and assumed to be the norm. The main point with remote production is connectivity, whether public Internet, 5G, etc., which is detailed in Transport & Networking. Whether productions are remote or local, many aspects need attention; hence we created a new group called Production.

Compute and Storage continues to gain new areas such as more advanced GPUs, carbon nanotubes, Thunderbolt 3/4, computational storage and quantum computing, much of which is really about infrastructure, which could be on-prem or off-prem. The conclusion was not to focus on super bleeding edge areas such as quantum, as they won't be used within our industry for a while.

Across our many Roadmap calls and emails, it became clear what the best way to deal with the term "cloud" was, and the conclusion was actually quite simple. Because cloud itself takes so many forms (on-prem, off-prem, private, public, etc.), we don't need to focus on the type of cloud; the areas of interest are Infrastructure (i.e. storage, edge, computing and networking) along with Services (i.e. microservices and cloud).

Artificial Intelligence and Machine Learning are definitely taking center stage, beyond constantly improving basics like re-use of archives, closed captions/subtitles, and sports visual recognition. The group decided it best to focus on GenAI/ML, with areas like responsible AI and machine-to-machine alongside some of the newer uses of AI within the industry. Understanding training models, assuring appropriate licensing, and understanding what is real all come under the grouping of Provenance.

With sustainability there is so much greenwashing going on that we decided to move towards Tangible Sustainability; this lets us cover specifics (either happening now or planned) within each area of Create, Produce, Manage, Publish and Monetize.

With technology becoming readily available and less specialized per industry, more market areas are cross-sharing products and services, so we opened up a new category called Vertical/Adjacent Markets in the hope of harmonizing and understanding the different markets.

Production – (Remote/Hybrid/Local)

Remote productions continue to improve and are now typically hybrid productions. Both public and dedicated 5G networks see growing usage for backhauls, IFBs, intercoms, etc. In-studio volumetric production is growing, yet still often requires a learning curve to understand how to match cameras with LED walls. Hybrid productions can play out directly, avoiding many latency issues. Newer single-unit multi-cams with AI lead the way for more automated sports productions. Camera direct-to-cloud capabilities are making production turnaround faster.

Lower tier sports, as well as initial broadcasts from higher tier sports, are early adopters of live Cloud Production (LCP). Some are going directly to air.

When using IP, some newer facilities consider this early adopter, yet others consider it mature. The same can be said of managing production.

Services – (Microservices/Cloud)

Customers are recognizing the value of moving their content to the cloud. It becomes easier to process huge archives to enhance metadata and improve search using AI. Cloud also enables remote workflows and global distribution of content, transcending geographical boundaries. It removes the need for tape system maintenance, upgrades, tape version migration, and physical expansion. It also eases the use of archives to generate personalized FAST channels.

The cost of getting data out of the cloud, and of moving between clouds to get the data closer to the service, may shift CFOs' opinions about wanting to use public cloud services more. Financial management of "pay as you go" cloud costs is a rising area that people are just starting to tackle.

Cloud playout services are mature. Ad insertion for traditional linear is mature; Dynamic Ad Substitution (DAS), however, is still early adopter and quite complex.

Infrastructure – (Storage/Edge Computing/Networks)

Storage speeds have increased dramatically through full saturation of multiple 100GbE front-end network links, maximizing the number of high-bandwidth, low-latency video streams within a scale-out NAS. Networks are becoming smarter with DPUs and CPUs built into the NICs. Compute nodes, physical or virtual, are incorporating XDP, RDMA, and DPDK kernel bypass implementations, greatly increasing their bandwidth capability and enabling servers and VMs to receive and transmit live high-bandwidth flows such as JPEG XS and ST 2110. The ability to nearly saturate 100 Gb/s NICs creates great scaling opportunities in the cloud. Over time, cloud networks and on-prem networks will blur together. Kernel bypass implementations reduce latency, improve throughput, reduce compute requirements, and increase scale. Infrastructure-as-code, along with hyperscalers and efficient cloud-native code, continues to replace dedicated hardware. Where network and conversion hardware is required, real-time functions such as mixing or standards conversion are being integrated directly into the system.

With AI taking a lead role, dedicated AI chips are dramatically improving the speed of learning models. GPU edge computing brings the compute closer to the source data, enhancing AI/ML models and lowering latency, which benefits augmented reality applications. Reduced latency, and more importantly consistent latency (or fairness), is required for interactive gaming and gambling. 5G and edge computing together will enhance immersive tech, virtual reality and gaming applications; JPEG XS Low Latency is a key improvement for GCCG. The trend toward lower and lower egress charges from the hyperscalers makes JPEG XS and higher-bandwidth flows more attractive.
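To put the 100 GbE figures in perspective, here is a back-of-envelope sketch (assumed stream parameters for illustration, not Roadmap data) of roughly how many uncompressed ST 2110-20-style HD streams versus JPEG XS compressed streams a single 100 Gb/s link could carry:

```python
# Back-of-envelope: how many 1080p59.94 streams fit on a 100 GbE NIC.
# Figures are illustrative; real payloads add RTP/UDP/IP overhead.

def stream_rate_gbps(width, height, bits_per_pixel, fps, compression=1.0):
    """Raw active-video bit rate in Gb/s, divided by a compression ratio."""
    return width * height * bits_per_pixel * fps / compression / 1e9

# ST 2110-20 style uncompressed 10-bit 4:2:2 (20 bits/pixel)
uncompressed = stream_rate_gbps(1920, 1080, 20, 59.94)
# JPEG XS at an assumed 10:1 visually lossless compression ratio
jpeg_xs = stream_rate_gbps(1920, 1080, 20, 59.94, compression=10)

link = 100 * 0.95   # leave ~5% headroom on a 100 GbE link

print(f"Uncompressed: {uncompressed:.2f} Gb/s -> {int(link // uncompressed)} streams")
print(f"JPEG XS 10:1: {jpeg_xs:.2f} Gb/s -> {int(link // jpeg_xs)} streams")
```

The order-of-magnitude gap (tens of uncompressed streams versus hundreds of JPEG XS streams per link) is what makes mezzanine compression attractive as egress charges fall.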

Although "lift and shift" is still happening, re-architecture efforts, such as moving from Windows to Linux and into containers, are moving forward. Time stamping rather than frame syncing to minimize latency is a prime example. Another example is code using a traditional file store versus native code that writes to object storage. Native code optimizes both performance and cost, but the re-tooling required is slowing adoption of cloud for various tasks. Re-tooling for cloud also requires security changes.

Gen AI/ML – (Responsible AI/Machine2Machine)

AI models are quite mature for applications such as speech-to-text, storage analysis/de-duplication, etc. Beyond this, natural language processing (NLP) is taking aim at more voice-driven applications to tease out analytic data, since NLP gives a computer program the ability to understand human language as it is spoken and written.

AI subscription management strategies are giving clearer insights into customer retention and loyalty and will hopefully improve churn rates. AI is simplifying news workflows by providing a camera-ready first draft and repurposed pieces for different outlets, which journalists can then edit, confirm sources for, and complete.

Newer custom AI chips are out. These chips are designed for a high volume of low-precision computations, hence requiring less power per cycle, which is exactly what AI inferencing needs. With AI, machines can learn how to optimize workflows by selecting microservices based on speed, cost, quality and efficiency. Since the definition of AI copyright varies drastically from region to region, this will affect technology roll-outs much more than issues such as out-of-country cloud storage, which was much cleaner to control. Note that Gen AI does hallucinate, and also makes errors when trained on dubious data; the responsibilities of using Gen AI cannot be taken lightly.

Tools such as Sora (text-to-video) are at the bleeding edge of generating video using AI. The effectiveness of AI assistance tools is catching on quickly; one trend example is saving time selling ads and quickly filling unused spots. Also within the ad tech space, AI ad generation and moderation, contextual advertising, and deriving audience insights are bleeding edge to early adopter.

Machine learning assistant and support engines used within broadcast facilities are quite mature.

AI creates artificial super-slo-mo versions of standard frame rate video, provides auto-clipping of sports events, enhances archive search, and can automatically translate for content localization. Media examples of AI assists are color correction and synchronizing lip movement for dubs. Expect to see a huge amount of localization via Gen AI over the next year. Tests are underway with avatar news anchors; there is, however, a trust issue associated with this.

Provenance – (LLM/Blockchain/C2PA/Hallucinations)

Generative AI algorithms sort through existing data to create new content; hence the importance of knowing precisely what data the Large Language Model (LLM) was trained on cannot be overstated. The Coalition for Content Provenance and Authenticity (C2PA.org) is using local blockchain to address the prevalence of misleading media online. This promises the ability to track and protect content provenance for consumers, businesses and AI training.

A bleeding edge use of AI is repurposing specific archive material (such as news clips) for reuse or resale. It is bleeding edge not because of the technology, but because of rights: the provenance is often unknown.

In-camera digital signature technology signs images in real time for use with C2PA. Without a reliable way to check where Gen AI content comes from, there is a risk of ethical and legal problems.
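The basic mechanism of binding a capture to a device, so that any later edit is detectable, can be shown with a deliberately simplified sketch. Real C2PA manifests use certificate-based signatures over structured claims; here a SHA-256 hash plus an HMAC with a hypothetical per-device key stands in for that machinery:

```python
# Illustrative sketch only, NOT the real C2PA format: an HMAC over a
# small provenance record stands in for C2PA's certificate signatures.
import hashlib, hmac, json

CAMERA_KEY = b"device-secret"   # hypothetical per-device signing key

def sign_capture(image_bytes: bytes, device_id: str) -> dict:
    """Produce a provenance record binding a capture to a device."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    record = {"device": device_id, "sha256": digest}
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    """Check both the signature and that the image is unmodified."""
    claimed = dict(record)
    sig = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    ok_sig = hmac.compare_digest(
        sig, hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest())
    ok_img = hashlib.sha256(image_bytes).hexdigest() == claimed["sha256"]
    return ok_sig and ok_img

frame = b"raw sensor data"
rec = sign_capture(frame, "cam-01")
print(verify_capture(frame, rec))          # the untouched frame verifies
print(verify_capture(frame + b"x", rec))   # any edit breaks provenance
```

The point of the sketch is the workflow, not the cryptography: signing at capture gives every downstream consumer a way to detect tampering.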

Gen AI can present misleading or false information as fact: this is known as AI hallucination. Several analysts estimated that in 2023 chatbots hallucinated as much as 27% of the time, with factual errors present in 46% of their responses.

Hallucinations in AI-generated images can be considered "AI artifacts". AI for content ideation is being used as a brainstorming tool, with a human always kept in the loop.

Immersive & Imaging – (8K/Audio/XR/QD-OLED)

Venues such as Cosm and Sphere take both immersive audio and video imaging to a completely new level for entertainment venues, including personalized audio per seat. HDR productions are becoming more common. XR studios for news are rolling out quickly. There is hope that the Apple Vision Pro will generate enough interest to come down in cost, generating new business models.

Display technology is beyond bleeding edge with Quantum Dot OLED (QD-OLED) as well as transparent displays. The metameric effect of narrow-bandwidth laser projectors is causing debate, because older and younger viewers see different white points.

Secure Architectures – (Cyber, Zero Trust, Workflows)

Security frameworks and standards (such as Zero Trust) exist. However, they are still not broadly adopted in the broadcast domain, despite the fact that this technology is mature and well used on the IT side of media operations. We see both NMOS IS-10 and OAuth2 becoming common elements of RFP processes.

Going forward, the crucial foundation is that security should be built around each workflow, rather than focusing only on content and data.

C2PA, a standard produced by the Coalition for Content Provenance and Authenticity to certify the source and history of media content, is rapidly gaining traction both as a tool to fight AI fakes and misinformation, and to internally track the sources of delivery content in production and archive workflows. C2PA is also a manifestation of blockchain, which continues to make advances into other areas of media and business management.

Migration to cloud at scale depends on architectures built using infrastructure as code (IaC), which describes the functional parameters of storage, compute and networking. Patterns established, tested and passed through security review as IaC can then be duplicated to run securely at scale, avoiding the human errors that manual construction of these highly complex and large systems can introduce.
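The pattern-duplication idea can be sketched in a few lines. This is not any particular IaC tool; the field names and the toy security gate are illustrative assumptions:

```python
# Sketch of the IaC idea: describe storage/compute/network once as data,
# validate it once, then stamp out the approved pattern at scale instead
# of rebuilding it manually (and error-prone) each time.
import copy

APPROVED_PATTERN = {
    "compute": {"vcpus": 8, "image": "transcoder-v2"},
    "storage": {"tier": "nvme", "size_gb": 500},
    "network": {"ports": [443], "public": False},   # security-reviewed defaults
}

def validate(pattern: dict) -> bool:
    """A stand-in security gate: no public exposure, only port 443 open."""
    net = pattern["network"]
    return not net["public"] and net["ports"] == [443]

def deploy_fleet(pattern: dict, names: list) -> dict:
    """Duplicate identical, pre-validated instances for each name."""
    assert validate(pattern), "pattern failed the security gate"
    return {name: copy.deepcopy(pattern) for name in names}

fleet = deploy_fleet(APPROVED_PATTERN, ["ingest-1", "ingest-2", "ingest-3"])
print(len(fleet))   # three identical instances from one approved pattern
```

Because the security check runs against the pattern rather than each hand-built instance, scaling out does not multiply the opportunities for misconfiguration.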

The security of cloud architectures that dynamically pull containers and Infrastructure as Code (IaC) from third-party repositories must be considered and appropriately mitigated. For example, dependence on open source repositories (GitHub/Bitbucket or similar) can bring undetected flaws, open ports left over from testing, etc., resulting in possible breaches.

CDN leaching continues to be a streaming content security issue. Some newer CDN architectures yield more effective and secure delivery but may not be scalable.

Contribution/Delivery – (Transport/5G/CDN/Public Data/BPS)

Large-scale public and dedicated 5G networks see growing usage for backhauls, IFBs, intercoms, etc. 5G usage in large venues such as stadiums has proven effective for attendees. 5G Standalone (SA) services with network slicing are starting to be rolled out, streamlining mobile TV production. Using the Internet for contribution feeds with various protected protocols is mature. JPEG XS is leading the way.

Provisioning distribution of multiple camera feeds from the same event to streamers in an automated fashion is incredibly challenging.

Streaming is quite mature, especially when archives are moved into the cloud; live streaming works well but is still in the early adopter stage, as is managing streams. Technology "best practices" can help the streaming industry improve profitability. The last mile to the consumer, CDN leaching and scalability are still issues.

ATSC 3.0 transmitters have “distribution of data as a service” to distribute public data and files. Broadcast Positioning System (BPS) is using ATSC 3.0 for geolocation services as an alternative (back-up) to GPS. Both are in early adopter stage.

Monetization efforts on the delivery side include converged TV ad sales in local and national markets; unified measurement solutions to enable alternate currencies; next-gen ad formats including virtual product placement and shoppable video; and self-service privacy-enhanced data collaboration.

Orchestration & Scalability – (Automation, Provisioning, QC, Observability)

As operations and virtual infrastructure become irreversibly dependent on highly complex automation, it has become clear that robust orchestration mechanisms are necessary to attain scale and efficiency and to maintain strong security. A bleeding edge example is calling Infrastructure as Code for architecture set-up, as well as Network as a Service. Multi-cloud operations present an even larger challenge for automation, QC, and observability. Customers want cloud options, but without orchestration and scalability across hyperscalers, multi-cloud operations and disaster recovery are problematic. Orchestration can also call QC services based on monitoring by exception. Technicians commonly complain of not being able to probe or observe inside functions to solve QC issues. Cloud networks are more opaque than on-prem networks, and the lack of precise control of data and signal path flow in clouds exacerbates the observability problem. Standards around metrics, metadata, and logging are needed to optimize QC and observability.
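Monitoring by exception can be sketched as follows: a lightweight probe reports metrics, and the orchestrator dispatches a full QC job only when something falls out of bounds. The thresholds, metric names and channel IDs below are illustrative assumptions:

```python
# Sketch of "monitoring by exception": only dispatch a deep QC job when
# a lightweight probe reports a metric outside its allowed band.

THRESHOLDS = {
    "audio_loudness_lkfs": (-25.0, -21.0),   # loose band around a -23 LKFS target
    "video_black_pct": (0.0, 5.0),           # % of black frames tolerated
}

def out_of_bounds(metrics: dict) -> list:
    """Return the metric names that fall outside their allowed band."""
    flagged = []
    for name, value in metrics.items():
        lo, hi = THRESHOLDS.get(name, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            flagged.append(name)
    return flagged

def orchestrate(channel: str, metrics: dict, qc_queue: list) -> None:
    """Queue a full QC analysis only for flagged channels."""
    flagged = out_of_bounds(metrics)
    if flagged:
        qc_queue.append({"channel": channel, "checks": flagged})

queue = []
orchestrate("ch1", {"audio_loudness_lkfs": -23.0, "video_black_pct": 1.0}, queue)
orchestrate("ch2", {"audio_loudness_lkfs": -17.5, "video_black_pct": 0.0}, queue)
print(queue)   # only the out-of-bounds channel triggers a QC job
```

The design choice is that expensive, detailed QC compute is spent only where the cheap probe says it is needed, which is what makes the approach scale across many channels.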

Resource management ties into infrastructure as code: timing the deployment and management of resources to guarantee scalability.

Orchestration tools can help with sustainability by using more power efficient resources when applicable, even powering down systems when not in use.

Tangible Sustainability – (Power/Cloud/Remotes)

Even though it is much talked about, sustainability is the least important factor driving investment in 2024, according to IABM research. It matters much more to specific types of buyers, such as public broadcasters, for whom it is the third most important driver. There is still a misconception that moving to public clouds automatically improves sustainability. Cost improvement and public pressure are the main drivers of sustainability, and carbon footprints are now typically part of the RFP process.

The move to digital transmitters shows specific maturity in carbon reduction.

Red flags are popping up, as they did with cryptocurrency mining: building massive AI-focused supercomputer data centers will require millions of chips and huge amounts of electricity. No doubt these will be used within media, which again underlines that being in the cloud doesn't make you sustainable if you are using an "AI factory".

"Right-to-repair" laws are encouraging repair of electronic goods instead of replacement, leading to a reduction in electronic waste.

Vertical/Adjacent Markets – (Corp/Education/Government/Social)

Many of the businesses that use A/V equipment outside of M&E were typically only doing teleconferencing in the past. Corporations now have full-blown local and remote studios. Many are totally IP, which is exactly where IPMX provides a great bridge between M&E and other market areas. From a vertical market point of view, Create products are mature, with only a few in the early adopter stage. Similarly with Produce, where the early adopters have higher-resolution productions. With Publish there is a wide range: social is commodity, as is straightforward playout, while some unique things are happening with newer live implementations. New monetization methods draw on various AI undertakings from M&E, whereas promotion endeavors are, as expected, mature. That said, new concepts keep emerging to outstrip the competition.


Emerging Trends – (Digital Twins/Metaverse/Web3.0/Spatial)

Making a complete digital replica (digital twin) of a studio or movie set using 3D laser scanners (LiDAR) improves both the speed of production changes and safety. Although accessibility isn't new, the cost to implement it is much lower, driving more accessibility options.

Spatial computing combines the physical world with a virtual one. We see this with the newer headsets that create immersive environments or overlay virtual elements on physical ones. Technology like the Apple Vision Pro isn't bleeding edge; however, its cost to the general public and limited apps are holding it back from maturity. Massive layoffs at the tech giants make the near future of the metaverse and Web 3.0 uncertain.

New techniques for personalized storytelling and video highlights on the first screen are catching up with social media, transforming TV into a digital-first experience, a key area for engaging Gen Z audiences.