After recently releasing a report on why dynamic analyst relations is pivotal to business success, Platform Communications’ Managing Director, David Lawrence, talks to IABM TV about how AR is changing and why it is more important than ever.
Virtual set technology has experienced a coming of age across film, broadcast, and events applications. The last two years have been a steep learning curve for virtual production technology, forever changing the way content is made.
The pandemic has pushed the world to digitalise almost every industry – from digital video solutions replicating the retail purchase experience, to remote production within the entertainment sector. In this blog Dhaval Ponda, Global Head, Media and Entertainment Services, Tata Communications, discusses the changing landscape of the sports and entertainment industry and the exciting, tech-enabled times that lie ahead.
IABM Adoption Trends reports annually track the adoption of specific emerging technologies within the broadcast and media sector. The purpose of these reports is to enable member companies to better understand the drivers of emerging technologies’ adoption within customer organizations. This should provide member companies with more tools to address the challenges lying ahead, from new product development to marketing strategy. These reports contain a discussion of the state of adoption of the emerging technology in broadcast and media, as well as an analysis of significant customer deployments.
This session will explore how 5G increases data rates both upstream and downstream, and enables data on 5G connections to be managed in a more sophisticated manner than ever before. It will also examine the key role 5G plays in reducing latency: through the development of new standards, it is now possible to cut live video latency to a matter of seconds by using multicast ABR within the CDN to the edge and leveraging edge processing within the network.
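To make the "matter of seconds" claim concrete, a glass-to-glass latency figure can be estimated by summing the delay contributed by each stage of a low-latency live workflow. The stage names and figures below are illustrative assumptions for a sketch, not measurements from any real deployment:

```python
# Hypothetical latency budget (in seconds) for a low-latency live stream.
# All figures are illustrative assumptions, not measured values.
budget = {
    "capture_and_encode": 0.5,   # camera plus low-latency encoder
    "packaging": 0.2,            # e.g. chunked packaging at the origin
    "multicast_abr_to_edge": 0.3,# delivery through the CDN to the edge
    "edge_processing": 0.2,      # per-session processing at the network edge
    "player_buffer": 1.0,        # small client-side buffer for stability
}

total = sum(budget.values())
print(f"Estimated glass-to-glass latency: {total:.1f} s")
```

Under these assumed figures the end-to-end delay lands comfortably within "a matter of seconds"; shrinking the player buffer is typically the largest single lever, at the cost of resilience to network jitter.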
This presentation will explain how 5G creates a new ball game in many domains. The high-speed network capability of 5G will enable UHD experiences, multi-view in HD, as well as high-quality immersive AR and VR video delivery. The low-latency aspect of 5G networks is crucial to allowing high QoE for VR and AR applications. In addition, the session will explore the edge cloud architecture (MEC) supported by 5G, enabling edge processing for use cases such as sports arenas, cloud VR and gaming.
Mo-Sys is a world-leading provider of camera tracking and camera robotics systems, supplying broadcasters such as BBC, Sky, Fox, ESPN, CNN, Discovery Channel, The Weather Channel and Netflix, among many more. With a passion for innovation and design, Mo-Sys is at the forefront of live AR and virtual production technology, with its StarTracker camera tracking system now powering more than 100 virtual TV studios around the world.
Brainstorm is a specialist company dedicated to providing industry-leading real-time 3D graphics, augmented reality and virtual set solutions for all broadcast graphics types and workflows, as well as for feature film production and corporate presentations. With more than 2,500 installations worldwide since its foundation in 1993, Brainstorm’s customer list includes many of the world’s leading broadcasters as well as smaller and regional stations. Brainstorm’s flagship product, eStudio, is considered the industry’s fastest on-air graphics and virtual studio engine. With headquarters in Spain and subsidiaries in the US and Asia, Brainstorm is a company with global reach, committed to innovation.
The Moving Picture Experts Group (MPEG), as one of the main standardization groups dealing with multimedia, is close to finalising the first open standards for compactly representing and transmitting dynamic 3D point clouds. First implementations of the upcoming MPEG standard for video-based point cloud compression (V-PCC) are already available on today’s mobile hardware. This talk presents some of the key features of V-PCC for dynamic point cloud coding and transmission. As V-PCC offloads most of the computational burden onto existing video coding solutions, AR content can be distributed over today’s network infrastructure and real-time decoding can be achieved on essentially any media device on the market.
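The reason V-PCC can lean on existing video codecs is that it projects the 3D point cloud onto 2D image planes, so each frame of geometry becomes an ordinary depth image a 2D codec can compress. The following is a minimal illustrative sketch of that projection idea only, not the actual V-PCC patch-generation algorithm; the function name and resolution are assumptions for the example:

```python
import random

def project_to_depth_map(points, resolution=64):
    """Sketch of the core V-PCC idea: orthographically project 3D points
    (coordinates assumed in [0, 1)) onto a 2D plane, producing a depth
    image that an ordinary 2D video codec could then compress."""
    depth = [[None] * resolution for _ in range(resolution)]
    for x, y, z in points:
        px = min(int(x * resolution), resolution - 1)
        py = min(int(y * resolution), resolution - 1)
        # Depth-buffer rule: keep the point nearest the projection plane
        if depth[py][px] is None or z < depth[py][px]:
            depth[py][px] = z
    return depth

# Turn a random cloud of 1,000 points into a single 64x64 "geometry frame"
random.seed(0)
cloud = [(random.random(), random.random(), random.random())
         for _ in range(1000)]
frame = project_to_depth_map(cloud)
```

Real V-PCC segments the cloud into many such patches over multiple projection directions and packs them into geometry and attribute video streams, but the 3D-to-2D step above is what lets a phone's hardware video decoder do the heavy lifting.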