Miguel Churruca
Marketing and Communications Director, Brainstorm
XR is one of the trendiest and most recent flavours of Virtual Reality, and many understand it simply as the use of virtual backgrounds, displayed on large LED video walls, for movies, drama, or live events. However, the concept of XR (Extended Reality) goes far beyond these applications, which, while often spectacular, do not take advantage of the many possibilities this technology can provide.
As a form of Virtual Reality (VR), Extended Reality (XR) can be defined as the combination of real and virtual environments, following the Virtuality Continuum defined by Milgram and Kishino. According to these authors, between the “real” reality captured by the camera and the generation of completely synthetic images lies a wide range of combinations of real and virtual elements, which generate what is called Mixed or Extended Reality (MR or XR). Therefore, content such as 3D virtual sets, Augmented Reality (AR) or even Immersive Mixed Reality (IMR) fits into the XR concept. In essence, we can consider the virtual scenes or backgrounds displayed on a video wall a virtual set, one that is displayed directly on a physical medium (the LED wall) rather than composited on a chroma set and viewed on a separate screen.
There are many applications in which LED-based XR production outperforms chroma set production, starting with live virtual sets displayed on the LED walls, which can be seen not only by the audience at home but also by the live audience at a physical show or event. Removing the need for green sets and chroma keying can also be quite interesting for film and drama productions when virtual environments are required rather than physical scenery and props. On top of that, we can add in-screen Augmented Reality (AR) content that is shown live, rather than rendering it separately in real time and combining the scene on a separate screen.
However, it is important to point out that LED-based XR technology is not the answer to every possible VR requirement, as in some cases it shows several disadvantages compared to traditional chroma keying. Issues with focus, pitch, moiré, delays and many others should be considered before deciding which method (LEDs or chroma keying) is best for our virtual content production.
When using LEDs as a background or floor, the camera captures the combined image of the background rendering and the characters. However, there is no technical reason why the LEDs cannot be replaced by a chroma set and vice versa, so using one method or the other will depend on the content requirements or user preferences. As mentioned earlier, LED video walls for XR do have significant advantages over chroma sets, the most obvious being that no chroma keying of any sort is needed, so integrating the talent into the scene becomes easier. Also, as LEDs are light emitters, they contribute to the illumination of the scene and the characters while providing realistic reflections and refractions.
On the other hand, when LEDs are used in smaller sets or closer to the camera, the limitations of this technology may become significant. In broadcast, and sometimes drama, environments we are often short of space, and will therefore find smaller LED video walls placed closer to the camera, which can result in moiré artifacts when capturing the scene on camera, depth of field issues, or restrictions on camera movements, for instance. These issues increase when using corner or shaped LED installations. And when using LED floors, it becomes more complicated to integrate the talent realistically into the set, as shadows may need to be added as a separate process, possibly using some kind of talent tracking to know where to place them; remember that LEDs are light emitters, so talent and objects do not cast shadows or reflections on them as they would on a “standard” surface.
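To illustrate the floor-shadow workaround described above, here is a minimal sketch, purely a hypothetical example rather than any vendor's actual implementation, that uses a tracked talent position and an assumed key-light direction to work out where a synthetic contact shadow would need to be composited into the LED floor content (all names and values are made up for illustration):

```python
import numpy as np

def shadow_anchor_on_floor(talent_pos, light_dir, floor_y=0.0):
    """Project a tracked talent position along the key-light direction
    onto the LED floor plane (y = floor_y) to find where a synthetic
    contact shadow should be composited into the floor content.
    talent_pos and light_dir are 3D vectors in studio coordinates."""
    p = np.asarray(talent_pos, dtype=float)
    d = np.asarray(light_dir, dtype=float)
    if abs(d[1]) < 1e-6:
        raise ValueError("Light direction is parallel to the floor")
    t = (floor_y - p[1]) / d[1]          # ray/plane intersection parameter
    return p + t * d                     # 3D point on the LED floor

# Hypothetical usage: talent tracked 1.7 m above the floor, light from above
print(shadow_anchor_on_floor([2.0, 1.7, 3.0], [0.3, -1.0, 0.1]))
```

In a real production this anchor point would then drive a soft shadow blob or a pre-rendered shadow sprite drawn into the floor content by the engine feeding the LEDs.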
So, as said before, XR content can be created using both LED video walls and chroma sets, and both techniques can even be combined to produce the final content. On top of that, users can not only include additional AR elements on top of the talent, but also teleport characters shot live in remote locations and insert them into the scene. Teleporting a talent into an XR scene implies treating the keyed talent as an AR element because, when working in a 3D environment, the talent must be inserted with the correct size and perspective and may need to interact with virtual elements, casting shadows over real or virtual elements or producing reflections, as Brainstorm’s 3D Presenter technology allows.
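As a rough sketch of what treating a keyed remote talent as an AR element involves, the hypothetical example below builds a camera-facing quad whose height matches the talent's real-world height; rendering that quad through the tracked studio camera then yields the correct apparent size and perspective. The function name, the 1.8 m default and the aspect ratio are assumptions, and this is not the 3D Presenter API:

```python
import numpy as np

def talent_billboard(position, camera_pos, real_height_m=1.8, aspect=9/16):
    """Build an upright, camera-facing quad for a keyed remote talent.
    Making the quad real_height_m tall in scene units means the renderer's
    perspective projection automatically gives the talent the same on-screen
    size as a person physically standing at 'position'."""
    pos = np.asarray(position, dtype=float)
    to_cam = np.asarray(camera_pos, dtype=float) - pos
    to_cam[1] = 0.0                              # keep the quad upright
    norm = np.linalg.norm(to_cam)
    if norm < 1e-6:
        raise ValueError("Camera is directly above the talent position")
    to_cam /= norm
    right = np.cross([0.0, 1.0, 0.0], to_cam)    # horizontal axis of the quad
    half_w = 0.5 * real_height_m * aspect
    up = np.array([0.0, real_height_m, 0.0])
    return [pos - right * half_w,                # bottom-left corner
            pos + right * half_w,                # bottom-right corner
            pos + right * half_w + up,           # top-right corner
            pos - right * half_w + up]           # top-left corner
```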
Including AR elements in XR content can be done in different ways, depending on whether these synthetic elements need to be displayed behind or in front of the talent. This is irrelevant when using chroma sets, in which the character is keyed over a virtual background and the AR objects are just additional elements in the scene; in most cases they can be rendered by the same workstation that composes the final scene. When using real scenarios, AR elements must be placed in front of the talent(s), but when using LED-based XR we have the option of adding AR on top of the talent and/or displaying it, in context, behind the talent using the video walls. Since the content is seen from the perspective of the camera, AR elements can be projected onto the LEDs, integrated with the background scene, and they will change perspective accordingly. It is also possible to render these elements in front of the talent, in which case they are rendered by an additional workstation that receives the same tracking data used to display the LED content. This ensures performance and correct perspective matching of the whole composition.
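The in-wall perspective behaviour described above essentially comes down to projecting the element's intended 3D position onto the LED wall through the tracked camera. The following sketch, a simplified assumption that models the wall as a single flat plane, shows the basic ray/plane intersection involved (names and figures are illustrative, not tied to any particular XR engine):

```python
import numpy as np

def project_onto_wall(camera_pos, element_pos, wall_point, wall_normal):
    """Find where an AR element should be drawn on the LED wall so that,
    seen from the tracked camera position, it lines up with the element's
    intended 3D location. The wall is modelled as an infinite plane defined
    by a point on it and its normal."""
    c = np.asarray(camera_pos, dtype=float)
    e = np.asarray(element_pos, dtype=float)
    p0 = np.asarray(wall_point, dtype=float)
    n = np.asarray(wall_normal, dtype=float)
    d = e - c                                   # viewing ray from camera to element
    denom = np.dot(d, n)
    if abs(denom) < 1e-9:
        raise ValueError("Viewing ray is parallel to the LED wall")
    t = np.dot(p0 - c, n) / denom
    return c + t * d                            # point on the wall plane

# Hypothetical usage: camera at eye height, element 2 m behind a wall 4 m away
print(project_onto_wall([0, 1.6, 0], [1.0, 1.8, 6.0], [0, 0, 4.0], [0, 0, -1.0]))
```

Because the projection depends on the camera position, re-evaluating it with every tracking update is what makes the in-wall AR shift perspective correctly as the camera moves.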
It is worth noting that the setups and workflows mentioned above involving AR require careful management of delays and other issues, such as depth of field, to ensure that the AR elements are correctly placed in the scene and consistent with the talent and the camera views. And, in order to ensure seamless integration between the AR elements and the background, they should be rendered using advanced rendering techniques. These include not only game engines like Unreal Engine, but also PBR (Physically Based Rendering) and even real-time ray tracing, which take lighting, object interactions and so on into account, if we need the final result to be indistinguishable from reality.
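As a very simplified illustration of the delay management mentioned above, the hypothetical sketch below buffers camera tracking packets by a fixed number of frames so that the AR render stays aligned with the later-arriving camera video and the content already displayed on the LED wall. The three-frame offset is an arbitrary assumption; real systems measure and genlock this offset rather than hard-coding it:

```python
from collections import deque

class TrackingDelayLine:
    """Buffer camera tracking packets for a fixed number of video frames so
    that renders driven by them stay in sync with the (later-arriving) camera
    video and the content already shown on the LED wall."""
    def __init__(self, delay_frames=3):          # assumed end-to-end offset
        self.buffer = deque(maxlen=delay_frames + 1)

    def push(self, tracking_packet):
        """Store the newest packet and return the one delayed by N frames
        (or the oldest available packet while the buffer is still filling)."""
        self.buffer.append(tracking_packet)
        return self.buffer[0]

# Hypothetical usage: per-frame tracking dicts with pan/tilt/zoom values
delay = TrackingDelayLine(delay_frames=3)
for frame in range(6):
    packet = {"frame": frame, "pan": 0.1 * frame}
    print(delay.push(packet))
```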
To summarize, XR is much more than just placing virtual backgrounds on a large LED video wall, and it provides a plethora of possibilities for live and virtual production, also in combination with chroma sets if required. Because of all the above, and having been at the forefront of virtual technology for the last 30 years with both chroma and LED/projection environments, Brainstorm has produced the comprehensive white paper Brainstorm Guide to Understanding XR, which clarifies concepts about XR technology, its applications, workflows, advantages, and disadvantages.
The Brainstorm Guide to Understanding XR can be downloaded from here: https://xrguide.brainstorm3d.com.