France Télévisions once again chooses the French Open to test the innovations of the television of the future. During this fall edition, France Télévisions, in partnership with the French Tennis Federation, shares its vision of tomorrow's uses through spectacular, unprecedented "Augmented Tennis" experiments in the RG Lab on the site of the international tournament, from September 27 to October 11, 2020.
For this 2020 edition, France Télévisions, together with exceptional technological partners from France and abroad, has imagined and designed an ambitious program based on emerging technologies and functionalities, some developed specifically for the occasion. The aim of these full-scale tests is to imagine the TV of the future and the digital uses of tomorrow, for the benefit of ever more demanding and technophile viewers. Roland-Garros's very high-level technical infrastructure, the profusion of high-quality images and formats, and the incomparable know-how of France Télévisions' technical and production teams provide a playing field conducive to innovation and the development of new functionalities, some of which could be integrated into our future digital products and services.
Throughout the tournament, France Télévisions will offer to explore the following themes using live images from the Philippe-Chatrier court:
– Increasing interactivity: making users the actors of their viewing experience
– Augmenting reality: enriching the viewer's vision with augmented reality
– Increasing quality: reviewing past images in UHD with artificial intelligence
– Increasing speed: testing 5G video transmission and delivery
Smart TV, web, smartphone… France Télévisions is always looking for new ways to innovate and to invent tomorrow's services. To that end, France Télévisions has for several years taken an interest in Augmented Reality (AR), whose fast-growing market will undoubtedly change the way people consume television. With the announced arrival of the first truly mainstream AR glasses, we strongly believe in the potential of this technology as a medium for mass information, education, and entertainment among spectators. France Télévisions is therefore offering this year an unprecedented viewing experience of a Roland-Garros match on an augmented reality headset, foreshadowing what the future of sports broadcasting will look like.
This setup allows real-time information from the Philippe-Chatrier court to be viewed. The HoloLens 2 augmented reality headset displays the live match in a player, and 3D visualizations enhance the experience by displaying data. These data, refreshed in real time, give the casual viewer a better understanding of the action on the court, while seasoned fans can customize the information displayed to further their analysis of the match. Live match data is provided by Hawk-Eye, player information by SMT, and the live video feed from the Philippe-Chatrier court is encoded and delivered by EasyTools and Harmonic. Everything is integrated in real time into the headset thanks to the application created for France Télévisions by Immersiv.io. This application was developed on Unity, a real-time 3D engine originally dedicated to video games but very popular for creating mixed reality applications.
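One practical detail when integrating live data with a delayed video feed, as the headset application must, is that data events have to be held back by the video pipeline's latency, or a "fault" graphic would appear before the serve is seen. The sketch below illustrates the idea under illustrative assumptions; the latency value, event format, and function names are not Immersiv.io's actual code.

```python
# Hypothetical sketch: align live data events with a delayed video feed.
# The latency figure and event structure are assumptions for illustration.

VIDEO_LATENCY_S = 4.0  # assumed encode + delivery delay of the video feed

def display_time(event_time_s):
    """Wall-clock time at which a data event should be rendered."""
    return event_time_s + VIDEO_LATENCY_S

def due_events(events, now_s):
    """Return the events whose display time has been reached, in order."""
    return [e for e in events if display_time(e["t"]) <= now_s]

events = [{"t": 10.0, "kind": "serve"}, {"t": 12.5, "kind": "fault"}]
print(due_events(events, now_s=14.5))  # only the serve is due yet
```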
This experience gives spectators access, at any time, to information about the players without taking their eyes off the game. This includes general data (name, nationality, ranking) as well as real-time data on the match itself (number of successful serves, number of passing shots, faults, score). In addition, 3D visualizations overlaid on the image (trajectory of the ball, its speed, whether it is in or out) accompany the analysis of the game without the viewer missing anything on the live feed. Finally, the application also offers heatmap visualizations of the players' positions when returning serve or hitting passing shots, and of all the places where the ball lands in the service box on first and second serves and aces, making it possible to evaluate the precision of the players' serves and its evolution throughout the match.
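The heatmaps described above boil down to bucketing ball-bounce coordinates into a grid over the court. Here is a minimal sketch of that aggregation, assuming a feed of (x, y) bounce positions in metres with the origin at one corner of the court; the coordinate convention and grid size are illustrative, not Hawk-Eye's actual format.

```python
from collections import Counter

# Official tennis court dimensions (doubles), in metres.
COURT_WIDTH_M = 10.97
COURT_LENGTH_M = 23.77

def heatmap(bounces, rows=8, cols=4):
    """Count ball bounces per grid cell; returns {(row, col): count}."""
    counts = Counter()
    for x, y in bounces:
        col = min(int(x / COURT_WIDTH_M * cols), cols - 1)
        row = min(int(y / COURT_LENGTH_M * rows), rows - 1)
        counts[(row, col)] += 1
    return counts

# Two nearby bounces fall in the same cell; the third lands elsewhere.
cells = heatmap([(2.0, 5.0), (2.1, 5.2), (8.0, 18.0)])
print(cells)  # Counter({(1, 0): 2, (6, 2): 1})
```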
With these tools, spectators can control their own match experience by choosing which data to display. Soon, AR glasses will complement and even gradually replace the smartphone and provide access to this new experience anywhere and anytime. Everything remains to be invented in terms of AR and France Télévisions wishes to continue to be a player in this transformation by pursuing its exploration of possibilities offered by this technology.
Through this demonstration, we want to create a new, even more immersive experience by letting users choose their viewing angle from a selection of available cameras, in addition to the signal produced and directed by France Télévisions' teams.
Although this idea is not new, and several inconclusive tests have already been carried out in the past, in this demonstration it reaches a level of quality that makes it particularly relevant on smartphones and televisions.
Thus, at any time while watching a match, live or in replay, users can decide to leave the produced image (available on the France Télévisions channels) and display the camera of their choice, while keeping the commentated audio from the court. Navigation between the different cameras is instantaneous, with no latency or loss of quality.
This new experience puts users in control of their program and immerses them even further on the tennis court, as if they were there in person.
This experience is available in three versions:
– A TV version
– A mobile version
– A hybrid of the two worlds, in which the smartphone is used to change the views shown on the television, taking advantage of this second screen that follows us even in front of the TV
How does it work?
This experiment was made possible by combining the expertise of several French and American companies, which together developed this prototype.
The live streams produced from the Philippe-Chatrier court and from three additional cameras on the same court, provided by France Télévisions' production teams, are retrieved from the FTV nodal so that the company EasyTools can process the four streams: deinterlacing them, inserting logos, and building a 4K mosaic of the four perfectly synchronized feeds. This mosaic is then encoded and packaged by Harmonic's VOS cloud system running on Intel Xeon technology. The delivered format is low-latency CMAF/DASH, encoded in HEVC.
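The mosaic is what makes latency-free switching possible: all four feeds travel in a single synchronized 4K stream, so "changing camera" on the client is simply cropping a different quadrant, with no new stream to fetch or rebuffer. The sketch below illustrates this under an assumed 2×2 layout of 1080p tiles; the real EasyTools mosaic layout may differ.

```python
# Hypothetical sketch: map a camera index to its quadrant of an assumed
# 2x2 mosaic of four 1080p feeds packed into one 4K (3840x2160) frame.

MOSAIC_W, MOSAIC_H = 3840, 2160

def quadrant_rect(camera_index):
    """Return the (x, y, w, h) pixel rectangle holding camera 0..3."""
    if not 0 <= camera_index <= 3:
        raise ValueError("the mosaic carries exactly four cameras")
    w, h = MOSAIC_W // 2, MOSAIC_H // 2
    col, row = camera_index % 2, camera_index // 2
    return (col * w, row * h, w, h)

# Switching cameras only moves the crop window over the same stream:
print(quadrant_rect(0))  # (0, 0, 1920, 1080)
print(quadrant_rect(3))  # (1920, 1080, 1920, 1080)
```

Since every quadrant arrives in the same already-decoded frame, the switch is a one-frame operation on the player side.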
The stream available in real time is then broadcast via Wi-Fi on multi-camera players developed by VisualOn for Android mobiles, tablets, and boxes.
This new way of consuming live sport should find its audience. Indeed, if the feedback is positive, this new interface could soon, thanks to these tests, be offered to users of our applications during major sporting or cultural events.
In portrait mode
The consumption of mobile video has exploded in recent years and keeps growing. Many platforms, in particular social networks, have adapted their video content to smartphone use in portrait mode; some are even exclusively dedicated to this format. France Télévisions is trying out a vertical gyroscopic video player for mobile devices.
Using a phone in portrait mode has several advantages in terms of comfort and ease of use, especially on the move. However, most broadcast production processes generate 16:9 content, suitable for televisions in landscape mode but not very ergonomic for viewing in portrait mode. With this experiment, France Télévisions proposes to reconcile the world of broadcast with portrait use of the smartphone, without impacting the means of production.
This video player, whose prototype was designed using Unity, a real-time 3D engine, plays the video stream of a Roland-Garros match filmed by a classic 16:9 camera in a 9:16 player. The 16:9 video is cropped to use the entire screen in portrait mode, so the user only sees part of the original image. However, by using the phone's gyroscope or the touch controls, users can move the frame from left to right to follow the action. They can also zoom in or out of the image. The demonstration also displays the smartphone screen on a Samsung "The Sero" rotating TV, to enjoy the gyroscopic player on a large screen in vertical mode. Not all television content will be suitable for this type of player, but in this specific case the judicious choice of a fixed camera aligned with the axis of the court makes it possible to enjoy Roland-Garros matches comfortably and from anywhere.
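The core of such a player is the crop-window arithmetic: a 9:16 viewport slides horizontally over the 16:9 frame, driven by the gyroscope or touch. Here is a minimal sketch of that geometry, assuming a 1080p source; the values and function names are illustrative, not the Unity prototype's actual code.

```python
# Hypothetical sketch of the gyroscopic 9:16 crop over a 16:9 frame.
# The pan value (0..1) would come from the gyroscope or a touch drag.

FRAME_W, FRAME_H = 1920, 1080  # assumed broadcast source frame

def crop_window(pan, zoom=1.0):
    """Return the (x, y, w, h) crop for a 9:16 viewport.

    pan = 0 shows the left edge, pan = 1 the right edge; zoom > 1
    shrinks the window to magnify the image."""
    view_h = int(FRAME_H / zoom)
    view_w = int(view_h * 9 / 16)          # keep the 9:16 aspect ratio
    max_x = FRAME_W - view_w               # rightmost valid position
    x = int(max(0.0, min(1.0, pan)) * max_x)
    y = (FRAME_H - view_h) // 2            # vertically centred
    return (x, y, view_w, view_h)

print(crop_window(0.5))  # centred slice: (656, 0, 607, 1080)
```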
In the future, a specific player will have to be developed and integrated into our mobile services. In addition, artificial intelligence algorithms for image analysis could automate the cropping by detecting, in real time, the best point of view for the viewer.
Last year, France Télévisions had already carried out a first 5G broadcast test of an 8K signal produced live from the Philippe-Chatrier court. This test, very rich in lessons, allowed us to understand the issues related to this new network and its much-advertised, unprecedented performance.
This year, with a little more perspective and better control of the end-to-end network, we have decided to continue testing concrete use cases in collaboration with our partner Orange on its 5G network.
5G video transmission:
This test consists of using the 5G network to transmit video filmed in HD. For this use case, we carried out transmission tests of a video stream shot from a France Télévisions camera (Sony PXW-X400), connected to the Orange 5G network through an AVIWEST AIR 320 5G DMNG transmitter, which aggregates two internal 5G modems to double the bandwidth. The video signal encoded by the live module (20 Mb/s) is routed via the 5G network to the France Télévisions nodal, where an AVIWEST StreamHub 5G IP reception module has been installed. The quality of the network and the good 5G coverage of the Roland-Garros stadium enabled us to successfully carry out tests in several locations selected for their editorial potential. During the tournament, several live shows were produced using this transmission protocol.
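The modem aggregation mentioned above is a form of link bonding: packets are spread across the two 5G links so the usable bandwidth approaches the sum of both. The toy scheduler below shows only the distribution idea; a real bonding transmitter such as a DMNG also handles reordering, retransmission, and per-link congestion, which this sketch deliberately ignores.

```python
# Illustrative sketch of link bonding across two 5G modems:
# a naive round-robin packet scheduler. Real bonding equipment also
# reorders, retransmits, and adapts to per-link congestion.

def bond(packets, n_links=2):
    """Assign each packet to a link, round-robin; returns one queue per link."""
    queues = [[] for _ in range(n_links)]
    for i, pkt in enumerate(packets):
        queues[i % n_links].append(pkt)
    return queues

# A 20 Mb/s stream split over two modems carries ~10 Mb/s per link.
links = bond(list(range(10)))
print(links)  # [[0, 2, 4, 6, 8], [1, 3, 5, 7, 9]]
```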
These tests, which proved conclusive, will now continue with transmissions of 4K images and, looking ahead to the Paris 2024 Olympic Games, 8K images, which are far more bandwidth-intensive.
5G reception of a Multi-Camera interface
This experiment, carried out in partnership with Orange, lets viewers choose their camera through a dedicated interface developed by the company Vogo. This interface was showcased during the fortnight on OPPO Find X2 Pro smartphones present at Roland-Garros, via the Orange 5G network.
For this test, Vogo collected the streams produced by France Télévisions' production teams, encoded and packaged them with its VogoBox, and pushed them onto the Orange 5G network for the OPPO mobiles. The Find X2 Pro is a 5G mobile from OPPO released in March 2020, featuring a Qualcomm Snapdragon 865 SoC backed by 12 GB of RAM.
The purpose of this test was to validate reception quality and the fluidity of images on mobile devices over the 5G network, with a view to a future mass-market offer in stadiums.
5G reception of an 8K 360° stream
In 2018, France Télévisions had the opportunity to test, in its RG Lab, the very first 8K cameras manufactured by Japan's Sharp on the Philippe-Chatrier court. In 2019, these same cameras allowed us to test, as a world premiere, the transmission of live 8K images over 5G.
This year, with Orange and Idviu, the tests focus on live capture and transmission over the 5G network of 4K and 8K 360° images, in order to guarantee good transmission and reception quality:
– 5G acquisition of a 4K or 8K 360° video stream captured by the cameras at these two positions
– 5G streaming of a 4K or 8K video stream from a streaming server located in the cloud to an OPPO 5G mobile
To do this, the company Idviu installed two 360° cameras on the Suzanne-Lenglen court, which recorded the matches.
This experiment also enabled France Télévisions to test the conversion of 360° images to the 16:9 format.
How does it work?
Idviu retrieves the 8K 360° image coming from the Suzanne-Lenglen court over 5G. This image is then split into four 16:9 streams, which pass through the France Télévisions nodal before being sent on to an OB van. These streams are then fed into the van's mixer, where they can potentially be mixed with the images of other live sources.
For France Télévisions, 5G also means testing 5G Broadcast
Broadcast tests of France Télévisions channels in 5G on the UHF band are being carried out with the company Towercast in Paris and its suburbs. This technology from the telecom world makes it possible to reach mobile terminals through a broadcast stream. 5G Broadcast holds the promise of a complementary linear experience, seamless and independent, relying on broadcast transmission towers. These tests will also be intensified during the fortnight.
The technological gap
Over the past 10 years, a real technological gap has opened up between the very fast increase in screen definition and the stagnating definition of the content we watch. And the gap is widening: while it is becoming almost impossible not to buy a UHD TV, even though UHD content is far from plentiful, 8K UHD screens are already here, ready to invade our living rooms! The main reasons are simple: on the one hand, UHD versions simply do not exist for most past and present video productions; on the other, the main transmission networks (terrestrial DTT, to name just one) do not have sufficient bandwidth for UHD.
A high-performance compression algorithm
For Roland-Garros, France Télévisions is teaming up with Intel for a world-premiere video transmission using the latest open-source codec, AV1. This codec, released in 2018 by the Alliance for Open Media, allows a reduction of more than 50% in the bitrate required for quality equivalent to H.264. The point of this demonstration is to show that the technical ecosystem is maturing for the deployment of this codec, making it a credible candidate for the distribution of very-high-definition content up to 8K. This maturity is essential, because one of the biggest obstacles to its use is the computing power required for encoding (for live) and for decoding on the viewer's receiver (TV, mobile, PC, etc.). Intel demonstrates here that its Intel® Xeon® Platinum server processors are powerful enough to software-encode live video in AV1. Intel has also built AV1 decoding directly into its latest 11th-generation Intel® Core™ consumer processors. Playback is performed with the latest version of the open-source VLC player, which takes full advantage of the hardware acceleration built into these processors.
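To make the stated gain concrete, here is some back-of-the-envelope arithmetic on the "more than 50% bitrate reduction at equivalent quality" claim. The reference H.264 bitrates below are ballpark assumptions for illustration, not figures measured in this demonstration.

```python
# Rough arithmetic on the article's AV1-vs-H.264 claim.
# Reference H.264 bitrates are illustrative assumptions, not measurements.

H264_MBPS = {"1080p": 8, "4K": 32, "8K": 100}
AV1_REDUCTION = 0.5  # "more than 50%" per the claim above; 50% used here

def av1_bitrate(resolution):
    """Estimated AV1 bitrate (Mb/s) at quality equivalent to H.264."""
    return H264_MBPS[resolution] * (1 - AV1_REDUCTION)

for res in H264_MBPS:
    print(f"{res}: H.264 ~{H264_MBPS[res]} Mb/s -> AV1 ~{av1_bitrate(res):.0f} Mb/s")
```

Under these assumptions, an 8K live stream would drop from roughly 100 Mb/s to 50 Mb/s, which is what makes distribution over existing networks start to look credible.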
If, in the future, improved codecs and increased network bandwidth allow everyone to enjoy UHD (or higher) content, what about older SD and HD content?
The evolution of upscaling
How is content processed to be displayed on a screen whose definition is 4 times (and sometimes up to 16 times) higher? Lower-definition content unfortunately appears blurry, especially on ever-larger screens. This is the consequence of basic mathematical upscaling (the technique of increasing the definition of the original video to fit the higher definition of the screen). To summarize, when going from a Full HD definition (1920 × 1080 px) to a UHD definition (3840 × 2160 px), a new pixel must be inserted between every two pixels, with a value that averages the colors of its two neighbors. These basic upscaling algorithms are the ones often found in TVs, and while some give slightly better results than others, the operation always gives an unsatisfactory result.
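The averaging described above can be shown in a few lines. This sketch doubles a single row of pixel values by inserting, between every two neighbours, a pixel that averages them (1-D linear interpolation); real TVs apply 2-D variants of this such as bilinear or bicubic filtering, but the principle, and its blurriness, are the same.

```python
# Minimal sketch of basic mathematical upscaling: 1-D linear
# interpolation that doubles a row of pixels by inserting the
# average of each pair of neighbours between them.

def upscale_row(row):
    out = []
    for a, b in zip(row, row[1:]):
        out.append(a)
        out.append((a + b) / 2)  # the interpolated in-between pixel
    out.append(row[-1])          # keep the final original pixel
    return out

# A sharp 0 -> 100 edge becomes a soft ramp: this is the blur.
print(upscale_row([0, 100, 50]))  # [0, 50.0, 100, 75.0, 50]
```

Note how the interpolated pixels can only ever be averages of existing ones: no detail is created, which is exactly the limitation AI-based upscaling tries to overcome.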
Several solutions exist to improve upscaling. France Télévisions has chosen to test an upscaling algorithm developed by Pixop using Artificial Intelligence (AI).
Pixop's software increases image definition by sharpening and refining details in the image, creating credible details that were not visible because the definition was too low. Pixop developed an algorithm named PIXSR (Pixop Super Resolution) based on machine learning (a branch of AI relying on a trained neural network). Through complex filters, it recognizes different elements of a low-definition image and replaces them with similar elements at a better definition. To do this, Pixop trained the algorithm on an enormous number of low-definition images paired with their high-definition versions, so that it can produce a high-definition response to any low-definition element. The algorithm cannot recover what the element originally was, but it comes up with a credible answer that genuinely gives the impression of higher definition.
Impressive but perfectible results
In our various experiments, the results can be extremely impressive, to the point that you might sometimes believe you are watching natively very-high-definition content.
However, the result varies depending on the type of element to be improved. The algorithm generally works well on common elements such as natural settings but performs less well on faces. Likewise, its effectiveness depends directly on the quality of the original content.
To conclude, the experiment is particularly positive because it opens up very interesting prospects for enriching UHD content offers. To date, this upscaling cannot yet be carried out in real time in our TVs, because the algorithms require too much computing power. In addition, these algorithms are still very young, and many improvements will be made in this area in the months and years to come.