No matter where you are in the modern world, digital video is most likely in every one of your spaces (and if it is not, you are probably carrying it in your pocket by way of a smartphone). What does it take to record, encode, decode, edit and deliver digital video assets and virtual experiences, from live concerts and breaking news stories to sports events, surveillance footage and digital medical records? The short answer is a whole lot of talent and technology.
Building an OTT (over-the-top) encoding ladder can be a demanding job. To satisfy viewers on a wide variety of screen sizes and resolutions, from smartphones to 4K TVs, you will inevitably end up with an encoding ladder that has many rungs. In a typical encoding scenario, every rung, i.e. every step on the ladder, requires its own encoding instance.
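The idea of one encoding instance per rung can be sketched as follows. This is a minimal illustration, not a recommendation: the rung names, resolutions and bitrates below are example values, and the ffmpeg invocation uses only standard options (`-vf scale`, `-c:v libx264`, `-b:v`) as a stand-in for whatever encoder a real workflow would launch.

```python
# Illustrative OTT encoding ladder: one encoder instance per rung.
# Resolutions and bitrates are example values, not a recommendation.
LADDER = [
    {"name": "2160p", "width": 3840, "height": 2160, "bitrate_kbps": 15000},
    {"name": "1080p", "width": 1920, "height": 1080, "bitrate_kbps": 6000},
    {"name": "720p",  "width": 1280, "height": 720,  "bitrate_kbps": 3000},
    {"name": "480p",  "width": 854,  "height": 480,  "bitrate_kbps": 1200},
    {"name": "360p",  "width": 640,  "height": 360,  "bitrate_kbps": 700},
]

def encode_commands(source: str) -> list[str]:
    """Build one ffmpeg command line per ladder rung."""
    cmds = []
    for rung in LADDER:
        cmds.append(
            f"ffmpeg -i {source} "
            f"-vf scale={rung['width']}:{rung['height']} "
            f"-c:v libx264 -b:v {rung['bitrate_kbps']}k "
            f"{rung['name']}.mp4"
        )
    return cmds

# Each command corresponds to one encoding instance.
for cmd in encode_commands("mezzanine.mov"):
    print(cmd)
```

A five-rung ladder like this means five parallel (or sequential) encodes of the same source, which is exactly why ladder depth drives compute cost.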
If you’ve ever developed or created something, you likely realized it is rarely sufficient to test it only by yourself. While “in your lab” everything works as you imagined, when applied to the real world you may quickly find that you missed something, whether because others use your product in a different way than you intended, or because it was tested on a fast computer and must now perform on a less powerful system. For this reason, it is extremely valuable to have reliable partners to assist with independent testing and validation.
Blackmagic Design’s DaVinci Resolve Studio is a unique NLE (Non-Linear Editing) solution available for all major platforms (Windows, Linux and macOS), including a version that runs on the new Apple M1 chipset. The number of project rendering profiles already included in DaVinci Resolve Studio is quite impressive. However, there are some limitations, especially on Linux (but also on macOS). The MainConcept® Codec Plugin for DaVinci Resolve Studio was designed to bridge this gap and enable software codecs with encoding profiles that are not natively available.
We talked to Alexander Trubin, the director of the Alma TV unified network management center, about the results of the switch to satellite and the company’s future plans.
As the AoIP debate continues to confuse and delight in equal measure, what is clear is that different scenarios require specific solutions. So is there a solution that encompasses both open standards and existing, proven AoIP technologies, to the benefit of all?
The popularity of OTT broadcasting is really helping to drive the growth of immersive content, and this presents both opportunities and challenges for the broadcast audio world. Mixing in immersive allows the audio engineer to create a sense of envelopment and realism like never before, but as channel count and mix complexity increase, neutral, uncoloured studio monitoring with precise imaging becomes even more important. At Genelec we’ve long been involved with designing monitoring systems that are scalable from stereo to surround to immersive, so here we’ll examine some of the principles of immersive audio, and some of the considerations that audio professionals need to be aware of.
CEDAR Audio is a UK-based company committed to noise suppression, speech enhancement and audio restoration. It has focussed exclusively on these areas for more than three decades and is the recipient of numerous accolades, including an IABM Design & Innovation Award, two Cinema Audio Society Awards and an Academy Award for services to the movie industry.
Spectator sports are most engaging when audiences don’t know what’s going to happen next. For the sports broadcasting industry itself, the playing field in which it operates has undergone many exhilarating changes over the last few years. In many cases, these have been accelerated and amplified by the Covid-19 pandemic. As stadiums and venues once packed with excitement and atmosphere were forced to close, we’ve witnessed the growing significance of over-the-top (OTT) platforms as the digital delivery system for the enjoyment and adrenaline that sports fans around the world have been missing.
There have been many, many words written recently about remote production, and indeed about the wider world of remote working generally, spurred on by its significantly increased use during the pandemic. But remote production didn’t suddenly spring up overnight, either as a concept or a reality. Let’s step back first. We’ve seen huge growth in the use of IP bonding across sports, especially in the last five years: from single-camera streaming to complex, multi-camera productions, often on the move. From the Rugby World Cup, where LiveU technology was used not only to gather content but also as a disaster recovery solution by ITV Sport, to the FIA World Rally Championship, Austrian football and facilitating coverage of the Spanish lower leagues, the list goes on. Accompanying that growth has been the rise of remote production. Why is that? What are the benefits?