Adaptive broadcasting is quickly gaining momentum. Live TV is giving way to new services that not only adapt to the available bandwidth but also receive user feedback and generate and display targeted advertising. However, what spurred the broadcasting revolution was adaptive streaming, a technology that selects the best quality the network can deliver. You no longer need a DVB-T2 antenna, a cable run to your home, or a satellite dish on your balcony to receive such broadcasts: all you need is your favorite gadget, such as a smartphone, a tablet, a laptop, or an in-car multimedia system. These broadcasts rely on a small playback buffer to guarantee content delivery and can even tolerate the user temporarily leaving the coverage area.
That last point is why everybody loves adaptive broadcasting. Users have become accustomed to smooth content delivery in any situation, wherever they are, be it in the kitchen, on the go, or at the summer cottage. Adaptive broadcasting comes in several formats, such as Smooth Streaming by Microsoft, HTTP Live Streaming by Apple, and Dynamic Adaptive Streaming over HTTP by the MPEG group. All of them impose similar requirements: the content at the end device should play smoothly, without freezing, and switch quickly and seamlessly between different profiles. The key competitive advantages here are Quality of Service (i.e. guaranteed delivery) and Quality of Experience (i.e. a good picture even at low bitrates and clear audio even in remote rural areas).
All of these formats are standardized. The standards specify how the manifest (playlist) and content chunks are prepared, how they are delivered, how switching between different quality and bitrate alternatives (profiles) works, what the buffer capacity should be, what video and audio formats should be supported in chunks, and so on. Even if a video complies with these standards, it is necessary to make sure it has made it through the entire delivery chain intact, without artifacts, hangs, freezes, glitches, or pixelation. How is this done? Flexible software solutions for video stream analysis (including live streams and video files) come to our aid here. In this article, I will explain what to look for when choosing such a solution, using the HLS format as an example.
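To make the idea of standards-based validation concrete, here is a minimal sketch (not a production validator) that checks a couple of rules from the HLS specification (RFC 8216) against the text of a media playlist: the file must start with #EXTM3U, a target duration must be declared, and no segment may exceed it.

```python
# Minimal sketch: validate a few HLS media-playlist rules from RFC 8216.
# The playlist text is passed in directly; fetching it is out of scope.

def check_media_playlist(text: str) -> list[str]:
    """Return a list of human-readable problems found in the playlist."""
    problems = []
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    if not lines or lines[0] != "#EXTM3U":
        problems.append("playlist must start with #EXTM3U")
        return problems

    target = None
    for ln in lines:
        if ln.startswith("#EXT-X-TARGETDURATION:"):
            target = float(ln.split(":", 1)[1])
    if target is None:
        problems.append("missing #EXT-X-TARGETDURATION")

    for ln in lines:
        if ln.startswith("#EXTINF:"):
            duration = float(ln.split(":", 1)[1].split(",")[0])
            # RFC 8216: each segment duration, rounded to the nearest
            # integer, must not exceed the target duration.
            if target is not None and round(duration) > target:
                problems.append(
                    f"segment duration {duration}s exceeds target {target}s")
    return problems
```

A real analyzer checks dozens of such rules per playlist and repeats them on every refresh, but the principle is the same: parse, compare against the specification, and report deviations.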
The first thing that is assessed when validating OTT traffic is the general condition of the service. The analyzer should support real-time delivery (QoS) and decodability (QoE) analysis and automatic report generation. The results should be presented intuitively, whether using color coding, some kind of general score, or a stream status such as OK/Problem. For ongoing 24/7 visual monitoring, a task list with statuses is a familiar and friendly format. Each status or trigger (event) should have a detailed description to convey a general idea of what is going on.
Each project is unique, and it is important that the analyzer allows flexible configuration of the parameters to monitor. In some cases, a picture freeze is a routine occurrence that should be ignored; in others, a constantly missing audio PID is inherent to the content being delivered. Sometimes, however, the stream needs to be so clean that even a few lost TS packets are a reason for alarm. In these cases, the ability to manually choose which errors to monitor, specify error severity levels, and configure alert settings comes in handy. This is necessary to filter the information the operator gets when working with streams. It is also useful when generating reports, to exclude errors that can be disregarded.
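Such a per-project configuration can be thought of as a small rule set: which error types are suppressed, what severity each one carries, and the minimum severity that triggers an alert. The sketch below is illustrative; the error names and severity scale are assumptions, not taken from any particular analyzer.

```python
# Sketch of a per-project monitoring profile: which error types to watch,
# at what severity, and which ones to suppress entirely.

from dataclasses import dataclass, field

SEVERITIES = ("info", "warning", "critical")

@dataclass
class MonitoringProfile:
    severity: dict = field(default_factory=dict)   # error type -> severity
    suppressed: set = field(default_factory=set)   # error types to ignore
    alert_threshold: str = "warning"               # minimum severity to alert on

    def should_alert(self, error_type: str) -> bool:
        if error_type in self.suppressed:
            return False
        sev = self.severity.get(error_type, "info")
        return SEVERITIES.index(sev) >= SEVERITIES.index(self.alert_threshold)

# Example: a project where picture freezes are expected noise,
# but even a few lost TS packets must raise an alarm.
profile = MonitoringProfile(
    severity={"ts_packet_loss": "critical", "audio_pid_missing": "info"},
    suppressed={"picture_freeze"},
)
```

The same structure also drives report generation: suppressed or sub-threshold errors simply never reach the operator.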
It would be inefficient to hire people just to stare at their monitors all day. An engineer gets distracted by other tasks and often cannot respond quickly to a problem with a stream. Most modern analyzers, including those of the Mosaic View type, can send alerts. These alerts can be configured flexibly depending on the Fault Management system in place and the work schedule of the technical support department.
Such alert systems are often based on SNMP traps, GET requests, or webhooks (e.g. PagerDuty or DataMiner). Traditional methods, such as email alerts, also exist. In addition, smartphone push notifications and instant messages are currently gaining traction. It is convenient when several alert types are supported because you can then choose the one that suits you best.
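Of the alert channels above, a webhook is the simplest to sketch. The example below builds an HTTP POST with a JSON body using only the standard library; the endpoint URL and payload fields are placeholders, since real services such as PagerDuty or DataMiner each define their own schemas.

```python
# Hedged sketch: delivering a stream-status alert as a JSON webhook.
# The URL and payload shape are illustrative assumptions.

import json
import urllib.request

def build_alert_request(url: str, stream: str, status: str, detail: str):
    """Assemble a POST request carrying the alert as JSON."""
    payload = {"stream": stream, "status": status, "detail": detail}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send_alert(req) -> int:
    """Send the request and return the HTTP status code."""
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status
```

SNMP traps and email would follow the same pattern: the analyzer formats the event once, and a thin delivery layer per channel pushes it out.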
It is important that the software be intuitive and user-friendly even for novices. Tips built into the UI are good to have, because they help users learn the tool faster than an extensive manual describing various cases would.
Tools for deep video analysis
For most top-level tasks a problem description, a timely alert, and an action to start the response process (opening a ticket or making a call) would suffice. However, modern solutions offer tools for a deeper analysis of video data.
Diagnostics and preventive measures help avoid severe failures and isolate hidden malfunctions in the service. The more information the analyzer can give about the stream, the more means the technical department has at its disposal to prevent errors from occurring. For example, detailed stream information makes it possible to diagnose playback problems on devices, especially when the device specification is missing along with its requirements for input streams.
There are many possible causes of playback issues:
- An unsupported video or audio format supplied to the packetizer (HLS packager) input
- Invalid content generated by the encoder itself (e.g. an interlaced stream sent instead of a progressive one)
- A playlist that has stopped updating even though chunks are still being generated
- New tags added to an existing media playlist, causing incompatibility with end devices
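One cause in the list above, a playlist that has stopped updating, is easy to detect automatically: in live HLS the #EXT-X-MEDIA-SEQUENCE number grows as old segments fall off the playlist, so if it stays unchanged across polls for much longer than the target duration, the playlist is likely stale. The function names and the "three target durations" threshold below are illustrative assumptions.

```python
# Sketch: detect a live playlist that has stopped updating.

def media_sequence(playlist_text: str):
    """Extract the #EXT-X-MEDIA-SEQUENCE value, or None if absent."""
    for ln in playlist_text.splitlines():
        if ln.startswith("#EXT-X-MEDIA-SEQUENCE:"):
            return int(ln.split(":", 1)[1])
    return None

def is_stale(prev_seq: int, curr_seq: int, seconds_since_change: float,
             target_duration: float) -> bool:
    """Flag the playlist if the sequence number has not moved for too long."""
    if curr_seq != prev_seq:
        return False  # playlist moved forward; the caller resets the timer
    return seconds_since_change > 3 * target_duration
```

A monitoring loop would poll the playlist at regular intervals, feed each snapshot through these checks, and raise the corresponding trigger when staleness is confirmed.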
It is important to have such changes monitored by the analyzer in real time. If, in addition to the stream parameters themselves, the operator also sees how they change over time, it can save considerable time when troubleshooting an error.
In most cases, OTT services insert ads, and this needs to be constantly monitored. This type of analysis involves not only detecting special markers but also monitoring the operation of the devices that insert the ads: volume-level limits should be observed, and there should be no black frames or picture freezes.
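The two checks just mentioned can be reduced to simple signal-level tests once the content is decoded. The sketch below operates on already-decoded data (frame luma samples and audio peak level in dBFS); the thresholds are illustrative assumptions, and decoding itself is out of scope.

```python
# Illustrative sketch: flag black frames and over-loud audio around
# an ad splice. Thresholds are assumptions, not broadcast norms.

def is_black_frame(luma_samples: list[int], threshold: int = 16) -> bool:
    """A frame counts as 'black' if its average luma is at or below
    the assumed video black level."""
    return sum(luma_samples) / len(luma_samples) <= threshold

def audio_too_loud(peak_dbfs: float, limit_dbfs: float = -2.0) -> bool:
    """True if the audio peak level exceeds the configured limit."""
    return peak_dbfs > limit_dbfs
```

In practice an analyzer would also require the condition to persist over several consecutive frames before raising a trigger, to avoid false alarms on legitimate fades to black.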
Analyzers can be software- or hardware-based. Each of these types has its own advantages, but cloud services have recently gained popularity: they are attractive because they do not take up rack space and can be scaled easily. Hardware solutions retain their stable market position and work in a set-and-forget fashion, although they are often hard to upgrade and maintain.
Startup time, supported operating systems, conditional access, self-diagnostics, task grouping and resource consumption monitoring—these are the key factors for choosing a software-based video data validation and analysis system in OTT.
Sometimes no single solution, however feature-rich, is enough. A device may reject an input stream even though the monitoring tool has detected no problems, or the analyzer may detect events falsely. In these cases, it is helpful to have one or more third-party live analyzers on hand, or to record the stream and analyze the resulting files (playlists and chunks) afterwards. The ability to record the stream plays an essential role in such situations.
Sometimes it is necessary to check for picture matching (frame alignment) between profiles as well as analyze the GOP structure in a chunk. For example, in the case of HLS traffic, it is vital that each fragment of a single profile begins with a reference frame from which stream decoding can start. In addition, the picture should be identical between profiles—this will allow switching between them seamlessly if the available network bandwidth changes.
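The two requirements above can be sketched as checks on already-extracted frame metadata: each fragment must begin with a frame a decoder can start from (an IDR frame in H.264 terms), and fragment boundaries must line up in time across profiles. Extracting the frame types and timestamps (parsing the NAL units) is out of scope here; the function names are illustrative.

```python
# Sketch of the two alignment checks on extracted frame metadata.

def fragment_starts_with_idr(frame_types: list[str]) -> bool:
    """frame_types: e.g. ['IDR', 'B', 'B', 'P', ...] for one fragment."""
    return bool(frame_types) and frame_types[0] == "IDR"

def profiles_aligned(boundaries_a: list[float], boundaries_b: list[float],
                     tolerance: float = 0.001) -> bool:
    """Compare fragment start timestamps (in seconds) of two profiles;
    they must match pairwise within a small tolerance."""
    if len(boundaries_a) != len(boundaries_b):
        return False
    return all(abs(a - b) <= tolerance
               for a, b in zip(boundaries_a, boundaries_b))
```

If either check fails, a player switching profiles mid-stream will either be unable to decode the new fragment or will show a visible jump in the picture.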
Technology does not stand still. TV services strive to improve their quality, stability, and compatibility with all end devices, content distribution media, and the methods of interacting with them (CDN and middleware). The constant influx of new market players pressures competitors to be proactive and leaves no room to wait for an issue to occur. Even simple, rare issues can seriously damage a reputation, because social media will quickly spread the word about a vendor's or operator's mishaps.
I encourage everybody to keep up with the times and stay ahead of possible problems, using tools that help prevent or isolate them in a timely manner.
Founded in 1988, Elecard is a leading provider of software products for encoding, decoding, processing, receiving, and transmitting video and audio data in various formats. The main groups of products produced by Elecard include professional software products and software development kits (SDKs); products for in-depth, high-quality analysis and monitoring of media content; solutions for IPTV projects, digital TV broadcasting, and video streaming; and transcoding and video-on-demand servers. Elecard also offers easy-to-use, full-featured end-user programs for fast, high-quality multimedia editing, processing, conversion, and playback.
Currently, the Elecard team is working with both widespread, in-demand formats (MPEG-2, MPEG-4, H.264/AVC, etc.) and new formats that are rapidly gaining popularity, such as HEVC/H.265, VP8, VP9, AV1, VVC, and EVC. Our products are highly valued and widely used by IT industry leaders such as Intel, Cisco, Netflix, and Blackmagic Design.
One of our key advantages is that, being an integrator, Elecard can supply everything the customer needs in one place: reference designs, software, components, and technical support.
The application area of the company’s technologies is very wide and includes software solutions for PC and mobile platforms; security and video surveillance systems; terrestrial, cable and satellite broadcasting; advanced real-time and offline transcoding and professional video quality monitoring.