A Zero Friction Approach to Adaptive Monitoring and Optimizing Assets

Maximizing OTT monitoring and how the cloud fits in

Consumers do not care how content gets delivered to them. They want to watch what they want, where they want. But they do expect a certain level of quality and performance and will quickly move on to the next provider if this is not delivered.

Satellite, cable, and telco operators are increasingly using OTT delivery to supplement and even replace traditional media delivery methods to engage and maintain viewers. But to maintain a high quality of experience for their customers, operators need a way to monitor hundreds, sometimes thousands, of channels and signal points without compromising real-time error detection. In most cases, the immense scale of their service offerings makes continual visual monitoring of all streams impossible.

The move to OTT requires service providers to deploy a much more dynamic infrastructure: one that can scale on demand as viewing surges during peak times, enables the launch of event-specific and time-limited channels, and allows infrastructure to be built on the fly. The business advantages are great and multi-faceted, but taking full advantage demands a comprehensive, agile and operationally sophisticated monitoring solution to match.

No one builds out an IP-based facility or transforms their production and distribution base to operate in the cloud without banking on the economic benefits that ensue. But there's no sense trying to capitalize on these opportunities if best-in-class technologies are not leveraged and proper business models don't follow suit.

The emergence of the cloud into the media production and delivery space has pushed the broadcast and media industry toward an entirely new approach to acquiring and deploying technology. Large capital expenditures (CapEx) are increasingly being replaced by operating expense (OpEx) budgets that are more flexible and aligned with the operational requirements of today’s broadcast facilities.

And in addition to the migration to the cloud, another revolution is taking place in parallel: the adoption of flexible, floating software licenses and the business models that finally allow broadcasters to maximize asset utilization.

An asset typically sits in a rack unused 60-70% of the time. The same inefficiency applies to cumbersome and convoluted licensing practices that require a specific license for each feature. For example, a customer may have to purchase UHD licenses that only get used for an occasional project. It's an extremely expensive and highly inefficient model, especially when there is an alternative at hand.

Adaptive Monitoring

Enter Adaptive Monitoring: a mechanism that allocates resources where they are needed, reducing the level of monitoring and analysis elsewhere and giving content owners the freedom to scale. In conventional monitoring deployments, the cost of licenses and compute power for full-time monitoring would place a ceiling on the number of points that could be monitored. But with Adaptive Monitoring, operators can mix and match different monitoring modes and have the agility to balance CPU resources against their need to monitor streams in real time. With the freedom to implement different monitoring modes within a single deployment, operators can take advantage of automated and adaptive resource allocation to get the most value from their available server resources. The result is a system that matches the cost of monitoring to the value of the content at each node in the delivery chain, minimizing costs while maximizing the breadth and depth of monitoring.

Adaptive Monitoring allows monitoring in any of three operating modes—full monitoring, light monitoring, and extra-light monitoring—on a per-input-source basis. Using thresholds set by the operator within the software or triggered by an API connecting the system to external devices monitoring the overall ecosystem, the system automatically adapts to ensure optimal monitoring of all streams at all times.
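
As a simple illustration of how such mode selection might work - a sketch only: the three mode names follow the text above, but the thresholds, types and field names are invented assumptions, not any vendor's implementation:

```typescript
// Sketch of threshold-driven mode selection across the three modes named
// above. All thresholds, types and field names are invented for illustration.
type Mode = "full" | "light" | "extra-light";

interface StreamState {
  id: string;
  errorActive: boolean;   // set when the probe detects a fault on this stream
  audienceShare: number;  // 0..1, e.g. supplied by an external system via API
}

function selectMode(s: StreamState, cpuLoad: number): Mode {
  if (s.errorActive) return "full";          // faults always escalate to full monitoring
  if (cpuLoad > 0.85) return "extra-light";  // shed analysis depth under CPU pressure
  return s.audienceShare > 0.1 ? "full" : "light";
}

// A fault on a lightly watched channel still forces full monitoring:
console.log(selectMode({ id: "ch-042", errorActive: true, audienceShare: 0.01 }, 0.5)); // "full"
```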

While Adaptive Monitoring is invaluable in optimizing monitoring using on-premises hardware, it yields even greater benefits for cloud-based operations. The ability to dynamically change utilization of instances based on need at any given time can dramatically reduce the operational costs of cloud processing. Moving away from physical hardware, operators no longer need to scale their equipment and infrastructure to support maximum channel capacity—or leave hardware unused during non-peak times. The combination of Adaptive Monitoring and cloud-based processing resources allows operators to move toward a more economical pay-per-use model in which they can scale instances to match their need.

Whether processing takes place on-premises or in the cloud, Adaptive Monitoring ensures that if the system detects a problem on a channel, that channel is automatically switched to full monitoring mode. The dynamic nature of this model makes this an ideal solution for the many operators that need efficient high-density probing and monitoring of OTT channels without compromise.

A Zero Friction Approach

Imagine an OpEx model where the broadcaster only pays for the time its products are in use. As a production facility in London comes offline, for example, its product licenses can be switched off or reassigned to its OTT or playout division anywhere around the world without incurring any penalty for doing so. This ability to move quickly from application to application can be described as 'zero friction'. A zero-friction business model allows for product deployment wherever and whenever it is required, in turn rocketing asset utilization up to 80 and 90 percent.

Thanks to the processing speed and data throughput of COTS hardware, customers now have the technology they need to manage media across their entire organization. Software running on a common platform provides a whole host of functionality to further improve resource utilization.

Building on this, flexible software licenses can now enable or disable functionality within a single software distribution. These licenses further improve flexibility, as they can be purchased on a pay-as-you-go basis. Broadcasters want the freedom to allocate their licenses not only where and when they choose, but to whatever product, feature and function they desire, maximizing flexibility.
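
One way to picture such a floating-license pool is sketched below. This is a minimal model under assumed semantics: the pool class, feature names and site names are all invented for illustration, not a vendor API.

```typescript
// Minimal sketch of a floating-license pool: a fixed number of seats per
// feature, checked out and returned by any site worldwide. All names invented.
class LicensePool {
  private free = new Map<string, number>();
  private held = new Map<string, string[]>(); // feature -> current holders

  constructor(seats: Record<string, number>) {
    for (const [feature, n] of Object.entries(seats)) {
      this.free.set(feature, n);
      this.held.set(feature, []);
    }
  }

  checkout(feature: string, site: string): boolean {
    const n = this.free.get(feature) ?? 0;
    if (n === 0) return false;               // pool exhausted: wait, no penalty
    this.free.set(feature, n - 1);
    this.held.get(feature)!.push(site);
    return true;
  }

  release(feature: string, site: string): void {
    const holders = this.held.get(feature) ?? [];
    const i = holders.indexOf(site);
    if (i >= 0) {
      holders.splice(i, 1);
      this.free.set(feature, (this.free.get(feature) ?? 0) + 1);
    }
  }
}

// London goes offline; its UHD seat is released and picked up elsewhere.
const pool = new LicensePool({ uhd: 1 });
pool.checkout("uhd", "london");
pool.release("uhd", "london");
pool.checkout("uhd", "playout-apac"); // true: same seat, new site, no new purchase
```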

Performing automated analysis of video and data on thousands of signals while keeping costs down is made possible by sophisticated Adaptive Monitoring and optimized by the agility of a zero-friction business model.

www.tagvs.com

Cloud-based technology taking weather by storm

Ragnvald Moberg

Vice President of Media, StormGeo


For decades, television stations, news channels and streaming services have had no choice but to rely on limited technologies to deliver the graphics that support the weather story. Finally, there's a solution that removes the complex technical and logistical difficulties and eases workflow, future-proofing the weather forecasting industry against rapidly changing technology and unexpected global events.

Hardware Has Been Holding Weather Back

One of the most problematic issues facing weather programs is the hardware that has long been a necessary part of broadcast. With monolithic weather solutions, specific dedicated equipment and expensive on-premises infrastructure, forecasters have relied on an awkward, overgrown setup to provide their service.

The hardware solutions that have been the mainstay of weather forecasting since the 1990s are fast becoming historic relics. They are costly, complex to run, and carry an ever-increasing maintenance burden.

A Fragile Workflow

The classical systems require faultless integration between graphics, hardware and data sub-deliverables. From an operations perspective, there is potential for failure when multiple disparate elements are expected to perform seamlessly. When issues occur, the complexity of systems subcontracted to diverse and interdependent players makes it difficult to determine where the system has failed, and who is responsible for resolving the problem.

Data Challenges

Inflexibility within the system and the dataflow makes traditional weather forecasting setups difficult to modernize. Specific skills and competencies from independent subcontractors mean that problems arise wherever platforms are not fully aligned, so continuous development is extremely complex.

With dated tech and bulky processes, it's little wonder that mainstream broadcasters can update weather graphics only every five to eight years. Yet they risk losing out to competitors as expectations for user experience grow.

Add a Global Pandemic…

These problems have long been inherent in an increasingly outmoded system, limiting the flexibility and dynamic potential of the industry. But the Covid-19 crisis of 2020 has thrown the difficulties into sharp focus. Existing vulnerabilities were exacerbated as weather anchors were forced to deliver forecasts away from the studio environment, making technical support even more complex than usual.

A Solution on the Horizon

The traditional weather system is ripe for an update. Fortunately for the weather forecasting industry, there’s now a simple, effective solution that addresses the current challenges and takes weather forecasting forward.

StormGeo Studio is a new model for the weather forecasting industry that completely removes the need for a complex system, and replaces it with an intuitive, all-in-one solution that future-proofs weather forecasting for a rapidly changing technological age.

Cloud Technology, Local Capability

StormGeo Studio is a system that uses a hybrid infrastructure. Built on well-known, standardized technologies, the software is easier to operate, maintain and develop. Because the cloud service runs in the browser and rendering happens locally, all that is required is an ordinary computer.

In simple terms, the traditional hardware and intricate support network become obsolete, fully replaced by a PC or Mac. StormGeo Studio removes the need for specialist support capabilities by using standard, accessible technologies such as JavaScript, HTML5 and WebGL. Graphics are overlaid on any stream, be that over-the-top (OTT) or fixed-line broadcast. It also provides a user-friendly interface for meteorologists to edit forecasts for any location at short notice, without needing to change individual templates.

This revolutionary hybridization of both local and remote resources creates effective and extremely robust performance capabilities. The process is fast and simple.
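
In outline, locally rendered overlay graphics can be as simple as a canvas layered over a video element. The sketch below uses only standard browser APIs; the element IDs and the forecast data shape are assumptions for illustration, not StormGeo's actual code:

```typescript
// Minimal sketch: weather graphics drawn locally over a video stream with a
// 2D canvas. Element IDs and the forecast shape are invented assumptions.
const video = document.getElementById("stream") as HTMLVideoElement;
const canvas = document.getElementById("overlay") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

interface Forecast { city: string; tempC: number; }

function drawOverlay(forecasts: Forecast[]): void {
  canvas.width = video.clientWidth;   // keep the overlay aligned with the video
  canvas.height = video.clientHeight;
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.font = "24px sans-serif";
  ctx.fillStyle = "white";
  forecasts.forEach((f, i) => {
    ctx.fillText(`${f.city}  ${f.tempC}°C`, 20, 40 + i * 32);
  });
  requestAnimationFrame(() => drawOverlay(forecasts)); // redraw each frame
}

drawOverlay([{ city: "Bergen", tempC: 9 }, { city: "Oslo", tempC: 6 }]);
```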

No Specialist Hardware

With only a computer required, StormGeo Studio is a platform-agnostic system - one capable of full operation from any platform where Google Chrome or a headless browser is available.

StormGeo Studio’s agnostic nature makes it easy to introduce, operate and get on air in a very short time for clients in all tiers, including content providers for whom streaming is a new or growing platform. Producers and weather talents can use a well-known interface, with no need for specialist training.

With StormGeo Studio, staying up to date is no longer a logistical nightmare. Updates are versioned and easily accessible through the browser, removing the need to make allowances for the currency of local hardware or operating systems. The entire system is simplified and easy to maintain on a day-to-day basis. Continued development becomes a rapid and effortless procedure.

The trends all point towards further growth in IP-based and web-platform software solutions, rather than the monolithic setup that has been the mainstay of the industry. StormGeo Studio enables weather forecasting to move with the times, available not just to tier-1 broadcasters, but also to digital and OTT publishers.

Content-as-a-Service Model

Moreover, StormGeo Studio is a Content-as-a-Service (CaaS) model, with graphics, weather intelligence and playout bundled in one solution and provided with subscription options to suit forecasters’ needs.

The subscription model not only gives clients control over costs depending on their use requirements; by removing reliance on subcontractors, it also lets the focus fall on disseminating high-quality weather content.

A CaaS system is set to be the future for weather forecasting services across the board. The StormGeo CaaS service (built together with Singular.Live) makes advanced graphics, such as interactive overlays, available to audiences across many channels.

As a category, subscription services are already extremely popular on well-known platforms such as YouTube and social media. StormGeo Studio will expand the model into weather forecasting, driving a leap forward in the way the weather story is told, with graphics, overlays and images providing the best possible viewer experience.

Reliability in Ever-Changing Times

With the difficulties experienced in broadcasting during 2020, and continued uncertainty around Covid-19, not to mention the looming climate crisis, broadcasters need a solution to safeguard service provision, whatever the future holds. StormGeo Studio’s introduction of a robust and flexible system could not be timelier.

StormGeo Studio ensures straightforward, reliable performance for studio and outdoor broadcasts, supporting and easing production for operations and weather talents, and keeping viewers engaged, even during unprecedented events.

And as consumer expectations continue to grow, the viewing experience can be enhanced to add tailored overlays and real-time interactivity.

StormGeo Studio is the ultimate service for the weather forecasting of the future. It replaces an outdated, unwieldy, and expensive operation with a future-proof, flexible, cost-effective service that offers reliability and an exciting user experience.

You can try StormGeo for free: https://www.stormgeo.com/products/studio/campaigns/weather-in-a-box/

IP monitoring in complex data transfers

Hartmut Opfermann

Senior Solutions Architect, Technology Consulting, Qvest


Digitalization has triggered a technical revolution among TV broadcasters too: with transfer rates averaging 100 Gigabits per second, huge volumes of data now course through broadcasters' fibre-optic cables. But with the move away from analogue technologies come new challenges for live broadcasts: today, even minor disruptions in data transfer over IP networks can quickly damage a broadcaster's image - and cost money. IP monitoring puts programme providers in the picture.

Challenge with live broadcasts

When millions of people around the world eagerly awaited the first kick-off of the European Football Championship on 11 June 2021, probably only a few were aware of the technical developments behind today's razor-sharp live images. From the moment the ball starts rolling, at the latest, the lines run hot in broadcasting centers around the globe, because TV spectacles such as this major sporting event have long depended on a complicated interplay of digital IT technologies. This is especially true for live transmissions at transfer rates of ten Gigabits per second or more in Ultra High Definition (UHD) quality. During a regular soccer match, several terabytes of data course through the lines of the broadcasting stations - and the broadcasters' network infrastructure also has to deal with "kick and rush".

Even minimal interference - let alone outright interruptions - in the real-time transmission of signals can have a fatal effect on a broadcaster's reputation. Worse still, an involuntary disruption, for example during a commercial break, can really cost money. Unimpaired transmission quality is particularly relevant for crowd-pullers such as major sporting events - before, during and after a soccer match alike. Moreover, the reliability of TV transmissions is just as important as the content, and for live events especially, the transmission technology is a decisive factor if both image-damaging and economically burdensome consequences are to be avoided.

“Ghost match” during data transfer

A typical problem so far has been that the IP technologies adopted from the IT industry require monitoring of the state of data flows so that faults during the transmission of uncompressed live video signals can be quickly analyzed and remedied. Unlike in the streaming environment, where lost data units can usually simply be retransmitted without affecting viewers, packets in a real-time transmission run offside forever. Hardware defects too - for example in a switch, a cable or the laser in a fibre-optic interface - can become a "ghost match" for broadcast IT administrators. This is where IP-based network topologies come into play: in the past, the broadcasting industry used SDI technology to transport a single unidirectional signal over an SDI cable, but with IP-based transmission, multiple bidirectional data streams can be carried over a single cable. Among other things, this enables more camera feeds, higher resolutions, virtual-reality functions and live production directly in the studio or at venues.

IP monitoring: preventing wasted time with the right flow

IP monitoring solutions enable broadcasters to analyze flows in the wide area network (WAN) and thus improve troubleshooting. In practice, two different IP monitoring methods have become established: NetFlow and sFlow.

NetFlow is a technology originally developed by Cisco in which devices such as routers or layer-3 switches export information about the IP data streams within the device via UDP. It is well suited to billing IP traffic on Internet routers. The UDP datagrams can then be received, stored and processed by a NetFlow collector, and the accumulated information can be used for traffic analysis, capacity planning or analysis in the context of quality-of-service strategies.

As a counterpart, sFlow (Sampled Flow) has become established in recent years. This is a packet-sampling protocol designed by InMon Corporation that has found wide acceptance in the networking industry. The decisive difference from NetFlow is that NetFlow exports statistics, while sFlow exports sampled packet headers from which the statistics are generated externally.

sFlow can be embedded in any network device and provides continuous statistics on each protocol (L2, L3, L4 and up to L7), so that all traffic on a network can be accurately characterized and monitored. These statistics can be used for congestion control as well as troubleshooting, security monitoring or network planning. The advantage of sampling is that it reduces the amount of information that ultimately has to be processed and analyzed, keeping the load on the CPU and the data line low.
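
To make the export mechanism concrete, the sketch below is a bare-bones Node.js collector that simply listens for exported datagrams and reads the leading version word. The ports shown are the conventional defaults; a real deployment would of course use a full sFlow or NetFlow decoder.

```typescript
// Bare-bones UDP collector sketch (Node.js). sFlow agents conventionally
// export to port 6343 (NetFlow typically to 2055). Here we only read the
// version word at the start of each datagram - an sFlow v5 datagram begins
// with 0x00000005 - where a real collector would decode the flow records.
import * as dgram from "node:dgram";

const SFLOW_PORT = 6343;
const socket = dgram.createSocket("udp4");

socket.on("message", (msg, rinfo) => {
  if (msg.length < 4) return;                // too short to carry a version word
  const version = msg.readUInt32BE(0);
  console.log(`datagram from ${rinfo.address}: version ${version}, ${msg.length} bytes`);
});

socket.bind(SFLOW_PORT, () => console.log(`listening on udp/${SFLOW_PORT}`));
```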

The future of media is unplugged: how digital transformation can prepare media brands for a new reality

A rapidly changing media industry

There is no doubt media convergence is happening. Audiences for linear television and media are decreasing slowly, and we expect a collapse in the near future as audiences shift rapidly from traditional to connected media. Using radio as an example, in the UK recent data from RAJAR (Radio Joint Audience Research) shows a dramatic 40% drop in listening hours by 14-19 year olds. It is a trend mirrored in linear media across the board, and that shift has consequences for revenue. Ad spend is moving online, away from traditional channels. The latest report from Insider Intelligence, Worldwide Digital Ad Spending 2021, predicts that digital ad spend will reach $455.30 billion this year, following a 15.7% contraction in traditional ad spending in 2020.

As revenues and audiences shift, media groups merge to benefit from economies of scale. This leads to increased media concentration and rationalisation. The borders between different platforms are slowly fading and media brands will have to ask how they will face this convergence by reinventing themselves instead of attempting to extract a little bit more from their existing model until it breaks.

Greater competition

Accessibility of technology is another driver of transformation. Where media brands used to have the best technology, nowadays everyone has access to similar technology at a fraction of the cost. The barrier to producing and distributing content is gone, and everyone is able to reach large audiences. As a result, competition has exploded and new content is sent into the world every second of the day.

Currently, media groups are trying to face the disruption by focusing on lowering costs, but they are not doing so in a joined-up way. They are making incremental changes, such as downsizing technical teams, instead of systemic ones. Eventually, stretching production resources to their limits has an impact on the core of media brands: the quality and quantity of their content production. This cost-cutting ultimately leads to value erosion and damage to brand reputation.

Future-proof media brands

The transformation is not yet over and no one knows where it will end. It's also unclear what business model will become dominant in this new reality.

However, two things are clear:

First of all, media brands will have to get closer to their audiences - that means faster production throughput, faster feedback loops and interaction with the audience, and the need to extend their reach across more platforms.

Live production will remain one of the greatest skills and differentiators for media brands in the future, especially as they continue to recover and rebuild following the impact of the Covid pandemic on live events. It is more difficult to produce, but offers great advantages:

  • Live production creates direct engagement and interaction with the public.
  • Live production saves time.
  • Live production creates an ambience and a "brand voice" that can differentiate media brands from their competitors.

Secondly, automation and robust workflows will allow media brands to focus efforts where they are needed: in building audiences and getting noticed - not in production. Quality and speed will be essential, but will not be enough to build a brand. The struggle is not simply to get content on different platforms, but to get noticed and to engage the audience, and to create continued engagement, thereby instilling brand loyalty. Furthermore, there is a need to produce more content at a faster rate, which can be published on different platforms without compromising on quality.

Media production: unplug or else

Media brands need to be prepared for anything. That means investing in a fixed and robust virtual backbone that gives production teams maximal flexibility. In other words: go digital and adopt "as-a-service" ways of working rather than investing in on-premise hardware. Without that adaptation, media brands are tied to solutions and technology that cannot be adapted to a fast-changing environment. With it, production teams are no longer tied to places and processes, which allows them to react in real time to every new reality the future will bring. Of course, physical control surfaces will remain, but the intelligence and processing will be virtualised and distributed.

This new paradigm forces media brands to compensate for the downsizing of teams by introducing automation wherever possible. This reinvention needs to be natively digital rather than a replication of analogue processes (Digitalise v. Digitise).

The main point of using software is that the production achieves its quality requirements regardless of what media brands do with the result or on which channel they publish it. The right software requires nearly no training, is painless to adopt and is the solution technicians need in the era of the smartphone. It will no longer be necessary to invest in hardware provisioned to handle peak needs (CapEx), since software can scale up and down as needed. This results in cost savings and a cost structure that grows and shrinks with the activity.

There is no doubt that the future of content production will be unplugged. That is why On-Hertz adapts its business models to match the requirements of media brands: offering not only traditional fixed one-off licenses, but also SaaS and on-demand models. That keeps the budget predictable and under control, while also offering the flexibility and scalability that is required.

About On-Hertz

Audio production unplugged

On-Hertz brings media brands closer to their audience. We allow creators to go from idea to high-quality content faster, live or pre-recorded, from niche podcasts to shows with millions of fans. On-Hertz's audio-first software suite empowers you to break free from the limitations of legacy hardware environments. We build solutions and apps that are easy to adopt, a breeze to use, and offer full control over your final product.

On-Hertz is where the digital transformation of your media brand starts.

  • Grow your audience by tapping the potential of live content
  • Produce more content at superior quality
  • Simplify and speed up your production process

Faster, better, more: the evolution your media brand needs.

Interactive Live Streaming: How to Engage Audiences Worldwide

Over the past year, we've seen video communication on an unprecedented scale, from remote work to virtual conferences and live-streamed events like webinars and town hall meetings, theatre and music. Live streaming, for both business and pleasure, has become part of our everyday lives - and this shift creates unique new opportunities. When physical meetings and events are cancelled, video communication is the only way to keep interaction alive. And to share a live video stream with a global audience, accessible from any device, interactive live streaming comes into play.

What Is Interactive Live Streaming?

While live video streaming typically just delivers content in one direction, interactive live streaming allows the audience to participate and interact, e.g. via chat, Q&A or polls. The same applies to bidding, betting and online gaming, where interaction is mandatory. This is often referred to as a 'lean forward' experience, compared to 'lean back' scenarios like simply watching an event via a stream. In a lean-forward experience, the audience is an active part of your content and has the chance to influence it. Especially during the pandemic, interactive live streaming offered a new opportunity to connect with the audience and engage viewers.

However, there are technology challenges. For starters, the latency of an interactive live stream (the time to deliver the stream 'from glass to glass', from the camera to the viewer) needs to be kept ultra-low to maintain smooth interaction - and this requirement applies on a global scale, on any network. As most users now rely on mobile phones as their primary internet device, it is important to enable mobile-first, plugin-free access directly in the browser. For vendors, easy integration of live streaming into their web applications is important so they can focus on their business priorities.
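
As a back-of-the-envelope way to reason about glass-to-glass latency, consider the sketch below. It rests on stated assumptions: the capture timestamp travels with the frame (for instance burned into the picture or carried as metadata), both clocks are NTP-synchronized, and the types are invented for illustration.

```typescript
// Sketch: estimate glass-to-glass latency by comparing the frame's capture
// time against the viewer's clock. Assumes both clocks are NTP-synced and
// that the capture timestamp is delivered alongside the frame.
interface TimedFrame { capturedAtMs: number; }

function glassToGlassMs(frame: TimedFrame): number {
  return Date.now() - frame.capturedAtMs;
}

const latency = glassToGlassMs({ capturedAtMs: Date.now() - 850 });
console.log(`~${latency} ms`); // interactive use cases aim for well under 1000 ms
```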

Infographic: Interactive Live Streaming with nanoStreamCloud

Platforms such as nanoStream Cloud, an advanced ultra-low-latency live streaming platform, enable service providers to stream content to large audiences with almost no delay, while letting viewers interact in the most natural fashion - the closest experience you can get to actually being there.

Use Cases

On Stage Events: Town Hall Meetings, Podium Discussions, Cultural Events.

Adding interactivity creates new opportunities for performers and audiences to connect. This is a new milestone in user experience for live streaming. Event organisers are now able to host live concerts during which viewers can request their favourite songs, cheer, buy merchandise and interact in several other ways.

Besides these, several new use cases from the business world emerged during the Covid lockdowns: a large corporation was looking for a solution to host town hall meetings and conduct polls with its employees at the same time; another company needed to complete a project with a mandate to inform the public and offer live Q&A to remotely connected participants. With traditional hosting options unavailable, interactive live streaming was an inexpensive way of achieving audience engagement for these and many other scenarios.

Overview Use Cases

Hybrid Events Are Here to Stay

We are likely to see an increasing number of these events in the near future, because – if done right – they actually offer some key benefits over traditional platforms. Panel discussions provide a good example: panelists are not only live on stage, but also connected online. Scalable tech to stream events and meetings reduces the financial and environmental cost of travel as well as the individuals’ attendance time. Additionally, when interaction is enjoyable and in real time through low-latency streams, these solutions allow for networking across the globe and truly multi-national meetings.

Monetizing Use Cases: Online Gaming, Auction and Online Retail

Ultra-low-latency streaming is also the preferred choice for online live retailers and many auction houses and real estate sales agents presenting their offers in the virtual world, ensuring bidders enjoy the excitement of the experience even if they don't win the bid. Online gaming has also increased drastically as the pandemic relocated entertainment - especially now with the legalization of online gambling in multiple countries.

Requirements for Successful Interactive Live Streaming

Latency - The Competitive Edge

At first, it is all about how fast a stream can be delivered, which determines what solution can be used for interactive use cases. But what comes next? How do you set up an interactive live streaming solution successfully? For a successful business roll-out, the next requirement is a clear understanding of the monetized business model. This determines the other factors that come into play as part of the interactive live streaming workflow: ingest, a global network (CDN), a player running on all devices and browsers, easy integration, a robust service for 24/7 operation, and requirements on the end-user's side (browser-based use, visibility on mobile, etc.).

Low latency depends on many factors and can only be achieved with full control of, and insight into, all components of the live stream end to end.

A one-stop solution for the highest quality of service

The complex workflow of interactive live streaming calls for end-to-end control; only then can every factor be managed according to the use case requirements. Mixing and matching technologies can add a great layer of complexity to live streaming; beyond that, it can create streaming issues or increase the latency between camera and viewer. The easiest and most reliable way to live stream interactive content is to adopt an all-in-one solution. This means operators have full control over their workflow - ingest, network, delivery, player and everything in between - to keep the quality of service high. On top of that, businesses benefit from a solution that is easy to use and can be branded and integrated into business websites according to corporate requirements. For better insight and quality of service, data metrics and analytics help monitor and analyze live stream performance. A reliable service with instant live streaming 24/7/365 is mandatory, with the goal of 100% availability and automatic failover in case of any network issues.

Handling low bandwidth on a global scale

To operate globally, the live streaming workflow needs a powerful and reliable Content Delivery Network (CDN) - an infrastructure that can deliver streams everywhere in the world. At present, many operators take a trial-and-error approach to the settings used for interactive live streams when they should be relying on modern technologies. The best solution will combine ultra-low latency with adaptive bitrate playback to cope with all different types of network situations, and will also provide analysis of the live stream. Ultra-low latency keeps the delay between the presentation and the audience seeing it well under one second, while adaptive bitrate playback handles changing bandwidth: it works by reducing the stream quality, encoded at a lower bitrate, when the available bandwidth drops. As soon as the bandwidth improves, so does the quality of the stream.
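
In outline, the player-side logic repeatedly picks the highest rendition the measured bandwidth can sustain. The sketch below is illustrative only: the bitrate ladder and the safety factor are assumptions, not nanoStream's actual algorithm.

```typescript
// Sketch of adaptive bitrate selection: pick the highest rendition whose
// bitrate fits within the measured bandwidth, with headroom for safety.
// The ladder and the 0.8 safety factor are illustrative assumptions.
const ladderKbps = [3500, 1800, 800, 400]; // highest first

function pickRendition(measuredKbps: number): number {
  const budget = measuredKbps * 0.8; // leave headroom so playback doesn't stall
  for (const kbps of ladderKbps) {
    if (kbps <= budget) return kbps;
  }
  return ladderKbps[ladderKbps.length - 1]; // worst case: lowest rendition
}

console.log(pickRendition(4800)); // 3500 - good network, full quality
console.log(pickRendition(1200)); // 800  - congested network, quality drops
console.log(pickRendition(300));  // 400  - below the ladder, best effort
```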

Infographic Use Case Online Games: Bandwidth Adjustment

Deliver interactive live streaming to desktop and mobile devices

One of the greatest challenges operators and providers face is ensuring that the best possible user experience is delivered on both desktop and mobile, at all times, without any plugins and accessible from any browser. This requires a robust live streaming platform and powerful technology.

Make it happen with the right interactive live streaming platform

Even though streaming is widely available today, not every solution is suited to delivering interactive live video experiences. Combining all the requirements is challenging, but is the key to success.

Recently awarded by Streaming Media, nanoStream Cloud includes all the necessary features to help businesses across several industries engage their audiences with interactive live video:

- Integrated solution and end-to-end control

- Easy to integrate

- Global delivery network (CDN) to deliver worldwide

- Player with Adaptive Bitrate Playback

- Running browser based and on all devices including iOS

- Integrated Analytics to improve customer experience

- Lightweight, stable and robust live streaming solution

Questions? Reach out to our team at sales@nanocosmos.de to discuss your interactive live streaming use case.

OTT demands for viewer satisfaction

Erik Otto

Chief Executive, Mediaproxy


With more people watching films and episodic TV drama on streaming services such as Netflix and Amazon, OTT is coming ever closer to dislodging linear television as the dominant vehicle for video and media distribution. Mediaproxy chief executive Erik Otto outlines the technological challenges of this and looks at why monitoring streamed outputs is crucial to guarantee the viewing experience.

OTT services came into their own at the height of the Covid pandemic, providing streamed entertainment directly to people locked down in their homes. In reality, this was an acceleration of what had been happening in the years prior to 2020 and boosted the already growing popularity of streaming.

Originally, OTT was promoted as a video-on-demand (VoD) service that enabled viewers to catch up on TV shows they had missed. This model of streaming was expanded upon by the likes of Netflix and Amazon Prime to offer a large, readily accessible library of films, TV dramas and comedy shows. It was further enhanced with the addition of Ultra HD/4K HDR visuals and Dolby Atmos immersive audio to create a viable and compelling alternative to linear digital terrestrial television (DTT). Netflix and Amazon have since been joined by Disney+ and Apple TV+ (which added 4K capability earlier this year), with Sky in the UK dropping dish reception in favor of streaming through the new Glass TV.

VoD streaming, for both TV-style viewing and catch-up, is set to continue growing in the coming years. Many services are likely to be hybrid platforms based on subscription models (SVoD) but with the option of being ad-free. There is also an increase in demand for advertising-based VoD (AVoD). This offers free access to content with commercials and is popular in countries where mobile phone networks are the main forms of transmission rather than fiber circuits.

This form of delivery will increase as the 5G roll-out continues around the world, offering low-latency streaming and the ability to connect several different devices. Network operators are also likely to take advantage of what the new technology is able to offer, as are traditional broadcasters. With streamed viewing now increasingly on an equal footing with linear broadcasting as the main way people watch video, the mainstream channels will launch their own OTT platforms, either independently or in conjunction with production companies and content owners.

The key selling point for VoD is quality of content but that should apply to the audio and visual standard as much as the excellence of the programming. That, coupled with more channels available to an ever-increasing number of platforms and devices, demands efficient and comprehensive monitoring of the streams to ensure both regulatory compliance and delivery to the right destinations.

Audio and video need to be tested as they pass along the distribution chain and through CDN (content distribution network) edge points. Due to the number of streams involved, this is a massive undertaking, made more complicated by the fact that attempting to monitor streams only in a standard master control room (MCR) does not make much sense in the multi-stream, multi-channel world. Ideally, monitoring should happen at every point along the chain but in the OTT realm, controlling the material is difficult once it is being handled by the big service providers.

As a result, the MCR of a broadcast or playout center is no longer the final point for quality control. Video and audio have to be examined as they travel along the delivery path via CDN edge points and undergo various processes, including targeted ad insertion, multi-language selection and event-based transmission. It is crucial to fully manage this ever more complex process, which now requires logging and compliance checking.

Monitoring the whole distribution path called for a brand-new methodology. The aim is to go back to the more traditional system of starting with content 'off air' at the broadcast or playout center, by creating something that can be virtualized and customized for individual requirements. Established detection methods, including time and date searches and predefined metadata, are still valid and widely used. But using more sophisticated software-based techniques, such as watermarking and fingerprinting, operating under automated computer control, is now practicable.

Due to the high number of OTT channels to be monitored, many of which use adaptive bit rate (ABR) techniques, using display panels is no longer practical. There are now many devices and delivery platforms to monitor, but viewers expect broadcast quality and the same level of service they have been used to. Mediaproxy's LogServer compliance monitoring and analysis platform is able to check not only that the content of programs and commercials conforms to regulatory requirements for quality - including picture consistency and audio loudness - but can also assist in collecting information for targeting commercials at a particular audience.

The goal of broadcasters or content owners in today's extremely diversified and competitive OTT market is to ensure content is delivered to the quality laid down by regulators, with programs passing through the CDN to devices and platforms in the correct format. Mediaproxy's technologies provide real-time monitoring of the output program so the broadcaster can be confident that what is being sent is correct in terms of both content and compliance.

Today, the business of monitoring outgoing channels is moving away from operators in the MCR looking at display screens and toward an exception-based methodology. Modern MCRs will instead have multiviewers, such as Mediaproxy's Monwall, which bring up a channel only when something is wrong. The operators are then able to isolate any problems and deal with them using the tools provided by a compliance system like LogServer.
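
Conceptually, exception-based monitoring reduces to a filter over alarm state, as in the sketch below; the channel and alarm names are invented for illustration and bear no relation to Monwall's internals.

```typescript
// Sketch of exception-based monitoring: of the many channels being probed,
// only those with active alarms are raised onto the wall. Types illustrative.
interface Channel { id: string; alarms: string[]; } // e.g. "audio-silence", "black"

function wallLayout(channels: Channel[]): string[] {
  return channels.filter(c => c.alarms.length > 0).map(c => c.id);
}

const channels: Channel[] = [
  { id: "news-hd", alarms: [] },
  { id: "sport-1", alarms: ["audio-silence"] },
  { id: "movies", alarms: [] },
];
console.log(wallLayout(channels)); // ["sport-1"] - operators see only the exception
```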

The advent of OTT has given people the freedom to watch anything wherever they are and whenever they want. Because of this, technology developers, including Mediaproxy, have had to come up with new ways to make certain that the many streams now available meet the expectations of both viewers and regulators. This is an ongoing process, and one that will continue to evolve as streaming becomes more dominant in the media world.

Let’s Work Together – How Strategic Alliances Are Facilitating The Advent of TV As A Service

Guy Taylor

Head of Client Services at M2A Media


No, this isn't a paean to Canned Heat's "Let's Work Together", but it could be the unofficial theme song for an industry that is going through a major period of transition. The cloud, be it public or private, is increasingly becoming the locus of operations for broadcasters, and there is an inexorable shift towards the acquisition of cloud-based services and products. The cloud, in turn, provides a common platform to build these services upon, further enriched by a framework of APIs and tools. This creates a technical lingua franca in which previously siloed products can communicate with each other effectively, opening up a universe of possibilities for collaboration. Also smoothing the path of collaboration are broadcast standards such as SCTE-35, SMPTE ST 2110 and the CableLabs specifications.

These common cloud-based architectures and broadcast standards continue to facilitate a growth in strategic partnerships between broadcast technology providers who may previously have existed in isolation from each other. The outcome of these partnerships is a suite of complementary workflows, such as cloud-based editing software seamlessly feeding into public-cloud-based distribution services. The ease with which these formerly disparate services can interoperate - which, had they been hardware, would have required miles of cabling and various discrete storage systems and tooling to move content around - is doing away with the need for expensive and time-consuming systems integration.

This approach underpins the concept of TV-as-a-service (aka TVaaS, a pretty clunky acronym admittedly!). TVaaS is, broadly speaking, a set of cloud-based services delivering discrete components of a broadcast workflow, and with its advent a broadcaster can now feasibly run an operation that is 100% cloud-based. TVaaS allows a customer to choose best-in-class services with near-endless scope for scalability, something hardware could only match through eye-watering amounts of CapEx spend.

The AWS Marketplace is perhaps the most visible realisation of the concept of TVaaS. This October, M2A Media joined an increasing number of vendors in the Media & Entertainment space on AWS Marketplace offering cloud-based SaaS broadcast solutions, ranging from playout to live streaming, acquisition and distribution, DRM, video analytics and much more. A new, greenfield broadcaster on a constrained budget could quite easily launch a service with technology acquired almost entirely via the choices made on the AWS Marketplace! Buyers through the Marketplace can be assured that vendors are AWS-approved and warranted to provide high levels of service and reliability. Pricing is also visible and transparent, reducing the need for tortuous negotiations and contracting.

As the number of vendors offering TVaaS solutions increases, there is a greater need to rise above the flock and convince a potential buyer of the value you can bring to them. At M2A we are realising that this will be achieved not only by offering brilliant products, but also via strategic partnerships. Our recent partnership with InSync and Hiscale is evidence of this. We have integrated their cutting-edge technologies into our M2A CONNECT product to launch the very first cloud-based, motion-compensated live frame rate converter, which operates on a pay-as-you-use basis. Customers choosing M2A CONNECT for the global acquisition and distribution of their live content can convert frame rates as needed and at scale, a video process that was previously dependent on hardware. Another collaboration, this time with Ostmodern, means we can offer customers high-quality front-end solutions for their M2A-orchestrated live content. Our partnership with AWS, via their Media Services and the aforementioned Marketplace, allows M2A to innovate at a pace that keeps our products and services, and subsequently our customers, ahead of the market and able to meet the challenges of contemporary broadcast operations.

To recap: through collaboration we can deliver cross-vendor solutions and products that are proven to succeed. We can work together to assure the stability, security and reliability of our respective offerings. We can work together to deliver dynamic services at previously unseen levels of flexibility and choice. We can work together to make buyers a compelling offering that reduces the overhead of wrangling multiple suppliers, yet frees them from the restrictions of contracting to a single-vendor platform. Working together allows the sharing of mutually beneficial opportunities, which encourages growth and fuels innovation. Working together and facilitating the sharing of ideas and knowledge is to the wider benefit of the broadcast industry.

Amidst the disruption and isolation of the pandemic, we’ve all realised the value of kinship and collaboration. Here at M2A we are taking that on board and are looking forward to a future of strategic partnerships, alliances and, hopefully, a new friend or two.

Come on now people, let’s work together.

News: cloud native is the way forward

John O’Loan

CEO, iOMedia Group Limited


Was it Roy Thomson of Scottish Television who saw the dawning of commercial television in the UK as "a license to print money" - or was it Lew Grade at ATV? It's been contested, but it doesn't really matter, because they and the lucky winners of the other early TV licenses were all right - back then.

But as Moses Znaimer, the Canadian media visionary was to lament 20 years later, “yes, but first you have to get your license.”

Certainly, times have changed. You no longer need the much-sought-after broadcast monopoly license of the type afforded to Thomson and Grade in Britain and withheld from Znaimer in Canada, and from many other would-be broadcasters around the world, for decades. As Lew Grade's nephew Michael was to find as CEO of ITV - the successor to ATV, Scottish and the other UK commercial broadcasters - a license no longer guarantees the "rivers of gold" it once did (a phrase also ascribed to Roy Thomson at the time).

So what has evolved in media economics, and what keeps changing? In a word, technology, because of which you no longer need a license, nor even a mountain to transmit from.

Let’s just take a moment to look at where we came from, how we got here, and what that means in the financial evolution of the media.

Three major developments in recent years have changed the world Thomson and Grade relished, and Znaimer hoped for.

We'd learned to cope with satellites, digitization and broadband, all of which offered never-before-dreamt-of bandwidth for unlimited numbers of media channels and receiving and interactive devices.

But just when we thought we were learning to cope with the financial and business opportunities being created, along came the Covid pandemic. The worldwide need for lockdown sent audiences for old-fashioned broadcasting and newfangled streaming sky-high, to record levels. It also sent us headlong into remote location production. Particularly for news media, this in turn is resulting in a reappraisal of the high-CapEx, long-amortization lifecycle of the technology needed. We are moving from a CapEx world to an OpEx lifestyle.

Covid caused a world panic like no other. As in previous mass social upheavals, such as wars and panics, necessity became the mother of invention, and technologies such as "the cloud", which might otherwise have taken another cycle or two to catch on, were hurriedly pressed into use.

Change specialists teach that there can be no systemic change, or evolution, in any system unless there is a well-understood belief that change is in fact needed. Soaring use of media, coupled with a never-expected plummet in media revenues, mainly due to a fall in advertising and marketing activity, forced the reality that changes needed to be made - and they are.

Often this required expedient, temporary "fixes" to cobble together legacy technology as best as could be achieved for remote operation or changed production and distribution circumstances. We're now starting to realize that those changes we may have anticipated as being "temporary" are actually becoming the "new normal."

As brands and media operators shift their marketing and distribution strategies from temporary adjustment to permanent transformation, the recently released WARC Marketer's Toolkit 2022: Global Trends Report, which brings together insights from a survey of 1,500 global executives, reports that, "Far from signaling a return to normal, the opening up of economies emerging from lockdowns, has only created a new set of challenges for marketers. Attitudes, behaviors and market structures have resulted in significant change during the pandemic. With vaccination rates rising, many parts of the world are starting to see a return to what we used to assume was 'normal'. However, even in these markets, consumers are rethinking and evaluating lifestyles, resulting in different behaviors, preferences and patterns in their use of media and technology."

The international media research giant Kantar can also detect that the video streaming subscription model, which rose mightily during the harsher parts of the pandemic, is starting to lose its power to drive long-term growth. In its latest Media Trends report, Kantar forecasts that the fight for audience numbers will drive a further diversification of business models in 2022, with a sole subscription offer becoming scarce. Kantar predicts further industry consolidation as platforms seek to offer more and better content.

Technology is giving consumers exponentially more news and entertainment options, and has largely changed how people discover new content, which providers they get it from, and how they pay for it (if they pay at all). As Kantar points out, the ultimate competition for media operators has become the one commodity that is no longer expanding: people's disposable time.

The research organisation Hub was set up to study the intersection of technology and entertainment.  

Three key trends emerged from Hub Entertainment Research’s latest Conquering Content study, which tracks how consumers discover TV content – and the platforms they use to watch newly discovered shows and movies:

The Hub report shows streaming’s advantage as the home for favorite shows continues to grow. Consumers are now three times more likely to discover a new show on a streaming platform than on a traditional network.

Among TV viewers who have discovered a new favorite TV show in the past year, 75 per cent say the show they've discovered is on a streaming service. Only 21 per cent have discovered a new favorite from a traditional pay-TV source such as a DVR.

The proportion discovering a new favorite on streaming has increased every year since Hub began tracking viewing behaviors, while the proportion discovering their latest favorite show on a traditional service has declined every year.

Another legacy of the COVID-19 pandemic is likely to be a fundamental and continued change to how and where journalists do their work – as well as a renewed focus on recruitment, retention, and diversity. It may not happen overnight, but news organisations are rethinking what the office is for and what kind of opportunities that throws up. Do they really need large production centers?

The Oxford-based Reuters Institute survey notes that remote working has "made newsrooms more efficient, and that many employees also value greater flexibility, but it is also clear that people miss the creativity, collaboration, and communication (3Cs) that is the lifeblood of any newsroom." The key question, it says, is how to strike the right balance between those features.

As many of the Reuters Institute interviewees noted, the hybrid future is about much more than just enabling greater employee rights to remote working. It goes on: "In an ideal world, it describes a new operating model where work is done without reference to location, where talent is used more effectively, where hierarchies are less formal, and where diverse groups are included in conversations. It's also likely to involve a greater amount of face-to-face contact with colleagues, whether that is just to socialize, reinforce company culture, or collaborate on creative projects."

Some news organisations are just starting out on these journeys, while others are already some way down the line. All, however, are paying more serious attention to technologies such as the cloud, which only 18 months ago they may have expected to engage with in another 18 to 24 months. Instead, they're evaluating, implementing and using them now, out of the combined necessities of reduced costs, regulatory changes allowing more amalgamation of once-separate media units, reduced space and real estate requirements, cleaner environments, and greater security from hacking and ransomware attacks.

Another financial implication of this Covid-induced and -hastened media evolution is system security. "Hubbing", where the operations of co-owned media operators are combined in one location, is another development, brought about by regulatory changes in recognition of the fact that we no longer need to restrict media licenses in the same way.

The Sinclair Broadcast Group is one of the largest owners of local TV stations in the United States - 184 stations in some 86 markets - and the largest owner of regional sports networks. On October 17th, the company was the victim of a ransomware attack that took much of its local programming off the air, disrupted its commercial load and saw data stolen from the company's servers.

As media operators seek further synergistic interconnect, such ransomware attacks have added new financial and operational fears to the evolving media of today – and tomorrow.

At the iOMedia Group Limited we believe the safest way around most of the issues the industry is facing is via the Cloud Native route. That’s why we are releasing the LNS Cloud 9 newsroom system at IBC 2021. You can see it in action at www.livesystems.io where you can also arrange for a private demo. 

John O’Loan is CEO of the iOMedia Group Limited, which releases the cloud native LNS live news and sports control system in Amsterdam at IBC 2021. He was responsible for the launch of Sky News and was also instrumental in the launch and running of Sky TG24 Italy, STAR News China, STAR News India, National Geographic Channels and FOX International Channels and as an independent consultant worked with more than 40 media brands worldwide.

He is a graduate of Culture Change studies at the University of Oxford and HEC Paris, co-founder of the Change Leaders Group and continues independent consultancy for media companies internationally, including NDTV India.   

How will technology continue to push changes in OTT in 2022?

Jérôme Vial

Business Development Director, iWedia


The world of OTT has seen rapid evolution in the past years, and the Covid-19 pandemic only accelerated this growth as access to news and content became an essential form of entertainment for many households during the most severe lockdown restrictions. There is a clear evolution towards free, ad-supported TV services, and as the world begins to turn the page towards a new normality in 2022, operators face a number of challenges to engage and retain their customers in a crowded market.

The vast complexity of the systems needed to deliver a quality OTT service should not be underestimated, so operators should look to companies like iWedia - one of the leading providers of software components and solutions for TV devices - whose global expertise can help demystify and simplify the process.

According to iWedia, a key area that TV operators must address is the added value they bring to customers. With the increasing influence of streaming platforms such as Netflix or Amazon, operators have increasingly become a 'shop window' for many streaming platforms, and the challenge at stake is to prove their added value and relevance to their customers' needs; it is all about the quality of the user experience (UX). Customers do not want to swap between platforms to search for content, and there is increasing demand for fast and simple content aggregation across all platforms. The value for operators here is to enable customers to search across multiple platforms in one 'click' to find that film or TV show.

Along similar lines, operators should also make sure they are providing a comfortable, painless user experience to their customers. A well-designed interface with easy commands and simple-to-use search functions will reward operators with engaged and loyal customers. With the growth in the market for voice assistants, it would also be valuable for an operator that places its own device in the household to provide a reliable and resilient voice command solution.

Technologies are available today to develop a smart user interface (UI) which will adapt over time to the user's profile and behaviour. Users will not realise when they are having a good experience, but they certainly know - and act accordingly - when it's a bad one.

An efficient recommendation engine is also a very valuable service. Obviously, some very good progress has been made in this field in the past years, and as the data grows, the recommendation engine improves over time. Despite this, users still get random and sometimes strange "recommendations" from streaming services! Operators need to use content metadata - including for live content - together with customer profile data, and apply technologies such as artificial intelligence to determine viewing preferences and provide the more accurate recommendations that customers welcome.
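
As a toy illustration of metadata-driven recommendation - a sketch only: real engines are far more sophisticated, and the titles, tags and scoring below are invented:

```typescript
// Toy content-based recommendation sketch: score titles by how much of their
// metadata overlaps the user's viewing profile. All data is illustrative.
interface Title { name: string; tags: string[]; }

function score(title: Title, profile: Set<string>): number {
  return title.tags.filter(t => profile.has(t)).length / title.tags.length;
}

const profile = new Set(["drama", "scandinavian", "crime"]);
const catalogue: Title[] = [
  { name: "Fjord Noir", tags: ["crime", "scandinavian", "drama"] },
  { name: "Cook-Off Live", tags: ["reality", "food"] },
];
catalogue
  .sort((a, b) => score(b, profile) - score(a, profile))
  .forEach(t => console.log(t.name, score(t, profile))); // "Fjord Noir" ranks first
```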

Another big challenge for operators is how to enable and diversify new revenue streams. The major trend in this respect is targeted ad insertion. Originating in digital advertising, this technology directly addresses the problem of diversifying income sources and creates new revenue streams for many operators. Although it is undoubtedly a highly complex technical task, it is an excellent opportunity for operators to differentiate themselves from international streaming platforms and create lucrative new sources of revenue. Here too, user experience will be the key to delivering reliable value. iWedia's cloud-based Ad Insertion Platform vastly simplifies the process; with the choice of client-side ad insertion (CSAI) or server-side ad insertion (SSAI), the solution seamlessly replaces ads in live, catch-up and archived linear TV broadcasts, using individual targeting profiles and web-advertising technologies including real-time bidding and return-path data. iWedia acts as an integrator at every level of the project implementation, speeding up time to market and lowering the cost of entry.
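
In broad strokes, SSAI works by rewriting the stream manifest on the server so that per-viewer ad segments replace the broadcast ad break. The sketch below shows the general idea only: the cue markers follow a widely used HLS convention, the segment names are invented, and this is not a description of iWedia's implementation.

```typescript
// Generic server-side ad insertion (SSAI) sketch: replace the segments between
// cue-out/cue-in markers in an HLS-style manifest with per-viewer ad segments.
// Marker and segment names are illustrative; real manifests carry timing data.
function insertAds(manifest: string[], adSegments: string[]): string[] {
  const out: string[] = [];
  let inBreak = false;
  for (const line of manifest) {
    if (line.startsWith("#EXT-X-CUE-OUT")) { inBreak = true; out.push(...adSegments); continue; }
    if (line.startsWith("#EXT-X-CUE-IN")) { inBreak = false; continue; }
    if (!inBreak) out.push(line); // drop the broadcast ad, keep programme segments
  }
  return out;
}

const manifest = ["prog1.ts", "#EXT-X-CUE-OUT:30", "ad-generic.ts", "#EXT-X-CUE-IN", "prog2.ts"];
console.log(insertAds(manifest, ["ad-for-viewer-123.ts"]));
// ["prog1.ts", "ad-for-viewer-123.ts", "prog2.ts"]
```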

Free, ad-supported TV services are a growing trend that will push more content to end users. They present a great opportunity for platforms that can become more relevant with these new models and services. Naturally, these UX elements and value-added services require significant technical integration at the back end, and iWedia looks forward to seeing how this exciting trend grows in the coming months and years.