Realizing the potential of the cloud for live video production

Robert Szabo-Rowe 

VP Engineering and Sales

The appetite for compelling live content is soaring, particularly live sports programming. Rethink TV has estimated that global sports rights will grow by up to 75 percent between 2020 and 2024, making it clear that live content remains king. The trend is especially evident among younger audiences, with Nielsen figures showing that two-thirds of 18-34-year-olds who watch linear TV prefer live content. According to the Interactive Advertising Bureau, the same proportion of all consumers worldwide now streams online coverage of live events. This demand is, in turn, putting pressure on broadcasters and other rightsholders to adapt their workflows to meet greater and more varied needs for high-quality live content.

The fact is that the way consumers engage with live content is evolving and, as a result, the media landscape is rapidly fragmenting. Even before the lockdowns and restrictions imposed by the global health crisis, streaming services were witnessing a surge in popularity. COVID-19 has only accelerated this trend, as audiences under shelter-in-place orders have turned to streaming platforms as their source of news and entertainment. Twitch, the leading esports streaming service, recorded a historic 83% year-on-year jump in viewing during the second quarter, according to a report from Streamlabs and Stream Hatchet. Social media is also emerging as a primary source of live content – 22 percent of fans now seek live sports via social media platforms, research from GlobalWebIndex shows.

The upshot of this media fragmentation is that broadcasters and other live content producers now face a fresh challenge: finding cost-effective ways to produce and deliver more content in increasingly dynamic formats – from shoulder programming to coverage of smaller and niche events. Efficiently filling the growing number of programming hours across an increasing array of platforms, while still meeting audience expectations for high production values, is the tough task content providers face today. In light of these developments, cloud-based production models have emerged as a compelling option.

Embracing a new era of content

Media organizations must move towards a future in which they can leverage new approaches and technologies to produce and deliver more live programming than ever to an increasingly diverse audience – and achieve this without overstretching resources. Cloud-based production answers the call to this need, enabling media players of all types and sizes – across all geographies – to remain competitive in an industry that is more dynamic than ever.

Basic cloud production tools enable workflow processes to be implemented via a hosted platform. To meet more comprehensive needs, cloud-based 'Production-as-a-Service' offerings, such as The Switch’s MIMiC platform, can deliver an end-to-end service that includes everything from remote IP video contribution and production to clipping and distribution. The on-demand production model enables the entire production workflow – including editing, graphics creation and comms – to be handled within the cloud. Then distribution via private fiber networks or over-the-top (OTT) services ensures that produced feeds can be delivered to viewers in any country on any platform.

The cloud approach to production offers clear cost advantages, but other benefits can be equally important to content producers. Next to cost, flexibility can be a critical benefit. With a cloud solution in place, content producers can quickly adapt to any circumstance, regardless of the event’s location, the crew’s whereabouts, the distribution method or the target content. IP-based networks are increasingly becoming the dominant means of distributing new content, making it easier to adapt existing workflows to cloud-based technologies and software-defined architectures without making physical changes to the hardware.

During the COVID-19 pandemic, we have seen just how important adaptability has become. With crews forced to operate within lockdown conditions, social distancing guidelines and other safety restrictions, many editors have needed to be able to operate from remote locations – often using just a browser to tap into cloud production capabilities. Bringing together video, IP networks, and cloud-based tools on the same platform provides a powerful combination that allows production staff to support a range of content outputs, each with its own specific requirements.

Optimizing efficiency, speed and reliability

Cloud-based production delivers the speed and efficiency that are critical to successfully producing live or even virtual event coverage today. The cloud approach takes the complexity out of live event coverage, making it easier to staff crews and sort out other logistics regardless of location, all while minimizing the need for travel, freight and extra resourcing on the ground.

The quick turnaround of highlights, replays and social media posts is another major advantage of the cloud approach. Near real-time production capability is becoming increasingly important as consumer habits evolve. For instance, 75 million sports fans in the United States regularly watch highlight packages, according to Ring Digital's 2020 Future of TV Survey, with many fans looking for instant summaries of the best action when they don’t have time to view whole games. Fans’ demand for packaged clips across all platforms – for sports, and even other live events such as awards shows – means that a delay of minutes, or even seconds, opens rightsholders to the risk of losing out to rivals and pirated content sources. A cloud approach helps minimize such threats.

Reliability is also crucial for live TV. A cloud-based production environment can run transparently, securely and independently of the main broadcast feed from a major event, allowing existing workflows to run as usual. In cases where the cloud workflow is the primary production and distribution method, its highly virtualized, microservices-based architecture eliminates single points of failure. Cloud production can also be architected to offer a pass-through backup that goes straight from contribution to encoding and then CDN distribution. This acts like an override switch to avoid a 'black screen' situation.
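As a purely illustrative sketch of that override logic (the health-check URL, feed names and timeout below are invented for this example and are not part of any particular platform), the idea amounts to monitoring the produced program feed and automatically falling back to the pass-through path:

```python
import urllib.request

# Placeholder values for illustration only – not a real platform API.
PRIMARY_HEALTH_URL = "https://cloud-production.example.com/health"
PRIMARY_FEED = "produced program output"
PASSTHROUGH_FEED = "contribution -> encode -> CDN (bypass)"

def primary_feed_ok(timeout_s: float = 2.0) -> bool:
    """Treat any connection error or non-200 response as a failed primary feed."""
    try:
        with urllib.request.urlopen(PRIMARY_HEALTH_URL, timeout=timeout_s) as resp:
            return resp.status == 200
    except OSError:
        return False

def select_feed() -> str:
    # Fall back to the pass-through path rather than risk a black screen.
    return PRIMARY_FEED if primary_feed_ok() else PASSTHROUGH_FEED

if __name__ == "__main__":
    print("Routing to air:", select_feed())
```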

Kick-starting the industry transition

Broadcasters, rights holders and other content producers face many challenges in today’s live production landscape where new approaches are essential. In a world where consumers expect richer, more varied content experiences, cloud-based production has come into its own. It offers a cost-effective, flexible, efficient, fast and dependable way of enabling content producers of all types to meet consumer demand for professional quality across multiple platforms. Innovative broadcasters and production companies are already leveraging cloud-based production benefits as they seek to grow their content offerings quickly and efficiently.

Cloud production, like many internet-driven innovations, is easy to deploy and provides a technical architecture that works alongside existing broadcast workflows, without putting well-established processes at risk. With remote production emerging as the workflow of the future across a whole spectrum of live TV, streaming, and social media events, cloud-based production and services will play a growing role in transforming an industry moving progressively to an all-IP future.

Driving the consolidation of media transport and enterprise services into a single WAN

Angus Stewart 

Business Development Executive – Media, ANZ

While the industry was already moving increasingly towards remote and distributed working across all aspects of the workflow, the events of 2020 have dramatically accelerated this trend. The requirements of Covid-mandated social distancing protocols have added to the already compelling economic arguments for remote work, and mean that media organisations are looking for solutions that can accommodate the data flows of the new IP-based broadcast ecosystem reliably and securely, and with the high-performance criteria that broadcast video – especially live broadcast video – requires.

We are moving towards a new era of hyper-efficient and cloud-enabled media production, and this requires systems that meet the following criteria:

Reliability. The system must be bullet-proof and offered with the SLAs that the broadcast industry is used to working with.

Performance. Any network aimed at the broadcast market needs to be high-bandwidth, high-performance and low-latency, with virtually zero delay variation.

Coverage. Depending on the market being served, reach needs to be genuinely national and/or international.

Standards. Any network needs to support SMPTE standards for delivery of media over IP and the consolidation of data network payloads.

Security. Assuring the integrity of content and content providers’ Intellectual Property is a must.

Control. Media organisations need to be able to manage the network to reflect their own business needs – for example, blending SMPTE media streams with enterprise data traffic.

Our contention is that MediaWAN, a brand-new service from Telstra Broadcast Services, exemplifies the new breed of network that is required and will serve as an enabling technology for the new remote broadcast methodologies. Currently specific to Australia, it delivers a single country-wide core network for IP contribution, managing both enterprise data and video with premium SLAs. It allows the customer to transport real-time video, audio and data between geographically diverse facilities and to consolidate their networks, cost-effectively blending Enterprise and Media Network functions and resources.

Built from best-of-breed components sourced through our established relationships with leading global technology providers, as well as the strong support of local channels and partners, it allows for the transmission of real-time video, audio and enterprise data between critical media hubs at 10Gbps to 100Gbps on dedicated bandwidth. This is delivered as IP data flows with granular control over connectivity; frames are encapsulated and transported across the network transparently and delivered in native form. In keeping with the level of SLAs expected within the industry, it also features end-to-end management and 24/7 support from dedicated teams using advanced management systems.
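To put those link speeds in context, a back-of-envelope calculation (illustrative only: it counts active video payload for an uncompressed 1080p59.94 flow and ignores RTP, header, audio and ancillary data overhead, so real SMPTE ST 2110-20 streams run somewhat higher) shows roughly how many uncompressed HD contribution flows fit into such a pipe:

```python
# Rough sizing of uncompressed HD video flows on a 10-100 Gbps media WAN.
# Figures are approximations for illustration, not a network design.
pixels_per_frame = 1920 * 1080
bits_per_pixel = 20               # 4:2:2 sampling at 10 bits per component
frames_per_second = 60000 / 1001  # 59.94 fps

payload_gbps = pixels_per_frame * bits_per_pixel * frames_per_second / 1e9
print(f"One 1080p59.94 flow: ~{payload_gbps:.2f} Gbps of video payload")

for link_gbps in (10, 100):
    print(f"{link_gbps} Gbps link: ~{int(link_gbps // payload_gbps)} such flows "
          f"(before headers, audio, ancillary data and protection)")
```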

Benefits of a media-specific WAN

There are multiple benefits to the usage of such a network for media companies.

From the outset, it allows for the consolidation of multiple networks into a single network. Multiple networks are an unfortunate fact of life in modern broadcast production and can even feature in single productions, with the problem exacerbated as they proliferate across media companies. Coalescing these into a single network dramatically reduces complexity within a company and lowers the costs of dealing with a wide range of network issues.

This has benefits that reverberate through the chain. Level 1 service assurance is also underpinned by the fact that a single network gives broadcast customers a single point of contact. Many media companies’ network operations have been built up in an unplanned manner over time, and separating these multiple connections into their core components can be difficult – especially when there are issues involving live contribution and the top of the hour is rapidly approaching. A single point of contact gives media organisations confidence that when problems occur, they can be identified and fixed swiftly, with minimal disruption and in a response time that is consistent with the SLAs of the project.

This is worth expanding upon, as the network is also maintained at a broadcast-level SLA – a point that can be overlooked in many negotiations regarding network cost. In many IP-based industries the delay of data is often considered inconvenient rather than mission-critical, and total data loss is not unheard of. To be truly robust enough for broadcast operation, the data streams in the network must be treated with as much care and attention to redundancy as real-time SDI video signals were in the pre-IP era.

Cost savings can be impressive with such a system too. The use of a single network not only consolidates all of an organisation’s disparate network operations into a single supplier and point of contact externally; internally, it also consolidates the various teams that have been called on to manage a wide brief covering potentially many different companies. Again, this speaks to minimising complexity within an organisation and, at the very minimum, removing duplication of effort. Furthermore, it effectively enables companies to outsource many of the day-to-day tasks of maintenance, lifecycle management, third-party vendor management, monitoring, notification and network restoration that come with the management of multiple networks.

Such cost-effectiveness is an important attribute of a managed media WAN. While the business benefits of remote workflows are well documented, the costs that lurk in the details of moving data around multiple high-speed networks can be surprising and commercially difficult to justify. The consolidation of multiple networks into one not only provides benefits such as enhanced control over IP data flows and enhanced SLAs; it can do so whilst maintaining a cost-effective price point that is typically significantly more economical than an ad hoc system consisting of competing suppliers.

The specifics of any individual network will vary. The Telstra MediaWAN provides national coverage based on one of the largest fibre footprints in Australia, with speed options of up to 100Gbps on Government-grade infrastructure. In other countries the specifics may well change, but the philosophy remains the same: a dedicated, fully managed network with high-performance SLAs will reliably outperform the alternatives – for example, dark fibre, which typically comes with reactive support and an unmanaged service. Perhaps more to the point, the hyper-efficient media factories of the future will demand it.

How to Navigate Today’s Bandwidth-Constrained, Multi-Codec World

Bart Van Daele 

Product Marketing Manager, Video Network

In many ways, 2020 has forced us to rethink norms we’d previously known to be true. But when it comes to video delivery, the primary concerns and demands from 2019 remain the same, if not elevated by the pandemic: bandwidth is still limited and consumer demand is still growing. And with people continuing to consume increased levels of content at home, the bandwidth challenge remains. As we look ahead to 2021, content distributors and video service providers must ask themselves: How do we deliver high-quality video at scale efficiently, while maintaining a high-quality user experience? The answer is a bit like 2020 – it’s complicated.

According to research from Conviva, viewers around the world spent 57% more time streaming in Q3 than they did a year earlier. In fact, every continent saw double- to triple-digit increases, including a 104% increase in South America. With consumer demand for content showing no sign of slowing, content distributors and video service providers must take a serious look at the technologies and expertise available to help them navigate today’s multi-codec world despite bandwidth constraints. This is a must not just for short-term needs, but for longer-term, sustained growth.

Thankfully, both technology and expertise are now more available than ever before to help address such hurdles, particularly related to compression requirements needed to enable efficient and effective video content delivery to existing subscribers, and to help grow a subscriber base. Particularly in terms of software, service providers can leverage unprecedented flexibility in implementing codecs that identify and compress media content and automatically adjust bitrate and target quality level. So, what needs to be done first?

Content distributors and video service providers must ask themselves:

  • Are we properly scaling to address the viewer demand?
  • How can we scale preemptively? What do we need to take into consideration?
  • How can artificial intelligence (AI) help us scale ahead of time?
  • What can we do to ease the pressure on our network due to more demanding video applications?

A common denominator across these questions is addressing the bit-rate requirements of high-quality video delivery. In the winter and spring of this year, many providers quickly reduced video service bit rates in anticipation of bandwidth concerns. While this took care of the immediate network congestion problem, it was ultimately a band-aid solution to a much deeper problem of scale. Most video services are still searching for the right route to address the bandwidth crunch. Adding to this challenge, new compression standards emerge every few years, forcing the need for greater agility within a network to support faster reaction times and superior scalability for the ultimate viewing experience.

Enter the cloud

The cloud offers the scalability and flexibility needed to master the complex compression needs that result from increased viewership. It’s a clear case of supply and demand. Consumers want more content where they want it, when they want it. Content distributors and video service providers need the capabilities to help ensure this happens, or risk losing the subscriber.

From a compression technology vendor perspective, the ability to develop and automate the testing of multiple codecs in the cloud – and at scale – is truly game-changing. It makes it possible to speed up development and to measure compression efficiency across a wide variety of test cases, including different codecs, coding structures, coding tools, bit rates and resolutions.
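A minimal sketch of what such an automated test matrix can look like is shown below. It assumes an ffmpeg build with libx264, libx265, libaom-av1 and libvmaf available on the test nodes; the codec list, bitrates and reference clip are placeholders chosen for illustration, not any vendor's actual harness.

```python
import itertools
import json
import subprocess
import tempfile

# Hypothetical test matrix – extend with coding tools, GOP structures, resolutions, etc.
CODECS = {"avc": "libx264", "hevc": "libx265", "av1": "libaom-av1"}
BITRATES_KBPS = [1500, 3000, 6000]
REFERENCE = "reference_1080p.mp4"   # placeholder source clip

def encode(reference: str, encoder: str, kbps: int) -> str:
    out = tempfile.NamedTemporaryFile(suffix=".mp4", delete=False).name
    subprocess.run(
        ["ffmpeg", "-y", "-i", reference,
         "-c:v", encoder, "-b:v", f"{kbps}k", "-an", out],
        check=True)
    return out

def vmaf_score(distorted: str, reference: str) -> float:
    log = tempfile.NamedTemporaryFile(suffix=".json", delete=False).name
    subprocess.run(
        ["ffmpeg", "-i", distorted, "-i", reference,
         "-lavfi", f"libvmaf=log_fmt=json:log_path={log}",
         "-f", "null", "-"],
        check=True)
    with open(log) as f:
        data = json.load(f)
    # JSON layout varies between libvmaf versions; this matches recent builds.
    return data["pooled_metrics"]["vmaf"]["mean"]

for (name, encoder), kbps in itertools.product(CODECS.items(), BITRATES_KBPS):
    clip = encode(REFERENCE, encoder, kbps)
    print(f"{name:5s} @ {kbps:>5} kbps -> VMAF {vmaf_score(clip, REFERENCE):.1f}")
```

Run in the cloud, each cell of this matrix can be farmed out to its own instance, which is what makes large-scale, repeatable codec comparison practical.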

The cloud also gives both compression tech vendors and service providers a huge advantage in scalability without CAPEX investment. However, when it comes to cloud deployment, there is often a perception of unlimited scale. Cloud services are not free, so it’s critical that codecs are optimized for efficiency – both in terms of cost and capabilities. Codecs developed for the cloud and in the cloud can allow providers to enjoy the benefits of truly agile cloud deployment.

When it comes to compression, it’s pretty simple math: the more you compress, the more you can store and the more you can transmit. But while simple, it is not easy. And that’s why content distributors and video service providers should look to partner with outside experts and technology innovators for success.
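To make that arithmetic concrete, here is a deliberately simplified example (the bitrates and library size are assumptions chosen for illustration, not measured figures): halving the bitrate for like-for-like quality halves both the storage footprint and the delivery bandwidth.

```python
# Illustrative arithmetic only – bitrates and library size are assumptions.
library_hours = 10_000
avc_mbps = 8.0    # assumed 1080p delivery bitrate with AVC
hevc_mbps = 4.0   # assumed like-for-like quality with HEVC

def storage_tb(hours: float, mbps: float) -> float:
    seconds = hours * 3600
    return mbps * 1e6 * seconds / 8 / 1e12   # bits -> bytes -> terabytes

print(f"AVC library:  {storage_tb(library_hours, avc_mbps):.0f} TB")
print(f"HEVC library: {storage_tb(library_hours, hevc_mbps):.0f} TB")
# The same ratio applies to delivery: every concurrent stream needs half the bandwidth.
```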

Considering Codecs

There’s no doubt this multi-codec world puts ultra-high-quality, low-latency video within reach, at scale. However, as the codec landscape becomes wider and more developed, it also becomes more convoluted, as more encoders and decoders must be supported. In terms of encoding and client support, different codecs have different ideal applications – and often, several codecs run in parallel. Advanced Video Coding (AVC) and High Efficiency Video Coding (HEVC), for example, can enable low-latency live streaming for large-scale events. But for enhanced compression performance and support for emerging broadcast technologies, added codecs like AOMedia Video 1 (AV1) can make the software stack more robust.

Several factors come into play when it comes to considering codecs, including delivery requirements, client and device support, footprint cost, OTT or broadcast encoder complexity and more. Not to mention the increasingly complex legal landscape for licensing codecs, and current momentum around royalty-free codecs. And with new displays constantly coming to market promising better and better picture quality, keeping up with the latest and finding the right compression route is no easy feat. Fortunately, there are experts who can help navigate the tumultuous waters of video codecs to enable service providers to take advantage of compression efficiency for the best video experiences.

How to Navigate

Next-generation video specifications, which are rapidly being brought to market, certainly make video compression more efficient, but also involve significant encoder complexity.

AI and machine learning (ML) technologies allow compression technology providers to find a balance between bandwidth-efficient video streams and encoder complexity. Using AI-based models, it’s possible to enhance compression engines and optimize bandwidth savings based on content while reducing computational complexity. Machine learning informs decisions at both the rate-control and encoder level, reducing bandwidth costs and strain while simultaneously speeding up processing.

Another technology to consider is Content Adaptive Encoding (CAE) by scene. This approach can further lower bandwidth requirements while delivering quality that is perceptually near-constant. There are many expectations today, such as globalization and low latency, that can lead to an explosion in the computational requirements of encoders. Those looking to expand into new geographies can benefit from CAE and from working with a technology partner who understands the ins and outs of the various codecs, implementation approaches and scalability requirements to help ensure a high quality of experience (QoE) – all of which can be done as the business expands its reach.
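The core idea behind per-scene CAE can be illustrated with a toy allocation function (a sketch only, not any vendor's actual algorithm; the complexity scores and bitrate bounds are invented for the example): complex scenes borrow bits from simple ones while the average bitrate stays roughly where the rate plan put it.

```python
# Toy per-scene bitrate allocation for Content Adaptive Encoding – illustrative only.
def allocate_scene_bitrates(complexities, target_avg_kbps, floor_kbps=800, cap_kbps=8000):
    """Scale each scene's bitrate by its relative complexity, then clamp.
    A production encoder would iterate so the clamped average still hits the target."""
    mean_c = sum(complexities) / len(complexities)
    return [min(max(target_avg_kbps * c / mean_c, floor_kbps), cap_kbps)
            for c in complexities]

# Complexity scores (e.g. from a pre-analysis pass): talking head, crowd pan, replay, graphic.
scene_complexities = [0.4, 1.6, 1.0, 0.3]
for i, kbps in enumerate(allocate_scene_bitrates(scene_complexities, 3000)):
    print(f"scene {i}: {kbps:.0f} kbps")
```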

The digitization of media has led to constantly growing market demand. Today, the trend continues with the evolution of technology that offers increasing screen resolutions, frame rates and dynamic ranges – not only on televisions but also on mobile devices – and various modalities such as VR, multi-view, 3D and 360 TV. In fact, the Cisco Visual Networking Index found that by 2022 IP video will make up 82% of all IP traffic. More than ever, finding the right technologies and expertise to help avoid network congestion must be a priority for video service providers. Analyzing what subscribers are looking for, tackling the bandwidth challenge without sacrificing quality and leveraging advanced technologies to improve compression efficiency must all be factored into the equation to achieve both customer retention and acquisition.

We’ve learned in 2020 that navigating the future while planning for the unexpected is never easy, but it is mandatory. Content distributors and video service providers only see success if their viewers are happy with their experiences. In order to continuously deliver, companies need to leverage the right mix of technology and expertise so that in the end, they and their subscribers reap the benefits of today’s multi-codec world.  

Special Report: What’s Driving Change in Store?

The Store segment of the BaM Content Chain® covers the storage of content throughout its lifecycle. This can be on-premise or cloud object storage, SAN and NAS – including disk, SSD, optical and data tape, as well as storage management, archive storage, video servers and VTRs.

With an ever-growing amount of content needing to be stored and then rapidly accessed from anywhere, with ever-higher resolutions only increasing the pressure, Store today means much more than the simple repository its name implies. We spoke to 10 IABM member companies to catch up on all the latest developments in this largely unheralded but vital content chain segment.

“Ever increasing content production in higher resolutions (e.g. 4K and 8K), at higher frame rates (e.g. 120Hz) and greater dynamic range (e.g. HDR) means that there is a continuing need for ever more performant storage solutions,” says Paola Hobson, Managing Director, InSync Technology Ltd.

Access vital

The coronavirus pandemic has shone a spotlight on the Store segment of the BaM Content Chain®, with access becoming a key issue. “Lack of live [content] drove broadcasters to the archive, but many found it difficult to get timely access to content,” says Nick Pearce-Tomenius, Sales and Marketing Director, Object Matrix. “[What’s required is] ease of access to all content from anywhere without manual intervention. This was not possible for many during this troubled period.”

Adrian J Herrera, VP of Marketing at Caringo, also identifies access as a key issue. “The primary driver of change for both end-users and vendors this year has been content accessibility. Early this year, the pandemic restricted access to data centre facilities and offices and, as a result, to the content needed to complete projects. Many organizations accelerated their migration to the cloud and many vendors accelerated their product roadmap features that facilitated migration to the cloud.”

Cost matters

“The issue from the end-user perspective will be the cost associated with cloud storage. When you look at cloud computing, you can spin up and then spin down compute resources when they are no longer needed—reducing costs. However, storage or content is more static and tends to compound. As cloud service bills increase, the only way to reduce costs will be to delete data or to keep the majority of content on premise,” Herrera says.

“Cost is never far from the top of concerns and, with continued exponential content growth, the storage budget is often stretched to its limit,” says Jeff Braunstein, Director of Product Management, Spectra Logic. “It would be easy to say that cost is the main driver of change in Store, but that answer is too simple. The rising cost of storage in media and entertainment is not merely attributable to increased storage procurement costs. In fact, the cost of storage has consistently dropped over the last several years. The key part of the issue is suboptimal data management. The rising cost of storage is almost always traceable to the hidden costs associated with the improper retention and management of large amounts of digital content, including backup storage capacities that are often a multiple of actual production data, daunting inventory tasks, complex storage management, shortage of skills and quick data availability requirements. Organizations need a cost-efficient storage lifecycle management tool that brings visibility and analytics to data for proper and intelligent tiering of data relative to its perceived value and access patterns.” Braunstein is not the only correspondent calling for structured handling of data.

Collaboration is key

“In my view, the industry’s top priority, even if it’s not always clearly defined, hasn’t really changed – to me, it’s always been fundamentally about collaboration,” says Alex Timbs, Sr. Business Development Manager for Media and Entertainment at Dell Technologies. “This means different things in our respective businesses but is likely key to almost all the M&E creation, distribution and consumption strategies that have succeeded, and will succeed. It’s about quality, quantity and efficiency, and now more than ever, about the value technology can deliver. In some cases, this means granularity, i.e. the ability to match resources surgically to the business need; in others it’s about consumption models, or flexibility to deal with short yet very high-amplitude waves of infrastructure demand.”

Collaboration is key for Studio Network Solutions too. “The drivers of change in storage are all about workflow,” says Melanie Ciotti, Marketing Manager. “From speed to security, remote connectivity to NLE integrations, innovations in storage stem from the industry’s need to collaborate better, create faster, and work more efficiently.  It’s all driven by workflow. For example, at SNS, we developed Nomad and SNS Cloud VPN to help our users work from home at the onset of the COVID-19 pandemic. Foreseeing their workflow challenges drove us into immediate action toward a solution, and we continue to innovate based on our users’ current and anticipated workflow and storage needs.”

“Of course health and safety for productions is bending the curve towards virtual collaborations and remote workflows for everyone,” says Dan Montgomery, CEO of Imagine Products. “One trend we noticed this year is a definite uptick in short-term software leasing. This use model has all but replaced the traditional 'permanent' ownership scenarios.”

Ethernet workflows

For Daryl Heinis, Scale Logic CTO, high-performance Ethernet-based workflows are top of the list. “Whereas fiber infrastructure has become prohibitively expensive, we see the cost of 25 and 100Gbit Ethernet and NVMe decreasing rapidly. We have applications supporting high-performance workflows using Ethernet, changing how content is edited at high speed. We have AI and other data analytics driving automated workflows, which call for single global namespaces and data movement to the cloud. We have remote personnel driving change in how content is edited and accessed in parts of the workflow where previously this was done on-premise only.”

David Phillips, Principal Architect, M&E Solutions at Cloudian, identifies two drivers of change: “The real drivers are at the opposite ends of the Store spectrum. On the one hand the ultra-performance capabilities that NVMe devices and fabrics are introducing to Tier 0/1, and at the other end, the supremacy of WAN accessible S3 object storage as a scalable repository for all unstructured data.”

Tumbling prices, increasing competition – how do Store vendors differentiate themselves?

Cloudian’s David Phillips thinks that customer service remains the key differentiator: “I think there will always be a strong market for enterprise storage vendors that bring to market the economic advantages of open-source software and commodity hardware, backed by passionately dedicated engineering and customer support teams. There are many product offerings that offer similar capabilities, often at similar price points. Ultimately what makes customers continue to invest in any product offering is a positive customer experience, and that is why we are very proud to have some of the highest customer satisfaction scores in the industry.”

Workflow and collaboration

Customer satisfaction is also important for Studio Network Solutions. “Our 99% support rating and worldwide reputation for reliability help differentiate us,” says Melanie Ciotti – but there’s more to it as well: “Everything we do is designed to improve the workflow of media professionals, and our customers value that attention to their needs. EVO is the leading high-performance shared storage server purpose-built for creative media teams. While our hardware is powerful and reliable enough to speak for itself, EVO is much more than shared storage. It’s a complete workflow solution with an included suite of software tools that sets us apart in the Store marketplace.”

Workflow and collaboration are at the heart of Caringo’s offer too. “At Caringo, we have been following the distributed workflow and the increasing file size trends for a few years now and have focused on enabling efficient collaboration, file protection and file delivery/distribution all from the storage layer,” says Adrian J Herrera. “We have features like web-based content management that includes the ability to easily search for, tag and share files. We have also added features like partial file restore and file clipping—all processed on the storage layer. This provides our users with a platform that provides intelligent data management as well as object storage.”

Tools of choice

“We’ve always been about openness and the ability for customers to connect to the creative tools of their choice,” says EditShare CTO, Stephen Tallamy. “There are many vendors that will offer non-media specific storage and that’s something that price doesn’t get you.”

Enabling choice of tools is important for Scale Logic too. “Scale Logic has always had an interoperability lab, which differentiates our company as a value for long-term investments,” says Daryl Heinis. “In this lab, we have 2PB+ of physical storage available, including HDD, SSD & NVMe running all NLEs and most common workflow tools used for ingest, asset management and archive. We constantly test our storage with these NLEs and tools to optimize performance. Not only will our customers feel good about their initial investment: they can also see Scale Logic is there for them for the long term, as their facility makes changes into the future. Scale Logic is able to fine-tune value to the workflow. We also differentiate by providing 24/7 enterprise-level support for large companies – instead of calling multiple OEMs about your hardware infrastructure, you can make one call to our team and we take care of everything.”

For Nick Pearce-Tomenius at Object Matrix, it’s not about price but value: “There will always be battles on price from generic IT vendors but the broadcast technology buyer knows better than to look at $ alone; as the saying goes, buy cheap, buy twice. Object Matrix has focused on providing the media industry with intelligent and cost-effective solutions that not only bring operational savings but also give producers the ability to access content from anywhere, meaning they can do more. Every $1 spent on our product MatrixStore Cloud brings at least double that in operational savings and enables our customers to generate more content and thus revenue.”

People matter

Alex Timbs thinks it all comes down to the people factor. “While Dell Technologies has the best, most comprehensive solutions, our most significant differentiation lies in our relationships.  While technology is always evolving, human nature is not, so it’s businesses that understand this fact and focus on building trust that have enduring relationships. Dell Technologies achieves a true partnership by understanding this universal nature, by hiring and training the right people, and through encouraging behaviours that build trust and integrity.” He also puts listening high on the list:

“When you listen, you learn things you would not if you were instead speaking, or pitching your offer without first understanding the need. This also allows our business to realise when we need to pivot to better align with what our customers need, not what we think they need.”

Relationships and customer focus are central for Spectra Logic too, according to Jeff Braunstein. “Spectra Logic firmly believes that collaboration and partnership are a great source of opportunity and improvement in business. In fact, Spectra’s business development team works closely with customers and ecosystem and channel partners alike to plan, develop and deliver the company’s broad range of digital media storage solutions for media and entertainment. We differentiate our company in the marketplace by keeping that customer focus and delivering solutions that seek to maximize our customers’ return on investment (ROI) by optimizing asset management throughout its lifecycle and providing users with the visibility into their data that they need to make intelligent decisions about storage and asset management.”

While InSync Technology is not a storage vendor, “We add value within storage solutions through video processing at the point of acquisition or the point of content usage,” says Paola Hobson. “For example, a programme maker wanting to use a piece of SD 4:3 archive content in a 1080p project will need deinterlacing, aspect ratio conversion and picture format up-conversion. InSync offers these solutions in both hardware and software so they can be integrated into any storage and archive workflow.”

Monetizing archives

With the growth of VOD, the industry saw a move away from “dead” archives to on-line storage systems that can be easily accessed for content monetization. This trend accelerated during the Covid pandemic as media companies needed to dip into their archives like never before. But are there cost implications, and if so, are they justified? Unsurprisingly, this question elicited a wide range of opinions from our correspondents.

Melanie Ciotti at Studio Network Solutions agrees that the trend has lasting value. “Keeping an easily accessible archive of media is more useful now than ever before, particularly when new productions are stalled during the pandemic. While it may be a significant infrastructure upgrade to serve VOD audiences, it’s a product of the times. With the amount of content being produced, it’s important for studios to relaunch and sometimes retool previous projects when an opportunity arises. Creative post-production teams need tools to efficiently access their wealth of media and prevent snags in their workflow. With so much content available, media asset management software like ShareBrowser helps users query and filter media libraries, saving critical time that can be reallocated to other jobs.”

Intelligent tiers

For Spectra Logic, it’s about structuring storage tiers to reflect business and operational requirements. “Spectra Logic helps users implement a modern storage lifecycle management solution that provides insight, automation and management for the storing, accessing, sharing and preserving of growing asset repositories,” says Jeff Braunstein. “This type of solution is capable of aligning the current value of the assets with the proper storage tier, enabling automatic, recurring transfer of inactive or unmanaged content from the expensive Primary Tier, made up of solid state/enterprise disk and NVMe, to the more affordable Perpetual Tier, consisting of Cloud, object storage, NAS and/or tape. Organizations can configure the Perpetual Tier to be as responsive as their workflows demand – creating copies on NAS and disaster recovery copies on cloud or tape.

“Repositories like our BlackPearl Object Storage Disk, with spin-down technology that powers down bands of storage when idle, provide low-cost disk that scales in capacity like tape, with performance that enables digital assets to be available in seconds. At the same time, users can continue to have familiar access to all assets for as long as required. In this manner, companies are able to leverage the cost benefits of technologies traditionally used for archiving, while balancing speed of access.”
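A minimal sketch of the kind of policy such lifecycle management automates is shown below. The asset records, tier names and 90-day threshold are invented for illustration; a real deployment would drive this from the MAM or the storage platform's own catalogue and policies rather than a hard-coded list.

```python
from datetime import datetime, timedelta

# Hypothetical catalogue entries – in practice these would come from a MAM
# or the storage platform's own index.
ASSETS = [
    {"id": "promo_0142",       "last_access": datetime(2020, 11, 2), "tier": "primary"},
    {"id": "final_2019_match", "last_access": datetime(2019, 7, 14), "tier": "primary"},
    {"id": "ident_4k_loop",    "last_access": datetime(2020, 12, 1), "tier": "primary"},
]

def plan_demotions(assets, now, idle_threshold=timedelta(days=90)):
    """Flag assets idle beyond the threshold for demotion from the primary tier
    (NVMe/enterprise disk) to a perpetual tier (object storage, cloud or tape)."""
    return [a["id"] for a in assets
            if a["tier"] == "primary" and now - a["last_access"] > idle_threshold]

print("Demote to perpetual tier:", plan_demotions(ASSETS, datetime(2020, 12, 15)))
```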

A tiered approach is also favoured by Caringo. “Keeping all assets online was a trend that started years ago but has accelerated due to the pandemic,” says Adrian J Herrera. “If you look at a cost analysis per TB of offline systems like tape and online systems like object storage, then the offline systems will inevitably be cheaper. But, when you factor in the continued ability for monetization through remote workflow enablement and immediate content distribution and delivery, then the conversation moves from a cost discussion to a profit-and-streamlined-operations discussion...it isn’t an “either-or” discussion. There should be a tiered approach to architecting an on-line storage solution that meets an organization’s specific requirements with a cold or offline tier available for cost optimization. At Caringo, we have worked strategically to integrate the ability to move data to different storage tiers in a practical way and made it a standard part of our solution. In many instances, there isn’t a need to add an additional data mover or HSM application.”

For Stephen Tallamy, CTO at EditShare, it’s about the right tiers too. “Through the use of proxy editing, it has become very cost effective to use deeper tiers of storage for cloud-based editing. The ability to restore relevant portions of media makes this even more efficient. For on-premise systems, intelligent tiers of storage allow users in one location to promote near-line assets in other locations to local, high performance systems for immediate use.”
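One common way to restore only the relevant portion of an asset held in S3-compatible object storage is a byte-range GET, sketched below. The endpoint, credentials, bucket, key and byte offsets are placeholders for illustration; this is not a description of EditShare's actual implementation.

```python
import boto3

# Placeholder endpoint, credentials, bucket, key and byte range – illustration only.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.internal:9000",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Fetch only the slice of the high-res file that the restored portion needs,
# instead of recalling the whole asset from the deep tier.
response = s3.get_object(
    Bucket="archive-hires",
    Key="final_2019_match.mxf",
    Range="bytes=1048576-10485759",   # ~9 MB window into the file
)
with open("partial_restore.bin", "wb") as out:
    out.write(response["Body"].read())
```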

Monetizing archives

“Dead archives are still there and we completely see the need for the archive to be monetized, which is best done by being able to search the metadata of your archive and being able to access your content in a reasonable timeframe,” says Daryl Heinis at Scale Logic. “The most efficient and effective ones deploy MAM and spinning disks, either 100% on-prem or a mix of on-prem and cloud. We help users deal with these increased costs through our affordable price point for spinning-disk archives and very high-density rack-mount chassis, which lower power and cooling requirements and save real estate. Scale Logic can fit 1.5PB of storage in an 8U form factor. Scale Logic also offers complete interoperability with multiple MAM partners via our global file system. COVID-19 has pushed a number of users to create an online archive, which removed the need for direct human involvement. No longer did these companies have someone available to ‘fetch a tape’; instead, they had to relocate their archives.”

“The scenario of migrating legacy data tapes to an online or ‘active’ archive can end up being a bit of a catch-22,” says David Phillips at Cloudian. “If, in an effort to more effectively monetize archived media assets, the assets are migrated to an expensive NAS tier in order to facilitate search and retrieval, any additional monetization revenue can quickly become offset by the additional infrastructure overhead. At the multi-petabyte level, we strive to keep our cost per gigabyte below that of the public cloud. From there it is easy to offer a lower TCO because there are no bandwidth or egress fees. In the current climate, with the production of theatrical and live events severely limited, you see many companies scrambling to scour their archives for content to repurpose, only to run into the reality that they have paltry metadata about the exact content of their assets. The frustration of looking for ‘needles in the haystack’ has certainly driven many in the industry to accelerate their plans for migration to an object storage-based active archive with enriched metadata and Google-like search.”

Tape v cloud archiving

It's what you can gain from having immediate access to the archive that matters to Object Matrix. “Traditional archive media is indeed cheap to keep on a shelf but you cannot AI a tape, you cannot analyse a tape and you cannot gain instant access to the archive to monetise content on demand,” says Nick Pearce-Tomenius. “The question is less about TCO of a platform but more about the TEB (Total Economic Benefit) that a solution can bring to the organisation. Object Matrix customers like BT TV and Orange in France moved away from LTO technologies in their VOD platforms over a decade ago and have not looked back. They have saved time and effort operationally and all of their archive content is available via APIs integrated into their management systems.”

Tape lives on

Imagine Products has skin in the LTO game, and Dan Montgomery sees tape being relevant long into the future. “While cloud storage for near-term use is here to stay, hybrid solutions will continue to combine low-cost tapes and network storage alongside cloud options. The key is ultimately being able to index and locate the material regardless of where it resides. The lack of bandwidth access in remote and rural areas, plus cloud ingest and egress costs, will continue to drive mixed storage solutions for the foreseeable future. To this end, affordable indices and proxy sharing options via the cloud will have a place in most workflows. This will add more automation and convenience in content tracking from acquisition through archive, without the overhead of fully storing in a cloud server. People took advantage of the downtime earlier this year to get organized and properly archive material. And yes, that meant more LTO tape archiving as a long-term, inexpensive asset keeper.”

David Phillips at Cloudian also sees a bright future for tape. “Data tape is going to continue to play a crucial role in the future as an inexpensive means of storing the increasing deluge of assets that require long-term retention. Object storage, on the other hand, offers an unbeatable combination of speed and searchability, especially when utilizing automated metadata enrichment. We are seeing organizations adopt a hybrid approach in order to get the best of both technologies.”

“The future is bright for cloud storage services, but it’s not yet practical for every production house to make the change,” adds Studio Network Solutions’ Melanie Ciotti. “For studios that archive and retrieve media on a daily basis, tape is still king.”

Stephen Tallamy at EditShare sees only a limited future for tape. “Tape archives have a place for regulatory or financial purposes where access to the content is extremely infrequent. Cloud-based archival can be policy-driven so that it can serve as a low-cost alternative with faster, cheaper return parameters.”

Nick Pearce-Tomenius at Object Matrix too sees little use for data tape beyond deep archive. He’d choose “Cloud. Local, private or hybrid, but not public in isolation. Use public deep cloud archive as part of a multi-cloud approach, for fire-and-forget content or the ‘oops… absolutely everything has gone wrong’ strategy. For content you need to access frequently, neither LTO nor public cloud archive platforms make sense.”

Spectra Logic’s Jeff Braunstein has a more pragmatic approach: “In today’s media storage workflows there’s no longer a question of whether to use tape or cloud, but where to use tape and where to use cloud. Organizations that require high access to digital assets over time should consider aspects such as the high cost of cloud egress fees and connectivity bandwidth; in such cases, maintaining additional on-premise copies of data is often the best solution. And if we are talking about large volumes of data, that means tape. With a modern approach to storage lifecycle management, organizations can ensure digital assets are located in the right place at the right time, be it tape or cloud, delivering affordable long-term protection and access to content while helping organizations become more effective by using new technologies at what they do best.”

Extracting value

“Archiving to private object storage is the best bet,” says Dell Technologies’ Alex Timbs. “TCO is less than public Cloud, access times are faster, and there is no LTO migration to worry about. It gives organisations full access to their archives without having to worry about incurring extra fees for a busy time period. Anything on tape is essentially stagnant content, with no ability to extract value. However, the cost to store it is relatively inexpensive. Depending on what tier you are using in the Cloud, the same issues exist. However, if it’s sitting on an active tier in the Cloud or on-premise, it does allow you to extract more value, offsetting the extra cost of storing the content in an active manner. The other benefit of Cloud is that there are currently readily available services you can run against the content that is there. However, we are seeing more customers look to extract this value locally on-premise, for reasons ranging from the cost of data movement to data gravity.”

Is cost hindering cloud adoption?

We have seen the adoption of Cloud focusing on collaborative workflows, but is the cost of moving content in and out of the Cloud hindering adoption? Or will on-premise storage continue to be a better economic and/or operational proposition in some applications? This question also provoked a range of opinions.

“Moving data is still a challenging, expensive proposition,” says Dell Technologies’ Alex Timbs. “This is unlikely to significantly change in the near future. In fact, this is an issue that will likely get worse before it gets better as data volumes are increasing much faster than the evolution of new technologies to move that data around. There are two insights that are driving how organisations and workflows adapt to this reality.  First: data only gets moved when value is added by moving it. This requires careful examination of workflow steps and a deep understanding of data sets and their worth. Second, moving applications to be close to data is cheaper/easier/better than going the other way around. An understanding of those points should guide any adoption of any remote collaborative technologies.”

A matter of scale

“The true cost of the cloud depends on utilization; that is, if it costs them money then it will cost you money,” says Caringo’s Adrian J Herrera. “From a business-model perspective, these costs are recurring in perpetuity. What this means is that you will always pay for what is used. With this in mind, you need to understand where the true value of cloud is for you. From a compute perspective, it is usually the ability to enable a burst of infrastructure, processing power or bandwidth in a way that can be scaled down when not needed. The value for storage, however, is a bit different. You rarely scale your storage needs down. What we see more often than not is, from a storage perspective, once you hit the 100 TB+ threshold and you are able to predict your growth needs, it is always more cost effective to keep content on-prem in a way that is still accessible by native-cloud workflows.”
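A deliberately simplified comparison illustrates the kind of arithmetic behind that threshold. Every price, volume and timeframe below is an invented assumption for the example, not any provider's actual rate card or Caringo's analysis.

```python
# Illustrative cost comparison – every figure here is an assumption, not a quote.
capacity_gb = 100 * 1000            # 100 TB working set
months = 36
cloud_storage_per_gb_month = 0.023  # assumed object-storage list price ($/GB-month)
cloud_egress_per_gb = 0.09          # assumed egress rate ($/GB)
monthly_egress_gb = 20 * 1000       # assumed 20 TB/month pulled back for workflows

cloud_total = (capacity_gb * cloud_storage_per_gb_month
               + monthly_egress_gb * cloud_egress_per_gb) * months
onprem_total = 60_000               # assumed all-in hardware, support and power for 36 months

print(f"Public cloud over {months} months: ${cloud_total:,.0f}")
print(f"On-prem over {months} months:      ${onprem_total:,.0f}")
```

The point of the exercise is not the absolute numbers but the shape of the model: cloud storage and egress charges recur every month, whereas an on-prem system is largely a one-off cost spread over its lifetime.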

Hybrid strategies

“I think you are going to see the increasing adoption of a hybrid cloud strategy,” says David Phillips at Cloudian. “There is no question that storing large hi-res assets in the public cloud is much more expensive over the long term than storing on-prem, and the cloud pricing trends don’t seem to point to that changing anytime soon. There is also no question that the public cloud is a great place to host proxy-based collaboration workflows, which is why you see so many MAM vendors adopting S3 storage as a primary asset repository. Once the edit is locked and approved, the EDL can link to the hi-res assets living in on-prem S3 storage for final output.”

EditShare also sees the cloud as the ultimate destination, but for now keeping hi-res media on-prem is the answer. “It’s beneficial to work in the most efficient codecs in the cloud when content needs to egress, in order to minimize costs,” says Stephen Tallamy. “However, there are very efficient proxy-based workflows that allow you to keep your original materials on-premise but provide access to editors around the globe – this allows collaboration without incurring the costs of taking your final product to the customer. Ultimately, we are heading to a future where the content arrives in the cloud and doesn’t egress until it’s being delivered to a consumer. In the meantime, hybrid workflows that allow the high-resolution and proxy forms of media to sit in different storage locations let creatives build a flexible workflow at the right price point.”

Object Matrix’s Nick Pearce-Tomenius sees cost as the key issue, and points to a hybrid future. “Public cloud storage companies make their money on data services and customers retrieving the content they own. Some might call that the ultimate in ransomware. Not me though. The immediate future for some will be cloud first if their business models can support the cost and performance profile those platforms bring. Others are looking at a hybrid approach utilizing the power of on-prem workflows with the elasticity and data services that public cloud brings. We are also seeing a surge in interest in private cloud storage or managed services like MatrixStore Cloud that offer the same level of commercials as the public cloud providers but with no egress fees, predictable long-term financials and the ability to access all content from anywhere without penalties.”

Spectra Logic also envisages a hybrid outlook. “For the foreseeable future, hybrid storage solutions will enable organizations to reduce data storage costs while optimizing data protection by storing some data in the cloud and some using on-premise disk and tape solutions,” says Jeff Braunstein. “When implemented with a modern approach to storage lifecycle management, hybrid cloud solutions effectively deliver affordable long-term protection and access for data by ensuring data is located in the right place at the right time throughout its lifecycle. With modern storage lifecycle management in place, users can reap the benefits of cloud and on-premise storage technologies, balancing cost and access and adapting to changing workflow needs as technology evolves.”

Studio Network Solutions agrees. “Who says it has to be one or the other? We have many users that enjoy a hybrid on-prem and cloud storage workflow,” says Melanie Ciotti. “The problem with cloud-only storage workflows is that the cloud isn’t built for the massive footage transfers common in creative media. It isn’t ready for the collaboration and speed that media workflows demand. Cloud storage can be an integrated part of a user’s shared storage workflow – serving an important role in backup and replication, file sharing with clients, etc. – but I believe on-premise shared storage will continue to offer a better solution to the media production community for the foreseeable future.”

“The cost of collaborative workflows in the cloud is hindering adoption and will continue to do so,” says Daryl Heinis at Scale Logic. “In-and-out fees are the main concern, and tracking the costs is also an issue. To effectively contain these two main concerns, companies must adopt a proxy workflow or a complete 100% cloud solution that aims to limit the back-and-forth of data movement. Scale Logic continues to look at the value of all our solutions – including our Remote Access Portal (RAP) and our economical shared storage and archive solutions – to ensure they continue to pay for themselves within three years’ time and, in some cases, a lot sooner. Investment and ROI continually play into our product development. No longer does shared storage have to be in the same building as you.”

How is cloud changing store workflows?

For Object Matrix, the answer is simple. “The cloud, be that local, hybrid or private, is enabling content to be liberated in many ways from single workflow silos or ‘content jails’ as some call them. If content is in a shared cloud storage bucket and available on the network via APIs or standard protocols then more can be done with it by internal or external teams,” says Nick Pearce-Tomenius.

Caringo’s Adrian J Herrera agrees. “The cloud, or specifically the ability to support cloud-based workflows, is becoming a checkbox item for storage workflows. What this often means from the storage perspective is support for cloud APIs and interfaces like Azure or the de facto standard Amazon S3 API.”
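The practical upshot of S3 as a de facto interface is portability: the same client code can point at a public cloud bucket or an on-prem S3-compatible store simply by changing the endpoint. A minimal sketch using boto3 is below; the endpoint, credentials, bucket and object names are placeholders, and no particular vendor's product is implied.

```python
import boto3

# Placeholder endpoint and credentials – swap for AWS or any S3-compatible
# on-prem object store; the application code itself does not change.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.internal:9000",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Push a proxy file into a bucket and list what is there.
s3.upload_file("promo_cut_v3_proxy.mov", "edit-proxies", "promo_cut_v3_proxy.mov")
for obj in s3.list_objects_v2(Bucket="edit-proxies").get("Contents", []):
    print(obj["Key"], obj["Size"])
```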

No way back

“Cloud-based workflows have introduced production teams to the experience of ‘accessing all our assets from anywhere’ and once they get a taste of that, it is difficult to get them to go back to legacy workflows based on LAN shared storage,” Cloudian’s David Phillips adds. “I think the other big change is using object storage clouds to extend the capacity of Tier 1 SAN and NAS volumes and thereby “right-size” storage allocations according to performance and capacity needs.”

EditShare’s Stephen Tallamy welcomes the speed and responsiveness of cloud workflows. “Full systems can be set up at a very rapid pace. This immediate access to unlimited storage capacity reduces the overhead of trying to plan your storage capacity up-front. If you need more storage, you can have it within minutes. If you need more throughput for different codecs or more editors, again you can scale out to meet this need, then contract when finished.”

“Cloud will eventually change Store workflows for everyone but as of today, it is not an economically viable option for the average user, as transcoding fees can get prohibitively expensive,” says Daryl Heinis at Scale Logic. “If facilities are looking to use the cloud just because accessing content is impossible during the pandemic lockdown, Scale Logic offers a Remote Access Portal that can allow remote edit clients to access on-prem storage content very securely over HTTPS. RAP will then seamlessly sync the edited content back to the on-prem storage.”

Jeff Braunstein of Spectra Logic again emphasises the importance of the right content on the right storage: “Cloud has become a ubiquitous part of many media storage workflows, but leveraging it to its fullest value requires a modern approach to storage lifecycle management that brings users the visibility and insight into storage they need to better manage digital assets by enabling intelligent tiering and migration, while maintaining transparent search and seamless access to migrated assets.”

Melanie Ciotti at Studio Network Solutions has seen cloud workflows blossom during lockdown. “We’re seeing many users opt to back up project files, exported media, and more to cloud storage. This has been an incredibly useful workflow for many creators working from home who don’t have remote access to their on-prem storage server. By automating the file transfer and backup process with Slingshot (EVO’s built-in automations GUI and API), teams can spend less time manually transferring their media to cloud storage, and more time creating and editing it. Of course, there are many teams that still need remote access to their on-prem storage. Our newest service, SNS Cloud VPN, gives users a secure connection to their EVO shared storage from wherever they need to be.”

The final word in this section goes to Alex Timbs at Dell Technologies, who has some searching questions to ask – questions he believes anyone considering a move to the cloud should answer for themselves. “While most M&E businesses need to embrace cloud on some level for aspects of their pipeline, I don't think the value proposition has changed that much in an M&E context in the last ten years; it's just more accessible, has a richer ecosystem, and is more competitive than it was. Customers still need to ensure they answer the ‘Why’ before committing to a strategy.

“The cloud has immense value, but it isn't the answer to everything, particularly in media workflows. I raise this because I have seen so many businesses fail to ask and answer the ‘Why’ in relation to their cloud strategy, resulting in overwhelming complexity, slower pipelines and bill shock. When I refer to answering the ‘Why’, I mean customers need to answer a few fundamental questions; some examples I would want to answer for myself are below:

  • In almost every use case, the cloud isn't cheaper for sustained activity, but it's excellent for peaks – so where will you use it in your workflow to best take advantage of this?
  • Cloud often increases complexity, particularly in hybrid workflows, so do I have enough subject matter experts and clarity of outcomes?
  • You need to get a lot more prescriptive with your data, a) because every GB per minute costs you, and b) because you need to ensure it's where the compute, user or process is. So you need to know where your data is currently, and where it needs to be throughout your workflow (a rough cost sketch follows this list).
  • How will you manage cost controls and who will approve them?
  • How do you intend to migrate your technical debt (if you have any), and have you factored that into your budget?
  • Can your intended workflow be containerised, and will it be more efficient in doing so?
  • Will you have the resources you expect when you need them? As an example, some large media customers may find they need to use all availability zones in order to secure enough resource at the right price, and need to move data and/or cache across them, with massive associated financial and time costs due to data movement.
  • Beyond being able to address short-term resource demands, how will Cloud improve speed, number of iterations, or reduce costs?
  • In most businesses, innovation comes from testing a new idea or questioning the status quo, which generally requires a little risk-taking to prove out. When using cloud resources, every tested idea comes at a cost, so the risk is that innovation is stifled by cost aversion.”
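To illustrate the cost point above, the following back-of-envelope Python sketch shows the kind of sum worth doing before committing to a hybrid workflow. The storage and egress rates are placeholder figures for illustration only, not any provider’s actual pricing.

# Rough back-of-envelope sketch of the "every GB costs you" point above.
# The per-GB rates are placeholders, not any provider's actual price list;
# the shape of the calculation is what matters when planning hybrid workflows.

EGRESS_PER_GB = 0.09          # hypothetical cloud-to-internet egress rate, USD
STORAGE_PER_GB_MONTH = 0.023  # hypothetical object storage rate, USD

def monthly_cost(stored_tb: float, egressed_tb: float) -> float:
    """Return an indicative monthly bill for storing and pulling back content."""
    stored_gb = stored_tb * 1024
    egressed_gb = egressed_tb * 1024
    return stored_gb * STORAGE_PER_GB_MONTH + egressed_gb * EGRESS_PER_GB

# A 200 TB library with 30 TB pulled back to on-prem edit suites each month:
print(f"${monthly_cost(200, 30):,.0f} per month")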

What's coming next in content storage technology and workflows?

We asked our correspondents what they are working on right now to further enhance storage technology and workflows over the coming months and years.

“For content storage specifically, we will continue to increase the functionality of what can be done on the storage layer. We already have functionality like partial file restore, video clipping, file sharing and file searching built into the storage layer,” Caringo’s Adrian J Herrera reveals.

Cloudian’s David Phillips maps out ambitious plans: “Our initial product focus from our founding in 2011 was on delivering scale-out distributed storage clusters, accessed via the S3 API. In the past couple of years, we have really focused on security, as you can’t offer storage to governmental agencies, much less media enterprises, without very rigorous tech stack auditing and security certification. This also led to the introduction of our S3 Object Lock feature for ransomware protection. Now, with the increasing commoditization of NVMe technologies, we are seeing demand not just for higher-performance storage but for high-performance storage that is API-addressable and can scale out to serve enormous data sets. To address this demand, Cloudian recently introduced flash-optimized object storage software that also provides 3X better price/performance than competitive offerings. In a nutshell, our aim is to offer highly scalable storage that is performant, secure, and accessible everywhere.”
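As an aside, the S3 Object Lock feature Phillips mentions is exercised through the standard S3 API. The sketch below (Python with boto3) shows roughly how an asset can be written with a retention period so it cannot be deleted or overwritten before that date; the bucket and file names are hypothetical and the exact workflow will vary by deployment.

# Illustrative sketch of S3 Object Lock for ransomware protection.
# Bucket and key names are hypothetical; the calls are the standard S3 API.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")

# The bucket must be created with Object Lock enabled.
s3.create_bucket(Bucket="protected-masters", ObjectLockEnabledForBucket=True)

# Write an asset that cannot be deleted or overwritten until the retention date.
s3.put_object(
    Bucket="protected-masters",
    Key="features/final_master.mxf",
    Body=open("final_master.mxf", "rb"),
    ObjectLockMode="COMPLIANCE",
    ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=90),
)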

“At EditShare we are planning more and more third-party/partner integrations to allow for better customization of the larger ecosystems,” says Stephen Tallamy. “We recognize that storage is just one piece of the puzzle and that by providing open APIs, customers can buy only the software they need in order to unlock their creative resources. On-premise, across multiple locations, and in the cloud, media needs to be available intelligently and leverage AI to maximize its use.”

InSync’s Paola Hobson is looking to ensure that its customers can readily access every piece of content. “It's clear that broadcasters and media companies need to have well-stocked archives to be ready for any situation where production of new material gets interrupted (e.g. as we have seen recently with the COVID-19 pandemic). Archive content also helps fill schedules, which helps cut costs. It's important for content owners to be confident that they can re-use their stored assets, so frame rate and format conversion solutions, such as those offered by InSync Technology, will continue to be critical tools in the monetisation of content. When 8K production in HDR becomes the norm and content producers are already experimenting with the next new trend, content owners will continue to happily use their conversion tools (from InSync Technology, of course!), confident that they can monetise their material long into the future.”

Daryl Heinis at Scale Logic: “One of our main aims is to integrate seamlessly and natively with S3-compliant cloud storage. We are also actively progressing client- and server-side performance with RoCE RDMA technologies, as our clients request zippier metadata performance and higher top-end throughput. Single global file systems above 20 GB/sec are also on the roadmap, and we are actively researching and planning for 200 and 400Gb Ethernet.”

“Most organizations are now generating and utilizing content both in multiple clouds and at multiple on-premise locations,” says Spectra Logic’s Jeff Braunstein. “These organizations are also moving content between these cloud and on-premise locations. Data management solutions must account for these hybrid workflows. Spectra Logic continues to focus on tighter integration and management of content in these locations.”

“It’s no secret that many studios are eyeing further implementation of remote workflows into their business,” says Studio Network Solutions’ Melanie Ciotti. “In 2020, we’ve addressed this need for remote connectivity to on-prem storage and file portability with our proxy workflow and SNS VPN solutions. In 2021, we will continue to optimize the remote workflow solutions our customers rely on to keep creating from anywhere.

“Outside of remote workflows, we’re updating our EVO GUI with a brand-new EVO OS v.7, bringing new features to our MAM system with ShareBrowser v.6.0.1, and keeping our engineers and product specialists busy with exciting new ideas in development. SNS has doubled down on what we already knew: the media production industry is not homogeneous. We’re committed to creating a storage environment that’s as flexible as the industry we serve, giving content creators the choice to make their projects in a way that works best for them,” Ciotti concludes.

For Dell Technologies, it’s “Highly performant, small footprint ‘edge’ storage with optimised data orchestration built-in.  Companies are centralising IT and archive resources, but for the foreseeable future, there is still demand for fast storage in multiple locations near to where talent is working.  This precisely aligns with the PowerScale roadmap and Dell Technologies vision,” Alex Timbs concludes.

The final word goes to Nick Pearce-Tomenius at Object Matrix. “It’s all well and good categorising storage as ‘on prem’ or ‘cloud’, but in the end these are layers of abstraction that interest IT nerds, not the users of those systems. The users just want their data available quickly and securely wherever they happen to be working from. So from a content storage technology perspective it’s about providing data for people where they want it and when they want it. Object Matrix is uniquely positioned in the media industry to provide on-prem through to cloud solutions for content storage and is building upon that technology base to provide a true cloud storage – one that spans all paradigms of storage in a secure and manageable way.”

Sony NMS: Successful Digital Supply Chains Scale as You Grow

Kunal Shah 

VP of software engineering and architecture

When it comes to a media organization’s digital supply chain, scalability is one of the keys to success, according to Kunal Shah, VP of software engineering and architecture at Sony New Media Solutions.

A successful digital supply chain will not only meet demands but scale as you grow – keeping pace with increased distribution volumes and delivery deadlines, and satisfying the requirements of all the different streaming platforms, all while ensuring content is securely stored and readily accessible, as the Sony division learned first-hand during its own journey to the cloud.

Sony has “been able to massively scale our digital supply chain by leveraging cloud technologies and meeting the clients’ needs,” Shah said during a May 27 presentation at the Hollywood Innovation and Transformation Summit (HITS) Live event.

There are various challenges and opportunities that media companies face when it comes to their supply chains, and “the first challenge that we see is [the] ever-evolving landscape,” he said during the presentation “Turbo Charge Your Digital Supply Chain to Scale.”

During the session, he explained how the Sony division transformed its platform using cloud technologies, allowing it to quickly adapt to evolving market demands, and also discussed what the company discovered throughout its journey and the path forward.

Media organizations are experiencing “tremendous growth in… end users,” according to Shah, adding that, as a result, “there is more demand for custom experiences.”

What is “very critical” for the core supply chain business and clients is “how we can be efficiently operating” and, at the same time, be scalable, he said. Cost control is “definitely something we need to keep in mind,” he noted.

Another major challenge facing the industry, “across the board,” is the fact that “we have different standards” – or a lack of standards, he said.

It made complete sense for the Sony division to move to the cloud, he said, referring to it as a “natural progression.” One “key factor [was] our ability to monitor and do some analytics,” he said, pointing to the significance of machine learning also. However, the ability to gain “dynamic scaling was one of the biggest reasons” to shift to the cloud, he told viewers.

While making that journey to the cloud, Sony “made a conscious decision that we will go slow and we will try to think fast because we want to stay ahead of the curve,” he told viewers, explaining it was important to “evaluate our designs and make meaningful progress.” That was “why the path we chose is we started with the lift and shift and we completed our entire migration,” he said, adding the company then entered the “phase of redesign/refactor, where we refactored a bunch of legacy code as well as started… some of our services and then finally we are in this phase where we are starting to be cloud native.”

Meanwhile, “one of the things that we firmly believe in terms of scalability” is that being cloud native “cannot be a goal – it has to be like a journey because you need to constantly evaluate the needs and the demands,” he said.

He went on to tell viewers: “At this point, we have been in [the] cloud for about three years and we have been able to massively scale and successfully meet all our demands, as well as requests for our clients and, at the same time, we have provided some recommendations which would help everyone in the supply chain.”

However, noting how short the presentation was, he conceded he was “barely scratching the surface” on this subject.


The May 27 HITS Live event tackled the quickly shifting IT needs of studios, networks and media service providers, along with how M&E vendors are stepping up to meet those needs. The all-live, virtual, global conference allowed for real-time Q&A, one-on-one chats with other attendees, and more.

HITS Live was presented by Microsoft Azure, with sponsorship by RSG Media, Signiant, Tape Ark, Whip Media Group, Zendesk, Eluvio, Sony, Avanade, 5th Kind, Tamr, EIDR and the Trusted Partner Network (TPN). The event is produced by the Media & Entertainment Services Alliance (MESA) and the Hollywood IT Society (HITS), in association with the Content Delivery & Security Association (CDSA) and the Smart Content Council.

Red Bee Media: Strength under pressure

A new global OTT managed service launched on time under remote working restrictions validates the strength of Red Bee Media’s relationship with client TV5MONDE, the resilience of its team and the flexibility of the platform’s architecture.

Red Bee Media, one of the industry’s leading managed services providers, was instrumental in the launch of TV5MONDEplus, the new global video-on-demand platform serving French language series, films, documentaries and children’s content in full HD quality to 193 countries. A project of this scale and profile would be worthy of note under any circumstance but is even more remarkable given it was successfully launched on deadline during this most challenging of years.

Extending the reach of French language programming to audiences across the globe is a central mission of TV5MONDE. The global TV network, which is supported by the governments of France, Canada, Switzerland, Wallonia-Brussels and Quebec, conceived of a new streaming service to deliver high-quality, free French-language content to users worldwide.

Specifically, TV5MONDEplus is intended to promote French-speaking content and the programs of its partner channels, TV5MONDE’s own productions, as well as co-productions and programs acquired from all over the world, across multiple device platforms.

The project was greenlit in the autumn of 2019, assigned a launch date of 09 September 2020, and put out to public tender.

Strength in depth

Red Bee Media’s relationship with TV5MONDE goes back several years, from the original integration of the broadcaster’s facility through to today, where it provides staff to operate and engineer its playout, production and post facility. Red Bee Media has also completed several other projects for the broadcaster.

“We have a long-term and strong relationship with TV5MONDE, but that was no guarantee that they would select us for their streaming service,” says Cong Thanh Nguyen, Key Account Manager for TV5MONDE at Red Bee. “We responded to the RFP as we would any other public tender process and, along with four competitors, were invited to proceed to the next stage. It was a strict process which included making several demonstrations of our proposed solution.”

TV5MONDE stipulated several criteria in the RFP. A key one was to be fast to market. “They had a strict deadline and they didn’t want to spend 18 months or more to get the service up and running from scratch. They were looking for a managed service provider.”

Secondly, TV5MONDE did not want a phased deployment. They required availability of the service in every region from day one.  “One of their essential requirements was visibility into the management of rights to assets published by region or by country,” says Nguyen.

In addition, the platform needed to be available on multiple devices including the web and set top boxes and with a strong focus on mobile to cater for audiences in territories where mobile is a major or primary means of streaming video.

Red Bee Media won the business at the beginning of 2020, and immediately set to work.

“They were looking for a partner that could provide a managed service end to end, including back-end and front-end apps, and that could serve several device types,” says Olivier Braun, Technical Product Manager for OTT at Red Bee. “Red Bee’s managed OTT platform has been in development for over five years and contains several cutting-edge technologies. One of its core features is that it is very modular, meaning we can tailor solutions for each client from multiple building blocks.”

The building blocks of successful OTT

One of those building blocks is an entitlement engine which is able to manage the rights for each individual TV5MONDEplus asset per region and per country.
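For illustration, the logic of such an entitlement check can be reduced to something like the following Python sketch. The data model and asset identifiers are invented for the example and are not Red Bee’s actual implementation.

# Minimal sketch of a per-asset, per-territory entitlement check.
# The data model and asset IDs are illustrative only.
from dataclasses import dataclass, field

@dataclass
class AssetRights:
    asset_id: str
    allowed_countries: set[str] = field(default_factory=set)
    blocked_countries: set[str] = field(default_factory=set)

    def playable_in(self, country: str) -> bool:
        """An asset is playable if the territory is licensed and not blocked."""
        if country in self.blocked_countries:
            return False
        # An empty allow-list here means the asset is cleared worldwide.
        return not self.allowed_countries or country in self.allowed_countries

rights = AssetRights("doc-0142", allowed_countries={"FR", "CA", "CH", "BE"})
print(rights.playable_in("CA"))  # True
print(rights.playable_in("US"))  # False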

Adherence to the highest standards of content security, protection and digital rights management (DRM) was also mandatory. All TV5MONDEplus assets are protected according to content owner rights. Red Bee Media’s solution defines both an MPEG-DASH and an HLS source, with Google Widevine, Microsoft PlayReady and Apple FairPlay DRM technologies.

TV5MONDEplus is a free-to-view, advertiser-funded model, which required integration of an ad serving solution. At RFP stage it wasn’t decided whether this would be server-side or client-side. In the event, TV5MONDE chose a client-side solution with partner France Télévisions Publicité as provider of inventory, using FreeWheel (a Comcast company) for video ads and Google for banner and in-app ads. The back-end integration into Red Bee’s platform was straightforward.

The service was to allow users to pause programming on one platform and continue seamlessly on another device. This required Red Bee to build in authentication and cross device synchronisation of content to the user’s account. Localisation is enabled with subtitles of French language content available in French, Spanish, English, German and Arabic. The user interface is also localised in five languages.

An absolutely vital ingredient for TV5MONDE was video quality, which had to remain consistently at broadcast standard even as bandwidth and devices varied from country to country. Red Bee used per-title encoding to analyse the complexity of individual assets for TV5MONDE in order to optimise the bitrate ladder and produce the best quality-to-bitrate ratio.

“The quality of video is extremely important to the client,” says Braun. “When we ingest content we use adaptive bitrate (ABR) streaming to provide multiple bitrates, which the video player can switch between based on the best available bandwidth. All assets are available in full HD.”
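Per-title encoding, in essence, replaces a fixed bitrate ladder with one scaled to the measured complexity of each asset. The following Python sketch illustrates the idea with an invented reference ladder and complexity score; it is not Red Bee’s actual encoder configuration.

# Simplified sketch of per-title ladder selection: scale a reference ABR ladder
# by a measured complexity score, so simple content gets lower bitrates at the
# same resolutions. The reference ladder and scoring are illustrative only.

REFERENCE_LADDER = [          # (height, kbps) for content of average complexity
    (1080, 5800),
    (720, 3500),
    (540, 2000),
    (360, 1000),
    (234, 400),
]

def per_title_ladder(complexity: float) -> list[tuple[int, int]]:
    """complexity ~1.0 for average content, <1.0 for easy material (e.g. talking
    heads), >1.0 for hard material (e.g. fast sport); clamped to stay sensible."""
    factor = max(0.5, min(1.5, complexity))
    return [(height, int(kbps * factor)) for height, kbps in REFERENCE_LADDER]

print(per_title_ladder(0.7))   # a studio drama needs less bitrate
print(per_title_ladder(1.3))   # a hand-held documentary needs more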

On the distribution side, the service taps into the network of TV5MONDE’s existing partner, Akamai. Since Red Bee Media’s platform is CDN agnostic the integration of the service with Akamai’s CDN presented no problems.

The end-user applications for the platform were built by Dotscreen using Red Bee’s standard Software Development Kit (SDK – available for iOS, Android, Smart TVs and JavaScript), which allows for seamless connections with Red Bee’s services for content display, playback, entitlement, analytics, security and streaming.

Deadline delivery under Covid conditions

The delivery date was immovable – global pandemic or not. Despite a six-week period from March when virtually the whole of Europe was in lockdown, Red Bee’s team kept the project on track.

“There’s no doubt that Covid-19 made everything a lot more challenging but we were fortunate in already operating a fairly dispersed team,” says Braun. “We have key team members in France, the UK, Sweden and Romania so we are used to collaborating on projects remotely.

“The most severe period of lockdown happened during the specification phase of the project when we were nailing down issues around the front-end UX and the app’s look and feel. Those sorts of conversations are much easier when everyone’s in the same room but on all sides we pulled together and just got on with it.”

Another advantage for maintaining business continuity was that Red Bee’s platform is entirely cloud-based, with no need for local deployment in Paris.

TV5MONDEplus continues to evolve both technically and editorially. New features are planned by the end of the year to improve the user experience on the web as well as in the iOS and Android applications. Distribution will be extended with the TV5MONDEplus application being made available on LG and Samsung connected TVs, then in 2021 on some American cable operators via Adobe Primetime.

Hélène Zemmour, Digital Director of TV5MONDE, says: “Press and user feedback since launch has been very positive. Our Francophone and Francophile audiences appreciate this rich and free offer, in French and subtitled in five languages. The catalog of more than 5,000 hours of cinema, series, documentaries, magazines and youth programs has enabled Internet and mobile users from all over the world to discover the diversity of French-speaking creation in Quebec, Canada, Switzerland, Belgium, France and Africa.”

Zemmour adds, “With Red Bee Media’s OTT platform, we got access to first-class streaming and broadcasting expertise, as well as crucial features such as advanced ad tech and geo-blocking functionality. This, in combination with Dotscreen’s design expertise, allows us to offer a high-end user experience comparable to the biggest streaming services available. We are looking forward to continuing this cooperation, developing TV5MONDEplus for the benefit of global audiences.”

Stéphane Grandvarlet, Head of Market Area Southern Europe and Managing Director of Red Bee France, adds: “We are very proud and excited to have been a part of this unique launch for TV5MONDE. By delivering a competitive global streaming service, in a very short period of time and in less than ideal circumstances, we have shown the strength of our OTT platform and our team.”

TV5MONDEplus is available at www.tv5mondeplus.com as well as via free apps on the Apple App Store and Google Play.

Making AI work for M&E

Muralidhar Sridhar 

Vice President, AI/ML Products

PRIME FOCUS TECHNOLOGIES

The initial euphoria over AI/ML seems to have died down in the M&E industry. Our research shows that many M&E players have run AI initiatives with different vendors but have not achieved anything substantial enough to solve their business problems. Though the demo was impressive, the project hit a wall at the Proof of Concept (PoC) stage because the AI solution did not work for their content! When the cycle was repeated with multiple vendors, they concluded that AI models are not available or mature enough to solve specific M&E business challenges.

The fact is, no M&E client’s business problems in their entirety can be solved effectively by any one AI engine or solution provider in the market. Also, the heavy lifting involving trial and evaluation of multiple vendors rests on the client’s overburdened shoulders. But do they have the required data science talent in-house to tweak the AI/ML engine for their enterprise’s data model and ensure accuracy and actionability? The lack of adequate expertise also scuttles any in-house project which attempts to build an AI/ML model.

To crack the impasse, what is needed is a media recognition AI/ML platform that brings the best-of-breed AI models and home-grown niche models to address the issues of accuracy and actionability. Plus, to tackle the talent gaps, consulting expertise in deep learning AI with computer vision knowledge is critical. For AI/ML to deliver for M&E organizations, no cookie-cutter approach will work; what is required is a tailored, bespoke model that embraces the unique data nuances of the client enterprise.

At Prime Focus Technologies (PFT), we believe that if AI is to work for M&E players, then it has to deliver accurate and actionable data which can solve unique business challenges. For this to happen, solution providers have to be open and committed to working with any AI/ML engine, have the data science talent pool to interpret the data and its subtle nuances, and tweak it to suit the needs of the enterprise’s content.

While the expectation from AI right now is to solve for accuracy alone, PFT has gone a step further and tried to solve for actionability as well. We offer a combination of Technology and Consulting to deliver accurate and actionable data that can solve specific M&E challenges seamlessly. This is how we make AI work for you!

PFT’s native media recognition AI platform CLEAR Vision Cloud helps solve real-world business problems for TV networks, studios and OTT platforms through its combination of technology and consulting. It integrates best-of-breed AI engines from Microsoft, Google, Amazon Web Services and IBM Watson, plus home-grown AI models, along with a unique Machine Wisdom layer that is focused on harnessing the best quality data. Alongside the technology, PFT’s bespoke strategic consulting services ensure AI works for the customer, taking into consideration their specific business challenges and unique content.

CLEAR Vision Cloud offers AI data at 3 levels (Basic, Advanced, Premium), along with additional data packs including Compliance, Comparison and Transcript. It also provides a set of powerful Action Toolkits that makes the AI harvested data actionable. Action Toolkits, as the name suggests, are ready-to-use to address specific M&E use cases. The Action Toolkits include – Discovery, Segmentation, Video Comparator, Content Moderator and Language Tools. These are enveloped by PFT’s unique Machine Wisdom layer that imparts cognitive benefits to CLEAR Vision Cloud. Think of it as an AI platform with a human brain!

CLEAR Vision Cloud AI Data Pack for Basic Metadata helps identify physical video segments (blacks, color bars, slates, pre-caps, re-caps, montages, essence); text & textless segments; specific captioned segments; and custom segments based on customer need across long form and short form content with 100% accuracy & 100% frame accuracy.

The Segmentation Action Toolkit allows users to review and QC automatically identified segments, filter out content segments and export EDLs, and generate a video of the custom segment by stripping out the rest of the physical segments. A substantial reduction in the time and cost of segment marking and content segment extraction is achieved. Automatic learning of segment signatures based on QC input is an industry first and a key enabler in workflow automation. While we have built a basic toolkit for segmentation, there are many more requirements that can be addressed and many use cases one can think of. We believe AI has the capability to address all of these effectively. PFT’s AI model is home-grown and customized to solve specific M&E use cases to make AI work for you.
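For readers who want a feel for the underlying problem, the sketch below shows the simplest baseline for one of these tasks: detecting runs of black frames by mean luminance, using Python and OpenCV. This is only the naive signal-processing approach; as the case study that follows explains, production tools need deeper cognition to cope with noise and source variations. The file path is hypothetical.

# Naive black-frame detection by mean luminance, as an illustrative baseline.
# Requires OpenCV (pip install opencv-python); the file path is hypothetical.
import cv2

def black_frame_ranges(path: str, threshold: float = 12.0):
    """Return (start_frame, end_frame) ranges whose mean luma is below threshold."""
    cap = cv2.VideoCapture(path)
    ranges, start, frame_no = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        luma = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean()
        if luma < threshold and start is None:
            start = frame_no                     # a black run begins
        elif luma >= threshold and start is not None:
            ranges.append((start, frame_no - 1))  # the run ends
            start = None
        frame_no += 1
    if start is not None:
        ranges.append((start, frame_no - 1))
    cap.release()
    return ranges

print(black_frame_ranges("ad_spot.mxf"))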

For example, a leading TV Station Group in the US was manually identifying and marking segments like color bars, blacks, slates, and content segments in the ads they receive for broadcasting on their network. This dependency on operators was time-consuming and added to their workforce costs, and there was always a possibility of error due to manual intervention.

Though there are a few technology solutions available, they have limited capabilities and cannot handle the vagaries and noise in the content. Solutions that identify blacks using image or signal processing cannot tolerate black and color bar variations from different sources, while detection of more sophisticated segments like slates needs deeper cognition and interpretation of the content, which is missing in the available tools.

The Station Group needed a solution that could precisely and frame-accurately identify these segments every time to ensure complete automation of the workflow. PFT provided its powerful AI platform, Vision Cloud, combined with consulting services to address these challenges. Vision Cloud is a sophisticated AI platform that recognizes these segments precisely and frame-accurately in the ad spots. Vision Cloud employs several computer vision and deep learning technologies, trained with huge amounts of past data, to natively deliver this solution. Machine Wisdom, PFT’s patented technology, is used to make sense of the cognition delivered by the underlying AI and ML models. As a result, Vision Cloud can identify and detect color bars, slates, and black screens accurately, even if they come from different sources with variations and noise. After a QC pass, the spots are sent for playout.

PFT’s Vision Cloud has brought the power of AI to the customer and ushered in complete automation of the workflow with no manual intervention.  The effort and cost saved as a result of this automation can be invested towards creative pursuits aimed at making the content more compelling and engaging.

COVID success story: PlayBox Technology

Phillip Neighbour 

COO

2020 has been like no other year – the COVID-19 pandemic has caused unprecedented change for consumers, businesses, and communities alike on a global scale. Despite lockdown measures causing many to remain within their homes, prompting an increase in free time and rising media and TV consumption, the broadcast media industry still felt the effects of the pandemic. Advertising revenues declined as consumers tightened their belts, streaming consumption increased as production efforts and live sporting events ground to a halt, and remote working became the new standard for our industry.

PlayBox adapted to these changes quickly and effectively, calling upon tried and tested remote working mechanisms to ensure that our team could work safely at home.

For our customers, we recognised their immediate need for seamless remote transitions. We introduced centralized tools across both the Neo platform and Cosmos for remote configuration, management and monitoring of video links between multiple locations. We also diversified the output options available to users, with SRT as an available replacement for RTMP. Looking to the future, our development team has been working hard to improve overall codec support for 8K media within our cloud-based Cosmos software. To provide further support for those hosting the new age of virtual events, we enhanced the SocialMediaBox plugin for TitleBox, which provides accurate filtering using a semi-automatic moderator engine, reducing the time needed to moderate featured posts during live events.

As a result of customer and industry feedback, we decided to rethink our approach to e-commerce. Across the playout automation and broadcast solutions market, it is too often cumbersome and difficult to get a quick, easy quotation for your project. Furthermore, as a result of COVID-19, we could not visit customers and finalise deals in person. We knew that we wanted to streamline that process - which led us to create the PlayBox Marketplace. The Marketplace allows our customers to purchase software licenses and full turnkey systems and renew their annual maintenance packages easily via our website.

Our global presence and flexible structure as a company has ensured that the remote working transition has been a smooth one. Our dedicated support team have been delivering exemplary service throughout, hosting free online demonstrations and providing timely 24/7 assistance to our customers.

As Covid-19 lockdowns expanded this year, PlayBox Technology saw both software only and turnkey solution sales of its PlayBox and Cosmos series rise along with the industry interest in remote broadcasting.

The company is proud to announce a 56% increase in sales compared to the same period of the previous year. The results were fueled by winning an international tender, by record revenue growth from the company’s entry-level playout solution, PlayBox Neo, which grew 131% YOY, and from its advanced playout solution, PlayBox Mega, as well as by continued growth of the PlayBox SaaS solution, which grew 54% YOY.

Approximately 98% of PlayBox’s new business revenue was generated outside of the UK. The North American and Far East markets together constitute PlayBox Technology’s largest market, with 78% YOY growth, generating more than one-third of PlayBox’s revenue in 2020. Additionally, revenue in the Middle East grew 56% YOY.

“We are very excited and thrilled about our achievements in 2020, including acquiring new customers beyond our estimates,” says Phillip Neighbour, PlayBox Technology’s COO. “The team also nearly doubled in size in one year, which has allowed us to focus on our cloud solutions, as well as innovation and product development.”

August saw us premiere the very first episode of our brand-new podcast - In the Hub - on all major streaming platforms. It’s our way of speaking to both up-and-coming entrepreneurs and legendary figures within the broadcast and media industries, discussing their stories and experiences and hearing their predictions for the future of broadcasting. We can’t wait to see how our new platform grows into 2021 and beyond.

“It’s been an exciting and incredibly insightful process,” says Neil Thacker, Go to Market Manager. “It’s an honour to be speaking to some legendary figures within the industry and enhancing our digital content offering through our weekly podcast.”

We also launched the Knowledge Base - our platform for FAQs, articles and guides that customers can access directly through our website. It’s a constantly evolving support hub, and we’re always looking for additional content to add. For the customers who want to know more about our solutions, it’s all in the Knowledge Base. PlayBox users can also share ideas and feedback within the Community. We also took time to create some helpful video product tutorials for our solutions, which can be found on our website.

With trade shows and conferences cancelled or postponed, we wondered what the future of large-scale, in-person events would look like. In May, we opened our virtual event stand on the company website - highlighting our products and learning materials and hosting interactive product demonstrations. We saw some incredible traction from new and existing customers alike, and we have built some exciting new partnerships and relationships as we venture into a new age for playout. As much as we tried, though, it still doesn’t beat seeing our customers and partners face to face and sharing a drink (or two).

Despite 2020 not playing out how any of us expected it to, we’re confident that the industry as a whole will emerge stronger for it. The team at PlayBox Technology UK have come to welcome the pressure that such unprecedented times can create. We’re innovating every step of the way and we aren’t afraid of change - and our worldwide customer base, with more than 20,000 playout systems deployed, is reaping the benefits of our flexibility. We can’t wait to show you what our development teams have been working on - further enhancements, initiatives and an entirely new product line. Exciting times ahead!

Bringing the highest production standards to Greenland

Despite being physically one of the largest countries in the world, Greenland has a small, and typically isolated, population. Even so, it supports a national broadcaster, Kalaallit Nunaata Radioa (KNR). The broadcaster is government funded but remains decisively independent.

It broadcasts largely in Kalaallisut, Greenland’s native language, along with Danish (Greenland is a self-governing part of Denmark), and there are ambitions to add English as well. Remarkably, KNR creates 1,000 hours of unique television each year, along with 2,600 hours of radio – no mean achievement for an organisation with only around 85 employees.

Alongside radio and television, KNR has a strong online presence. All this adds up to the need for comprehensive news production management, ensuring that all three portals are fed with timely and accurate stories, in at least two languages.

KNR had a content management system but it was very much under strain, demanding a lot of technical resources simply to keep it going. It was time to look for something that would deliver productivity but could be implemented simply and quickly. “We were in a crisis,” said Kim Larsen, head of TV production at KNR. “Everything was too old – but we did not have the money to buy a traditional system.”

“We are very keen on open-source technology,” said Martin Chemnitz, new media manager at KNR. “Proprietary software locks you in, limiting you in what you can develop. Open source allows you to customise on a larger scale.”

KNR’s requirements were slightly paradoxical. On the one hand, they wanted a single system for radio, television and the web, avoiding silos and gaining benefits from working together. On the other hand, by the standards of many vendors this was a small system, because KNR is a small – albeit very efficient – organisation.

Kim and his team developed an action plan, which was discussed first with the operational team and then with the board. Armed with a set of requirements, they looked for what was possible. A Danish technology consultant suggested they look at nxtedition.

Ola Malmgren, creator of nxtedition, explained: “When we founded nxtedition, we brought together experienced broadcasters with advanced thinkers in software and IT. Our goal was to create a solution which was smart, innovative, productive and future-resistant.”

Central to the design concept was that the software should be open source, microservices-based for inherent virtualisation, and controlled through web services for simple customisation and development. On this platform, the company has built a complete set of tools for journalists, producers and engineers, covering every aspect of production management from origination to playout.

“It was hard to believe at first that it could do all this,” Larsen said. “But boy was it a good idea.”

“Their approach is radically different,” he continued. “They asked, ‘what do you want to do?’ and ‘how do you want to do it?’. They asked us to think differently, to see what could be automated.” They worked together to develop an easy way of working.

An important issue was the fact that, while the production management system was largely life-expired and in need of replacement, the web content management system (based on Drupal) was meeting KNR’s requirements and changing it would be a resource demand they could happily live without.

“When you create a website based on a content management system, the software sets goals,” according to Chemnitz. “We quickly found that we are able to customise nxtedition to get what we want, and Drupal and nxtedition readily scale alongside each other.”

Larsen added: “It made a paradigm shift in producing television and radio, one which has come from the web. nxtedition puts the user at the centre and allows each member of the team to work intuitively. When you have journalists and producers excitedly saying ‘hey – I have the opportunity to work the way I want to!’, that is not normal.

“Usually, in any multi-user system, you are stuck with compromises and workarounds – someone else decides how you should work,” he said. “When you can decide how to work, to set up your own workflows, then you are moving something culturally. That is what nxtedition gives us.”

They described it as a “process of discovery”, developing their own best ways of working. Immediately, though, there was a clear increase in productivity, not least because it is intuitive in the extreme. Comprehensive training for a journalist takes just two hours; a nxtpert (nxtedition expert) can learn everything about nxtedition in 24 hours.

One of the ways in which KNR developed their approach to the system is that they started with the philosophy that “everything is in nxt”. But they quickly realised that it did not need to work that way, if there was a better way of doing things. Craft editing is a great example: nxtedition makes it simple to hand files over to an external editor and take finished packages back in.

The single system approach means that, whatever sort of file you have in nxtedition, anyone can see and use it. So, television producers can look into the radio newsroom to see if there is content they can use; web editors can draw the best of the content into the online offering. “It’s another example of the cultural change,” Larsen reflected. “Now our approach is not ‘let’s make me look good’, it’s ‘let’s make everybody good’.”

In a country with few roads, where most population centres can only be reached by boat or plane, the capabilities for remote working are always a top priority. So, the coronavirus lockdown was simple to accommodate. “It was no problem at all to go over to working from home,” Larsen said. “We can run the studio from anywhere. We have got to the point where everything is down to connections – if you have broadband you can do anything.”

For KNR, nxtedition provides pre-production, planning and calendars, content ingest, media management, script writing and editing, graphics, prompting, live studio automation and social media. User interfaces are customised by the individual and by the task to keep things clean and simple.

Continuing collaboration between KNR and nxtedition means that the system continues to grow. “They are very good at handling the management of development,” Larsen concluded. “They understand the difference between customisation and development of the product base.

“And they helped us to understand that we should keep looking for ways to do things better, in a more productive way. In a well thought-out software system, you are not implementing something, you are continuing development.”

Member Speak – Never.no

Scott Davies 

CEO

Tell us about the company – when it was founded, by whom and with what objective

Never.no was founded in 1999 by Lar Laurizson, a Norwegian creative genius. The company was originally a technology solutions business, with a crack team of coders providing software for flows and managing data. Some of this would go into traditional development, such as website builds, but in essence the approach was about how to improve digital delivery in general.

The company gradually evolved into the broadcast sector, where the initial ideas behind what our content management platform, Bee-On, is today took shape - focusing on audience-engaged formats. Never.no created the very first automated music jukeboxes: people could pick up the phone and send an SMS to vote for a music track, which would influence the end result in real time. It was a precursor to where we are now in terms of developing an easy-to-use platform for data management, particularly social, and for publishing into or changing broadcast graphics in real time.

Fill us in with how the company has developed and grown to the present day

Skipping ahead, Never.no has certainly been through a series of changes over the years, to the point where the team have used their long-established experience and expertise in audience-engaged services to create the platform launched six years ago – originally called STORY and now Bee-On, following the brand change last year.

Our team have grown specifically off the back of a range of world-leading projects – manipulating real-time graphics with data in live broadcasting, and driving interactive jukeboxes and games well before mainstream social media interaction – which has enabled us to understand what audiences want and how the broadcast industry works. We’re proud of the latest development, Bee-On, which gives content providers the tools to manage rafts of social data – including live content, polling, trends, pictures, videos, and competitions – to moderate and publish into real-time graphics, across multiple platforms, via traditional broadcast or digital.

To date, our staff span the globe, managed from our base in Manchester, supported very closely by our development team in Oslo, Norway, and reaching to Sydney, Australia and North America. One of the most notable milestones is seeing established broadcasters such as Viacom, BBC, SBS, Sky, Al Jazeera and many others across Asia and the Middle East working with our platform – that’s a true seal of approval from the industry.

What is your secret sauce - why do your customers choose you over your competitors?

It’s quite a mix of ingredients; the tools and the people behind them are the main factors. The feedback we get from clients highlights first the second-to-none support, but also the robustness of the Bee-On platform, which is needed in such a demanding sector where there’s no room for downtime, not even a split second! The dedicated development team keep their fingers on the pulse and constantly update APIs to ensure connectivity with social media platforms, and frequently offer new tools such as Chrome extensions, Twitter Trends and more integrated messaging features. It’s important to offer an end-to-end service alongside a SaaS solution; our team have a mix of experience from broadcast, sports, brand marketing, and everything in between, so they understand what viewers want and can provide support from both a content and a technical background.

What have been your biggest successes and challenges in 2020?

We can safely say – “wow, what a year”! It’s been difficult for all, but we’ve adapted as an industry and seem to be going into 2021 with a positive outlook. Both viewing trends and production workflows have changed dramatically, so we’ve all had to be proactive and create new ways of working. As a cloud-based solution, we’ve enabled content providers and brands across multiple platforms to work remotely and create high-quality, captivating content.

It’s been a proud moment for us, helping traditional broadcasters and streamers bridge the gap between content and the audience stuck at home – supporting shows like ITV’s The Martin Lewis Money Show, developing an innovative front-end interactive interface for Telemundo’s US election coverage, and enhancing streamed content for Sky’s weekly Portrait Artist of the Week show on Facebook Live. We’ve also expanded our client base within sports, as Premier League and A-League football clubs have harnessed the power of fan engagement and produced match-day shows streamed on social, reaching fans globally beyond standard broadcasting and featuring cloud graphics, audience-generated content and real-time polls!

What are the company’s goals for 2021?

By December 2021 we expect to be well past the pandemic and working within a new, progressive and open-minded industry. As mentioned, we’ve had to adapt in so many new and innovative ways and meet rapidly evolving trends in media workflows, content creation and media consumption. We have exciting plans for our cloud-based content management platform Bee-On that will continue to support broadcasters’, digital content providers’ and brands’ needs to enhance audience engagement like never before. 2020 was the catalyst for the adoption of cloud-based solutions, so we expect to see an almost fully virtualised and connected industry this time next year. As a business, we’re looking to make strides in the US and consolidate our work with traditional broadcasters, plus build on the success of supporting content providers using digital platforms.

What trends should we look out for next year?

We’ll soon see audiences back in studios and fans in stadiums, but the last eight months will have shown producers how they can connect with them on a global scale. We expect to see audience-engaged content becoming more prevalent in live and pre-recorded programming across platforms – all powered by cloud-based solutions. Content will evolve to include complementary technologies such as Augmented Reality; add social and real-time data to that mix and it becomes part of the fabric of the show, rather than just a bolt-on, and you’ll see more interactive and immersive socially led content. Furthermore, 5G will enhance acquisition and delivery of content beyond the primary screen, improving the viewer experience more than ever.

As an IABM member, what services do you most value and why?

We took the decision to join the IABM in the wake of lockdown, with the cancellation of global industry-wide events; it was important for us to stay connected and ensure we had opportunities to network and collaborate. IABM has given us the platform to do exactly that, stepping up and driving worthwhile virtual events suitable for everyone! We’re looking forward to sponsoring December’s BaM Live!™ event and building our relationship with IABM and fellow members in 2021.