Varnish Software – Making streaming more sustainable: three effective methods for achieving more with less
Adrian Herrera, CMO, Varnish Software
Streaming might be our favorite pastime, but beneath the surface, it’s a colossal energy-guzzling process that’s taking a toll on our planet.
Today, the average consumer worldwide spends about 19 hours a week streaming video – but this can be much, much higher for some. And with a population of more than 742,200,000, Europeans could have streamed more than 735 billion hours – or 83 million years – of content in 2022 alone!
To put this into perspective, every hour of video streamed emits roughly 55g of CO2e. This would mean that Europeans' streaming habits account for roughly 40.4 million metric tons of CO2e in just one year – the equivalent of driving 210 billion km, given the average gas-powered car emits 192g of CO2e per km.
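The figures above can be reproduced with a quick back-of-the-envelope calculation. The constants are the ones quoted in the text; small differences from the article's numbers come from rounding:

```python
# Back-of-the-envelope check of the streaming-emissions figures above.
POPULATION = 742_200_000     # Europe, approximate
HOURS_PER_WEEK = 19          # average weekly streaming time per person
CO2E_PER_HOUR_G = 55         # grams of CO2e per hour streamed
CAR_CO2E_PER_KM_G = 192      # grams of CO2e per km, average gas-powered car

hours_per_year = POPULATION * HOURS_PER_WEEK * 52              # ~733 billion hours
co2e_tonnes = hours_per_year * CO2E_PER_HOUR_G / 1_000_000     # grams -> metric tons
km_equivalent = co2e_tonnes * 1_000_000 / CAR_CO2E_PER_KM_G    # tons -> km driven

print(f"{hours_per_year / 1e9:.0f} billion hours")        # 733 billion hours
print(f"{co2e_tonnes / 1e6:.1f} million tonnes CO2e")     # 40.3 million tonnes CO2e
print(f"{km_equivalent / 1e9:.0f} billion km driven")     # 210 billion km driven
```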
The truth is, every hour of video streamed pushes a vast amount of tech and hardware into overdrive – data centers, servers, networking gear – you name it. This means that, as our time spent streaming grows, so does our carbon footprint. What’s more, video files are becoming increasingly data-heavy, meaning the amount of energy – and additional hardware – demanded for streaming could grow sizably without corrective action.
While consumers want increasingly data-rich experiences, and won't tolerate subpar quality, they also want to support the environment. The industry is taking notice, with virtually every big conference or major event, like IBC, now setting its sights on addressing the streaming sustainability crisis at hand.
Winning “the streaming wars” is no longer just about content – it’s about sustainability and conservation too. To address this, CSPs, video service providers and streamers must implement more energy-efficient methods of streaming, without sacrificing viewing quality. Here are three ways to do so.
Reduce hardware needs with caching
With the 4K market growing at a CAGR of 21.7%, the technology has finally become commonplace, and many are already strategizing on how to support wider use cases of 8K and beyond. Yet streaming 4K alone requires a bitrate roughly five times that of HD.
With more robust resolutions and experiences come greater challenges, especially when providers must scale content delivery at a moment's notice to meet the needs of thousands, if not millions, of viewers tuning in – from different locations and types of devices.
Even as demand for richer experiences grows, consumer expectations of zero downtime and near-zero latency will remain the same. Therefore, many throw hardware at the prospect of delivering higher-quality content, piling on additional servers as a solution to rapidly escalating loads. This doesn't work long term.
More hardware requires more energy – equating not only to more carbon emissions, but also a larger power bill that will increase exponentially if inefficiencies aren’t tackled at the root.
Rather than spending more on unnecessary hardware, many organizations should first focus on optimizing their legacy equipment to get the most throughput possible out of every watt consumed. Caching is one way to do this.
Caching is essentially a form of digital recycling, which involves reusing previously fetched or computed data for future requests instead of sending the same requests back to an origin server miles away. This reduces unnecessary processing, energy usage, and strain on network resources.
Strategic approaches and technologies for content delivery can drastically enhance throughput and offer more sophisticated control over caching.
For instance, many organizations can utilize free, open-source or pre-packaged tools to specify an ideal time-to-live (TTL) for cached items, or to update and purge items rapidly as content is modified. Fine-tuning traffic routing and load balancing can also ensure content is always delivered from the best location possible.
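To illustrate the two mechanisms above – a time-to-live and an explicit purge – here is a minimal, hypothetical sketch of a cache. A real deployment would use a dedicated caching layer such as Varnish rather than a toy class like this:

```python
import time

class TTLCache:
    """Illustrative cache: items expire after a time-to-live and
    can be purged early when the origin content changes."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None           # miss: caller must fetch from origin
        value, expires = entry
        if time.monotonic() >= expires:
            del self._store[key]  # stale: treat as a miss
            return None
        return value              # hit: no round trip to the origin

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def purge(self, key):
        # Invalidate immediately, e.g. when the content is modified.
        self._store.pop(key, None)
```

Every hit served from `get` is a request that never travels back to the origin server, which is exactly where the energy and bandwidth savings come from.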
Migrate to the edge
As data travels longer distances, it consumes more energy and arrives with more latency. Therefore, content delivery at the edge is becoming more vital to optimizing power usage, performance and costs.
By utilizing the edge, data can be stored and processed closer to consumers, reducing the distance data travels across networks, while adding scalability to content delivery. Utilizing the edge properly via caching also reduces dependence on core data centers and other energy-consuming infrastructure.
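The routing side of this idea can be sketched in a few lines: given measured round-trip times from a viewer to a set of edge nodes, each request is sent to the closest one. The node names and RTT values below are made-up examples:

```python
def nearest_edge(rtts_ms):
    """Pick the edge node with the lowest measured round-trip time."""
    return min(rtts_ms, key=rtts_ms.get)

# Hypothetical RTT measurements from one viewer, in milliseconds.
edges = {"frankfurt": 12.4, "london": 28.1, "stockholm": 45.7}
print(nearest_edge(edges))  # frankfurt
```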
As 5G adoption increases, edge-based content delivery solutions will continue to evolve to meet the complex requirements of IoT devices and other applications. For example, initiatives like the EBU 5G-EMERGE project are combining satellite communications with advanced web caching and content delivery at the edge. Unique approaches like this can provide greater capacity, scalability and reach, and have the potential to be cleaner than current distribution methods; however, measurement would be needed to prove any benefits.
Use software to optimize
Maximizing throughput per single server is the key to delivering more with less.
Updating to the latest hardware can help reach this goal, but that is costly and may not perform as expected if it is matched with the wrong architecture. Moreover, replacing hardware may result in unnecessary e-waste from discarded equipment, and it is not a scalable approach.
Software-defined solutions are almost always worth exploring to optimize new and old systems for the best I/O possible. Software can also be implemented quickly with relatively little upfront investment, improving energy efficiency today and performance long term.
Every organization has a different combination of infrastructure – bare metal, containerized, virtualized or cloud environments – each requiring a different approach. Therefore, software stacks must be optimized and packaged for the exact needs of the infrastructure at hand, or the use cases involved, such as streaming video live or on-demand, web and API acceleration, or private CDN deployments.
For example, when organizations stream video, they must usually process large amounts of data that runs through the cloud, data centers and other multiprocessor systems. In these instances, a non-uniform memory access (NUMA) architecture lets each processor work primarily with its own local memory and I/O resources rather than contending for remote ones. This can provide many benefits, including reduced data replication needs, faster data flow with less latency, and a more scalable and responsive architecture overall.
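The same principle can be sketched at the process level: a delivery process can be pinned to the CPUs of a single NUMA node so its memory accesses stay local. The node-to-CPU mapping below is an assumption (on a real host it comes from `/sys/devices/system/node` or `numactl --hardware`), and the affinity calls are Linux-only:

```python
import os

# Assumed CPU IDs for NUMA node 0 on a hypothetical two-node machine;
# on a real host, read the mapping from /sys/devices/system/node.
NODE0_CPUS = {0, 1, 2, 3}

def pin_to_node(node_cpus):
    """Restrict this process to one node's CPUs via sched_setaffinity.

    Intersecting with the currently allowed CPUs keeps this sketch safe
    on machines with fewer cores than the assumed mapping."""
    allowed = os.sched_getaffinity(0)        # 0 = the current process
    target = node_cpus & allowed or allowed  # fall back if no overlap
    os.sched_setaffinity(0, target)
    return os.sched_getaffinity(0)
```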
NUMA is just one example of a plethora of ways in which software can help extend the lifespan and efficiency of any hardware.
Making streaming greener
At this stage, it’s still difficult to fully quantify the environmental impacts of streaming. But, as digital media continues to evolve, it will become more important than ever to ensure the technology powering and delivering our favorite content is more sustainable.
The industry recognizes this and is already in action. Organizations like Greening of Streaming are already bringing global industry players of all sizes in media, government and technology together to develop common methods of measuring and addressing the impact of streaming on the environment.
Streaming businesses have a clear-cut mission at hand: review their existing systems and design a blueprint to accomplish greater efficiency with fewer resources. The future of streaming relies on their ability to turn the tide.