Cache Hit Ratio: The Key Metric for Happier Users and Lower Expenses
When it comes to streaming, two factors can make or break a business: cost efficiency and seamless user experiences. The two are closely tied: End users expect quick page load times, low latency, and reliable, buffer-free playback. Businesses are charged not only with maintaining that level of customer satisfaction, but also with managing compute costs and the egress fees charged by cloud providers.
One of the best ways to achieve success with both of those factors is by having a high cache hit ratio, which measures how effective a cache is at fulfilling content requests. It’s calculated by dividing the number of cache hits by the total number of cache hits and misses.
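The calculation is simple enough to sketch in a few lines of Python (an illustrative snippet; the function name is ours):

```python
def cache_hit_ratio(hits: int, misses: int) -> float:
    """Cache hit ratio = hits / (hits + misses)."""
    total = hits + misses
    if total == 0:
        raise ValueError("no requests recorded")
    return hits / total

# Example: 970 hits and 30 misses out of 1,000 requests
print(f"{cache_hit_ratio(970, 30):.1%}")  # 97.0%
```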
The higher the ratio, the better. Achieving a high ratio depends on multiple factors, including content delivery networks (CDNs), caching systems, cache memory, access patterns, and more.
The costly challenge of hyperscalers
Many streaming services rely on centralized cloud infrastructures, or hyperscalers, to store and deliver content. And while these systems are great for compute and storage, egress costs can become a financial drain, especially when an ineffective cache leads to latency, longer load times, and lower cache hit ratios.
Hyperscalers can lead to inefficiencies because they typically store cached content in a limited number of remote data centers. When streaming services transfer large volumes of data, repeatedly retrieving it from the origin becomes costly and drives cache hit ratios down.
For example, the transfer of 1 petabyte of data from a hyperscaler to the internet can cost tens of thousands of dollars, and those costs only grow as the number of users increases. As a result, profit margins decline and providers have little choice but to increase subscription costs. And, if users are already frustrated by buffering and latency issues, they aren’t likely to react well to having to pay more.
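The arithmetic behind that figure is worth making concrete. As a rough illustration only (the per-gigabyte rate below is an assumed blended figure; actual pricing varies by provider, region, and volume tier):

```python
# Illustrative egress cost estimate -- the rate is an assumption,
# not any specific provider's published pricing.
RATE_PER_GB = 0.05      # assumed blended egress rate, USD per GB
GB_PER_PB = 1_000_000   # decimal petabyte

egress_cost = 1 * GB_PER_PB * RATE_PER_GB
print(f"1 PB of egress ~= ${egress_cost:,.0f}")  # ~$50,000
```

Every cache miss that forces a fetch from the origin adds to this bill, which is why even a few percentage points of cache hit ratio matter at petabyte scale.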
Edge delivery is the best defense against cache misses
One of the most effective ways to enhance streaming quality without sacrificing cost or performance metrics is to deliver content from an edge location: a data center, often located much closer to end users, that serves cached content with far lower latency than retrieving it from an origin server.
Because it already has copies of the requested content saved, edge delivery — powered by platforms like Akamai Connected Cloud — optimizes cache hit ratios by reducing requests to the origin server, minimizing bandwidth needs, and cutting egress costs. This approach not only saves money, but also improves the user experience with faster page loads and buffer-free streams.
CDNs are an important ingredient in the edge recipe
An edge-based caching strategy can be further enhanced by the addition of a content delivery network (CDN) such as Akamai CDN, which operates on a vast network of edge servers. But it’s about more than just cache control. When combined with Akamai Connected Cloud, this type of infrastructure can also place compute, storage, and database services closer to end users.
The key to success, however, is to not overextend the use of a CDN. Some companies look toward multi-CDN architectures to speed up streaming capabilities, but this type of framework can actually have the opposite effect. If not implemented properly, using multiple CDNs can further decrease the cache hit ratio. Multiply egress costs by the number of CDNs in use, and the expense quickly grows.
Rather than spend on technical expertise to optimize a complicated streaming media workflow, providers can take advantage of a distributed cloud computing platform to:
- Serve more users without scaling up infrastructure
- Slash latency for superior playback quality and website performance
- Create standout experiences instead of worrying about technical roadblocks
Laminar unlocks a 99.7% cache hit ratio
Laminar is one company with real-world experience of low cache hit ratios and sky-high egress costs. With media companies worldwide relying on its no-code platform to power seamless streaming across devices, the company sees that rarely accessed, long-tail content often doesn’t make it to edge servers.
The result is a flood of cache misses, which the company has coined a “cache-22” loop. Here’s how it looks:
1. Low cache hit ratios cause bad streaming experiences (buffering and slow delivery).
2. Frustrated users leave, making it harder for media companies to grow their audiences.
3. Without a larger audience, improving cache hit ratios — and revenues — becomes even harder.
To break the cache-22 loop, Laminar partnered with Akamai to optimize its caching strategy. By analyzing logs and implementing targeted adjustments, it achieved a staggering 99.7% cache hit ratio. This means that its customers can now deliver lightning-fast, buffer-free streams and confidently scale their audiences by wowing them with outstanding experiences.
A three-step strategy for success
Laminar found success with a three-pronged strategy:
1. Fixing prefetching: Fine-tuning its prefetching settings by correcting mismatched segment naming patterns ensured that video segments were ready at the edge when users needed them, reducing delivery times.
2. Optimizing content parameters: Recalibrating DASH segment sizes eliminated unnecessary origin requests, which improved efficiency and reduced latency.
3. Ensuring global delivery: Adjusting client configurations for consistent performance worldwide allowed service for a global audience.
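To make the first step concrete, here is a hypothetical sketch of validating segment filenames against the naming pattern a prefetch rule expects. The pattern and function names are ours for illustration, not Laminar’s or Akamai’s actual configuration; the point is that segments whose names don’t match the rule silently never get prefetched:

```python
import re

# Hypothetical prefetch rule expecting "<stream>_<bitrate>k_<index>.m4s".
# Segments that don't match this pattern would never be prefetched.
SEGMENT_PATTERN = re.compile(r"^[\w-]+_\d+k_\d{5}\.m4s$")

def matches_prefetch_rule(filename: str) -> bool:
    """Return True if the segment name matches the prefetch rule."""
    return SEGMENT_PATTERN.fullmatch(filename) is not None

print(matches_prefetch_rule("show_3000k_00042.m4s"))  # True
print(matches_prefetch_rule("show-3000k-42.m4s"))     # False (mismatched naming)
```

Auditing segment names against the prefetch configuration in this way is one simple check that can surface the kind of mismatch Laminar corrected.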
Akamai Cloud Wrapper further improves cache hit ratios
Along with a smart caching strategy, Laminar reduced egress costs by implementing Akamai Cloud Wrapper, a custom caching layer that seamlessly connects public cloud infrastructure with Akamai Connected Cloud.
Cloud Wrapper enables high cache hit ratios, especially for long-tail content, through several key mechanisms:
- Long-tail content optimization: Cloud Wrapper is particularly effective for long-tail content, which typically consists of a small number of requests for content with a relatively long life. It maintains cache even for infrequently accessed content, avoiding cache misses for these objects.
- Centralized caching tier: Cloud Wrapper provides a dedicated caching footprint for each Akamai customer, acting as an additional multicloud and multi-CDN caching tier. This centralized layer caches objects delivered by any CDN, effectively enabling cache sharing across CDNs.
- Intelligent caching decisions: Cloud Wrapper uses consistent hashing to ensure specific video files are always fetched from the same regions, and it considers headers and query strings in caching decisions. This approach maximizes offload and reduces cloud resource consumption.
- Adaptive replication for dynamic content: As content popularity increases, Cloud Wrapper adapts by replicating objects more frequently across distinct, logically grouped regions. This ensures high availability and improves cache hit ratios, even as content trends evolve.
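The consistent-hashing idea mentioned above can be sketched in a few lines of Python. This is our minimal illustration of the general technique (a hash ring with virtual nodes), not Cloud Wrapper’s actual implementation; the region names and class are hypothetical:

```python
import hashlib
from bisect import bisect

def _hash(key: str) -> int:
    """Stable integer hash for ring placement."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class HashRing:
    """Minimal consistent-hash ring: each region gets many virtual
    nodes, so a given object key always maps to the same region."""

    def __init__(self, regions, vnodes=100):
        self.ring = sorted(
            (_hash(f"{region}#{i}"), region)
            for region in regions
            for i in range(vnodes)
        )
        self.keys = [h for h, _ in self.ring]

    def region_for(self, object_key: str) -> str:
        # Walk clockwise to the first virtual node at or past the key's hash.
        idx = bisect(self.keys, _hash(object_key)) % len(self.ring)
        return self.ring[idx][1]

ring = HashRing(["us-east", "eu-west", "ap-south"])
# The same video segment always resolves to the same region:
assert ring.region_for("video123/seg42.m4s") == ring.region_for("video123/seg42.m4s")
```

Because lookups are deterministic, repeat requests for the same file land on the same cache, which is what keeps the object warm and the hit ratio high.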
By implementing these strategies, Akamai Cloud Wrapper significantly improves cache hit ratios, reduces origin requests, and lowers cloud egress costs, particularly benefiting long-tail content delivery.
As Akamai Enterprise Architect Michal Krobicki observes, “Laminar’s use of Akamai CDN and Cloud Wrapper has led to a high cache hit ratio, and their implementation is excellent. Akamai Professional Services can engage customers, helping to tune the performance. Having reviewed Laminar’s configurations, they are in tip-top shape.”
Get on the sustainable path to streaming success
Laminar used Akamai’s tools and expertise to crack the code on continuous improvement. The company’s approach highlights the importance of ongoing tuning and monitoring to maintain top-tier performance.
Other streaming providers have achieved similar results. For instance, managed media service provider G&L slashed egress costs by up to 90% with Akamai.
Akamai Professional Services (including performance-tuning assessments) and advanced observability tools like TrafficPeak can optimize cache hit ratios and reduce costs for companies of all sizes.
Learn more
Redefine streaming with optimized cache hit ratios and smarter cost management by learning more about:
- The Laminar–Akamai partnership
- Laminar’s experience of increasing its cache hit ratio
- How G&L dramatically reduced egress costs with Akamai
- Efficient caching with Akamai Cloud Wrapper and Object Storage
- Analyzing large volumes of CDN logs in real time with Akamai TrafficPeak