Edge caching and Content Delivery Networks (CDNs) are a critical part of both over-the-top (OTT) and over-the-air (OTA) distribution. They make wide geographical delivery possible by staging data closer to the user device, while reducing strain on origin servers and long-haul networks. This gives thousands of users across a metro or regional area efficient access to content with reduced response time.

For this post, we broadly define edge caching as the intermediate storage of data from an initial data source (origin server) for consumption by devices further from the data source. Data is fetched by the edge cache from the origin server, and subsequent “nearby” devices fetch content from the edge cache. With a CDN, the edge cache serves one or more CDN endpoints, which then serve many devices. This can be for a single asset (such as a static webpage or operating system (OS) update) or for live streaming content. Typically, an edge is non-reciprocal (only serving content, not receiving) and located centrally to a large population, often covering an entire country (internationally) or several states (US).

Why edge caching and CDNs?

HTTP(S), the predominant protocol for video and data delivery over the internet, uses Transmission Control Protocol (TCP) as the transport layer, and Round Trip Time (RTT) is a fundamental factor in determining the maximum bitrate of a TCP connection. Because every transmitted TCP packet needs acknowledgement of receipt (ACK), the higher the RTT, the slower the ACKs arrive and the lower the maximum possible bitrate. Without CDNs and edge caching, the distance to the origin server is greater, increasing the RTT and limiting bitrate regardless of available bandwidth. Whether you are seeking the highest-quality video content or waiting for the latest OS update to download, speed matters.
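The relationship between RTT and throughput can be illustrated with a quick back-of-the-envelope calculation (a simplified model, not from this post: it assumes a fixed TCP window and ignores congestion control and loss):

```python
# Illustrative sketch: why RTT caps single-connection TCP throughput.
# A sender can have at most one window of unacknowledged data in flight,
# so throughput is bounded by window_size / RTT.

def max_tcp_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on TCP throughput, in megabits per second."""
    rtt_s = rtt_ms / 1000.0
    return (window_bytes * 8) / rtt_s / 1_000_000

# A common 64 KiB window over a 100 ms long-haul path vs. a 10 ms path
# to a nearby edge cache:
far = max_tcp_throughput_mbps(65_536, 100.0)
near = max_tcp_throughput_mbps(65_536, 10.0)
print(f"100 ms RTT: {far:.1f} Mbps | 10 ms RTT: {near:.1f} Mbps")
```

With the same window, cutting RTT from 100 ms to 10 ms raises the ceiling from roughly 5 Mbps to roughly 52 Mbps, which is exactly the lever that moving content closer to the device pulls.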

Twenty years ago, CDNs solved the problems caused by distance by enabling devices to connect closer on the network, with lower RTT and higher speed (up to the limits of reasonably available consumer bandwidth at the time), and by scaling the number of connections to absorb sudden increases in demand. Because they served the content to the device, CDNs also reduced network stress and bandwidth contention at the origin server. The use of CDNs has increased dramatically over the years. Today they support content distribution for large-scale events, the ubiquity of OTT offerings, and the expansive VOD libraries now available, albeit with little to no support for user interaction.

With gigabit home internet and 5G mobile connectivity increasingly available, today’s subscribers have more bandwidth and expect higher quality no matter when, where, or what device they are on. Traditional CDNs were not designed to take advantage of these lower latency, higher speed connections in the last mile. Further, traditional CDNs give content owners limited visibility into viewing metrics and session insights. (See the “For content owners” section for further detail.)

So, how do subscribers and content owners benefit from the lower latency (RTT), ultra high-bandwidth connection offered by 5G? How do content owners react when subscribers resume traveling, yet expect the same high quality they are getting via their home broadband? AWS Wavelength and public 5G connectivity enable, and even enhance, the mobile experience, providing higher speed, higher quality, and improved subscriber experiences such as interactive content.

Caching at the carrier edge

AWS Wavelength is AWS Region infrastructure extended directly into a mobile carrier network, providing access to the breadth and depth of AWS services. Embedded at carriers’ mobile network edge, it enables developers to create applications never before possible, optimizing content and data availability on the same high speed, low latency carrier network as the devices accessing it. More than one Wavelength Zone can be used in larger metropolitan areas, and each carrier has separate Wavelength Zones, which allows content and data to be geo and carrier targeted.

With a decentralized cache further at the edge, deep in carrier networks, subscribers get better user experiences and faster downloads compared to today’s CDN and edge solutions. The lower latency and higher available throughput of the carrier network, combined with in-network edge compute on AWS Wavelength, allows decreased startup times, higher quality, and enhanced features for video content, immersive and interactive experiences, and improved data downloads such as OS and application updates. Edge caching on AWS Wavelength supports scaling for high connection concurrency: from special events with 100,000+ viewers, to the latest software updates, telehealth, and connected automobiles. It further reduces strain on origin servers, peering, and internet links. Content owners gain improved visibility into content metrics with better insights into session data. And it’s not just mass viewing or distribution: operating a cache closer to the edge enables origination to occur from the edge.

Edge caching on AWS Wavelength uses Amazon Elastic Compute Cloud (Amazon EC2) instances for application, processing, and AI/ML; Amazon Elastic Block Store (Amazon EBS) and Amazon Simple Storage Service (Amazon S3) for local storage; and databases such as Amazon RDS. With the edge cache application and data solution deployed on Wavelength behind an Application Load Balancer, subscribers on the carriers’ network have direct access to the caching front end and data streaming service. Devices download content directly on the carrier network, without exiting to the internet or traversing into the AWS Region. Users interact with applications with negligible latency for an immersive experience. Deploying resources in an AWS Wavelength Zone is just as easy as deploying in any Availability Zone via the AWS Management Console, CLI, or SDK. Using Amazon Machine Images (AMI) and AWS CloudFormation, content owners can quickly and easily deploy edge cache servers to areas with anticipated high traffic.
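As a minimal sketch of that deployment step (all IDs and names below are placeholders, not values from this post): launching a cache node in a Wavelength Zone looks like any other EC2 launch, with the Wavelength-specific detail being a subnet in the Wavelength Zone and a Carrier IP on the network interface so devices on the 5G network can reach the instance.

```python
# Hypothetical sketch: build the parameters for boto3's
# ec2.run_instances() call to launch an edge cache node in a
# Wavelength Zone subnet. AMI and subnet IDs are placeholders.

def wavelength_run_instances_params(ami_id: str, subnet_id: str,
                                    instance_type: str = "t3.medium") -> dict:
    """Assemble RunInstances kwargs for a Wavelength Zone launch."""
    return {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "MinCount": 1,
        "MaxCount": 1,
        "NetworkInterfaces": [{
            "DeviceIndex": 0,
            "SubnetId": subnet_id,              # subnet created in the Wavelength Zone
            "AssociateCarrierIpAddress": True,  # expose a Carrier IP to devices on the 5G network
        }],
    }

params = wavelength_run_instances_params("ami-0123456789abcdef0",
                                         "subnet-0123456789abcdef0")

# With credentials configured, the launch itself would be:
#   import boto3
#   ec2 = boto3.client("ec2", region_name="us-east-1")
#   ec2.run_instances(**params)
```

The `AssociateCarrierIpAddress` flag is what distinguishes this from a standard Availability Zone launch; everything else follows the usual EC2 workflow.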

Consider a large-scale sporting event with 100,000+ fans traveling to a metropolitan area for several days, all eager to engage with their teams, players, and other fans. They are likely seeking event-specific content like multiple camera angles, on-demand videos, real-time player stats, which venue entrance has the shortest line, and even where to find the best game day food. Fans who don’t have tickets can still enjoy the game with access to ultra low latency streaming content and real-time stats. This allows all viewers, no matter where they are, to enjoy the thrill of victory and the agony of defeat.

In addition to live content, video on demand (VOD) content can be pre-staged where it’s needed most, offering subscribers fast, predictable performance and immersive engagement while eliminating internet network contention.

Caching at the carrier edge enables innovative user experiences for large-scale video events (sports, special programming, season premieres), OS and application updates, automotive and map updates, healthcare, and even broadcast TV and emergency alerting. And since the edge cache, running on AWS Wavelength, is on the carrier network, the content is available closest to the device with the fewest network hops.

Upstream data and hyper local access

An AWS Wavelength Zone in a mobile carrier’s network also enables upstream use cases. Here, content or data originates from a device on the 5G network, such as a video camera, origin server, or autonomous device, where low latency is critical to successful operation. For video origination, this is useful for hyper-local events that require mass distribution: from sharing different camera angles at a sporting event to disseminating important information about a local disaster to many people at once, without depending on long-haul connectivity. Autonomous devices such as vehicles and robots leverage AI/ML for identification and control, while remote healthcare, from simple telehealth appointments to remote surgery, becomes practical.

For content owners

Operating an edge cache on AWS Wavelength improves the subscriber and content owner experience. Combined with the lower RTT and higher bandwidth offered by 5G services, subscribers see improved quality (4K video and beyond), responsive application performance, and features such as near real-time interaction and AR/VR/XR. Content owners operating their own cache can collect real metrics from both the cache server and the device applications, such as device type, consumption, duration, user experience, chunk size, bitrate, and Adaptive Bit Rate (ABR) ladder switching. Content owners can iterate and monetize against these metrics, continually improving the subscriber experience and properly leveraging content and advertising inventory.
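The kind of roll-up described above can be sketched in a few lines (field names and values below are illustrative, not a real cache or player API):

```python
# Hypothetical sketch: aggregating per-session metrics collected from a
# self-operated edge cache and device applications into the insights
# content owners act on. Records and field names are invented examples.
from collections import Counter
from statistics import mean

sessions = [
    {"device": "ios",     "duration_s": 310,  "avg_bitrate_kbps": 8200,  "abr_switches": 2},
    {"device": "android", "duration_s": 95,   "avg_bitrate_kbps": 4500,  "abr_switches": 6},
    {"device": "ios",     "duration_s": 1800, "avg_bitrate_kbps": 15600, "abr_switches": 1},
]

def summarize(sessions: list) -> dict:
    """Roll raw session records up into device mix and quality metrics."""
    return {
        "devices": Counter(s["device"] for s in sessions),
        "mean_bitrate_kbps": mean(s["avg_bitrate_kbps"] for s in sessions),
        "mean_abr_switches": mean(s["abr_switches"] for s in sessions),
    }

print(summarize(sessions))
```

A high mean ABR-switch count on a given device class, for example, is the kind of signal a content owner can iterate against, because it points at sessions that never settled at a stable bitrate.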

Enhanced cost/value can be realized by limiting the ABR ladder for video content at the edge to only the highest quality rungs; because devices connect much closer, they are able to consume at a higher bitrate. Devices incapable of achieving that bitrate simply fall back to a traditional CDN. AWS services, or an AWS Partner solution, can programmatically choose the best cache based on location, capability, content availability, and capacity.
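That fallback decision can be sketched as a simple routing rule (this is illustrative logic, not an AWS API; the threshold value is an assumption):

```python
# Illustrative sketch of the fallback described above: route a device to
# the edge cache only when it can sustain the lowest rung kept in the
# trimmed, high-quality-only ABR ladder; otherwise use a traditional CDN.

EDGE_MIN_BITRATE_KBPS = 8000  # assumed lowest rung retained at the edge

def choose_endpoint(measured_kbps: float, edge_has_capacity: bool) -> str:
    """Return which delivery endpoint should serve this device."""
    if edge_has_capacity and measured_kbps >= EDGE_MIN_BITRATE_KBPS:
        return "edge-cache"
    return "cdn"

print(choose_endpoint(12_000, edge_has_capacity=True))   # fast device, edge available
print(choose_endpoint(3_000, edge_has_capacity=True))    # slow device falls back
print(choose_endpoint(12_000, edge_has_capacity=False))  # edge full, falls back
```

In practice the selector would also weigh location, content availability, and capacity, as the paragraph above notes; bitrate capability is just the simplest of those signals.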

As consumers look for differentiated offerings, content providers should consider using solutions such as edge caching on AWS Wavelength combined with 5G services to improve the customer experience, provide greater transparency into viewing and data transfer sessions, and reduce the cost of delivery. AWS Wavelength is generally available today in 10 US cities and three international cities, with more planned through 2021. AWS Partners are already delivering edge caching and other differentiating solutions. More information on edge caching can be found at AWS Telecom.