CDNs Were a Prototype for Edge Compute

September 10, 2018

CDNs - An Early Evolution of Edge Compute

The financial and business logic of moving compute to the edge is a powerful incentive. Gartner predicts that by 2022, 50% of enterprise-generated data will be created and processed outside a traditional data center or centralized cloud; today, that figure is only around 10%. Santhosh Rao, principal research analyst at Gartner, noted, “Organizations that have embarked on a digital business journey have realized that a more decentralized approach is required to address digital business infrastructure requirements.”

CDNs have been a cache-close-by concept from their inception; rather than delivering millions of copies of content to users from a central location, they have always cached popular content in areas where it will likely be consumed. It could be said that when Akamai was founded in 1998 by then MIT graduate student Daniel M. Lewin and MIT applied math professor Tom Leighton, the duo invented not only the first content delivery network, but also a prototype for edge compute. By developing the algorithms necessary for intelligently routing and replicating content over an extended network of distributed servers, they placed content and caching servers closer to end users than ever before. This lessened network congestion and increased the speed of static content delivery.
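The routing idea at the heart of that early work was consistent hashing, which Lewin, Leighton, and their colleagues published shortly before founding Akamai: map both content keys and cache servers onto the same hash ring so that adding or removing a server only remaps a small fraction of keys. The sketch below illustrates the concept in Python; it is a toy, not Akamai's production algorithm.

```python
import hashlib
from bisect import bisect

def _hash(key: str) -> int:
    # Any stable, well-spread hash works; MD5 is used here for illustration.
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class ConsistentHashRing:
    """Map content keys to cache servers so that adding or removing
    a server only remaps a small fraction of keys."""

    def __init__(self, servers, replicas=100):
        self.replicas = replicas      # virtual nodes per server, for balance
        self.ring = []                # sorted list of (point, server)
        for s in servers:
            self.add(s)

    def add(self, server):
        for i in range(self.replicas):
            self.ring.append((_hash(f"{server}#{i}"), server))
        self.ring.sort()

    def lookup(self, key):
        # The key is served by the first ring point clockwise from its hash.
        points = [p for p, _ in self.ring]
        idx = bisect(points, _hash(key)) % len(self.ring)
        return self.ring[idx][1]
```

Adding a fourth server to a three-server ring remaps only roughly a quarter of the keys, which is what lets a CDN grow its footprint without invalidating most of its caches.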

Legacy CDNs: Talking the Talk, But Not Walking the Walk

In our recent article “What and Where is the Edge” we discussed the fact that there is no “one edge” to the Internet. While CDNs deliver a more distributed infrastructure, they do not hold ownership of the Edge: CDNs provide an edge, but they do not provide the one and only Edge.

Further, as we discussed in our article “CDNs Will not Remain Relevant in a Software World”, the monolithic and inflexible nature of legacy CDNs means they cannot provide the software-centric tooling engineers need to deploy and manage edge-centric software.

Increasingly, legacy CDNs are rebranding themselves as edge platforms. Fastly now calls itself a “powerful edge cloud platform”, though whether it has actually changed what it is doing at a fundamental level is questionable. Even Akamai has rebranded, describing itself as an “intelligent edge platform”; its infrastructure, however, has not fundamentally changed either. Both are essentially still legacy CDNs operating in the same way as before. As Tom Nolle, President of CIMI Corp., pointed out in a recent blog post on CDNs and edge compute, “We have plenty of announcements about how Vendor X or Vendor Y are moving closer to the edge, but not very many are specific about what they plan on doing there or how they plan on justifying their deployment.”

The Emergence of An Edge Platform

We don’t think the old CDNs are true edge platforms; rebranding alone cannot turn a CDN into one. It is more accurate to say they were an early evolution of Edge Compute. To support the emerging edge compute industry, an edge platform should be able to:

  1. Service and perform compute activity at a range of network locations (or edges): from behind the firewall, through the cloud, within the telcos, and even into target networks;
  2. Provide flexibility in choice of software and framework to run at the edge whether proprietary or custom developed; and
  3. Provide engineers with a development lifecycle and operational framework which fits with their Agile practices, CI/CD workflows and DevOps processes.

Using an agile platform like Section offers you the flexibility to customize your network, choosing the number and precise locations of PoPs to suit your application and your needs. Our global PoPs are built in partnership with a range of the world’s largest hosting providers, and our users have the advantage of being able to select their own edge network topology. Section also offers the capability to deploy our Edge Compute Platform on-premises or with smaller hosting providers. This gives you genuine flexibility: you can leverage our global network reach, and the edge proximity to end users that comes with it, alongside a secondary edge nearer the origin for maximum offload and scalability.
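At its simplest, choosing a topology means placing PoPs near your traffic. As a toy illustration of the idea (not Section's actual placement logic; the PoP names and coordinates are invented), the sketch below assigns a client to the nearest PoP by great-circle distance:

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical PoP catalogue: name -> (latitude, longitude).
POPS = {
    "nyc": (40.7, -74.0),
    "london": (51.5, -0.1),
    "singapore": (1.35, 103.8),
    "sydney": (-33.9, 151.2),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def nearest_pop(client, pops=POPS):
    """Pick the PoP geographically closest to a client's (lat, lon)."""
    return min(pops, key=lambda name: haversine_km(client, pops[name]))
```

In practice, topology selection would weigh latency measurements, traffic volume, and cost rather than raw distance, but the shape of the decision is the same.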

The modular nature of the Section edge platform allows engineers to drop in their choice of compute activity, including reverse proxy and serverless functions.
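As a hypothetical illustration of what a drop-in serverless module might look like (the handler signature and `fetch_origin` helper here are invented for the sketch, not Section's API), an edge function could rewrite responses or short-circuit requests without touching the origin:

```python
def fetch_origin(request):
    # Stand-in for the platform's origin fetch; returns a canned response.
    return {"status": 200, "headers": {}, "body": f"origin:{request['path']}"}

def handler(request):
    """Hypothetical edge function: serve a maintenance page directly
    from the edge, and tag everything else that passes through."""
    if request["path"] == "/maintenance":
        return {"status": 503, "headers": {}, "body": "Down for maintenance"}
    response = fetch_origin(request)          # forward to origin
    response["headers"]["X-Served-By"] = "edge"
    return response
```

The point of a modular platform is that this function, a reverse proxy, or any other proxy-shaped workload can occupy the same slot in the request path.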

As importantly, Section’s git-backed application development workflow for the edge lets developers and engineers test code and make changes in their local dev, test, and staging environments before deploying to production. Plus, real-time, searchable diagnostics (logs and metrics) mean Ops Engineering teams have full visibility of edge activity regardless of location.
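To make the “searchable diagnostics” idea concrete, here is a minimal sketch of querying structured edge logs by field; the record schema is hypothetical, and a real platform would back this with a log pipeline rather than an in-memory list:

```python
# Hypothetical structured log records collected from every edge location.
LOGS = [
    {"ts": "2018-09-10T12:00:01Z", "pop": "nyc", "status": 200, "path": "/"},
    {"ts": "2018-09-10T12:00:02Z", "pop": "london", "status": 502, "path": "/api"},
    {"ts": "2018-09-10T12:00:03Z", "pop": "nyc", "status": 504, "path": "/api"},
]

def search(logs, **filters):
    """Return log records matching every key=value filter."""
    return [r for r in logs if all(r.get(k) == v for k, v in filters.items())]

def errors_by_pop(logs):
    """Count 5xx responses per edge location."""
    counts = {}
    for r in logs:
        if r["status"] >= 500:
            counts[r["pop"]] = counts.get(r["pop"], 0) + 1
    return counts
```

Because every record carries its PoP, the same query answers “what is failing?” and “where is it failing?” in one pass.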

It is these kinds of innovations that are truly making edge native and edge enhanced applications feasible.