Remember in the early days of the Internet when you would walk into a company’s IT department and the team would proudly show you around the crown jewels, the server room?
After signing in and walking your shoes over the tacky mat to remove the dust, the team would regale you with great delight about the hours of UPS runtime available and the capacity of the cooling system, and you would nod in appreciation of how well color-coded and organized their ethernet cables were. The team had purchased a bunch of hardware, deployed it, and were suitably proud of their shiny boxes.
Fast forward 20 years or so and very few IT departments will show you around the server room, and fewer still keep that glass panel between the front foyer and the server room, showing off how high tech they are with a display of metal, plastic, and blinking lights. We don’t care how many servers you have, because now we live in a software-centric world.
As Light Reading’s recent article on Akamai’s latest quarterly conference call summarized nicely, Tom Leighton of Akamai essentially claims that Akamai dominates the edge because it has 300,000 servers. Clearly Akamai has bought a lot of boxes over the years, and no doubt all the cabling is nice and tidy, but owning your own servers is not what is going to power the Edge of the Internet. Software is where it’s at.
Should physical Edge networks be capable of driving deep into the Edge? Absolutely. Many infrastructure providers are building systems and capabilities to provide edge infrastructure that we can target with Edge software. AWS Outposts, Vapor.io, the 5G networks, CenturyLink’s edge play, and more mean there is plenty of deep Edge infrastructure to target before we even think about running the Edge on-premises. These environments will be open and accessible, compared with those racks of servers locked up by the legacy CDNs.
All of these locations are valid targets for Edge workloads, but not all of them need to be “lit up” for every application all of the time. It may make more sense to run a subset of locations for an application at any one time to deliver the optimal performance and cost outcome for that application. Take a simplified view of object caching, for example: do 300,000 servers mean 300,000 empty caches that need to be filled before they become useful?
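To make the cold-cache point concrete, here is a back-of-the-envelope sketch. The request rate, working-set size, and the assumption of evenly spread, uniform traffic are all illustrative numbers I have picked for the example, not figures from Akamai or any real network; the point is only that spreading a fixed request volume across more caches means each cache fills more slowly.

```python
# Back-of-the-envelope sketch: spreading a fixed request volume across more
# caches means each cache sees fewer requests, so empty caches take longer to
# warm up. All numbers below are illustrative assumptions, not real-world data.

REQUESTS_PER_SECOND = 1_000_000   # assumed global request rate for one application
DISTINCT_OBJECTS = 5_000_000      # assumed working set of cacheable objects

def warm_up_seconds(num_caches: int) -> float:
    """Rough time for each empty cache to see the full working set once,
    assuming requests are spread evenly and objects are requested uniformly."""
    requests_per_cache = REQUESTS_PER_SECOND / num_caches
    # Roughly one request per distinct object is needed to fill a cache.
    return DISTINCT_OBJECTS / requests_per_cache

for n in (100, 10_000, 300_000):
    print(f"{n:>7} caches -> ~{warm_up_seconds(n) / 3600:.1f} hours to warm up")
```

Under these toy assumptions, 100 caches warm up in minutes while 300,000 caches take weeks, which is why lighting up every location for every application is not automatically a win.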
The Cloud computing movement and the subsequent trend to cloud-native software mean we can move software from computer to computer and reroute requests, both predictively and in real time, far more easily than ever before. Physical presence is no longer a matter of owning a box in a particular location at all times.
We can now move software to the right locations for the right amount of time for each application. Intelligent edge network management and orchestration in a modern, cloud-native world means being able to address far more than 300,000 servers while, at the same time, not needing to keep all those servers on at all times for all customers.
We also need to think about the quality of the software we are running in any location. It does not really matter how many servers you have or where they are if those servers are running inferior software. Perhaps smarter software and fewer servers, or even fewer locations, can provide superior application performance, security, and scalability.
Innovation at the software layer will move us faster than owning more boxes. Security software, caching software, edge intelligence software, API gateways, Edge Auth, compute orchestration, traffic routing, and more are all moving so fast that an Edge platform needs to be open enough to move with them. Closed, proprietary software systems like Akamai’s are getting left behind. As a case in point, modern web application firewall (WAF) providers like Signal Sciences, Wallarm and others have shipped significant advances in WAF tech over the last couple of years. They have moved faster than the legacy CDNs can with their older, rules-based WAFs.
The promise of Edge is more performant application experiences at lower cost: get the right application components running in the right locations for the right users at the right time. Rather than a spray-and-pray approach, this means smart orchestration to place the best software in the best locations. That may be in the infrastructure edge, the telco edge, on premises, or some combination of all of these Edge layers.
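As a rough illustration of that “right location at the right time” idea, here is a minimal placement sketch. The location names, traffic shares, latencies, costs, and the scoring formula are all hypothetical, chosen purely to show the shape of the decision; a real orchestrator would use live telemetry and far richer models.

```python
# Minimal sketch: score candidate edge locations for one application by the
# latency benefit to the users they serve versus the cost of running there,
# and "light up" only the locations that pay for themselves right now.
# All names and numbers are hypothetical, purely for illustration.

from dataclasses import dataclass

@dataclass
class EdgeLocation:
    name: str
    user_share: float   # fraction of this app's traffic nearest to the site
    latency_ms: float   # expected round-trip time for those users
    hourly_cost: float  # assumed cost to keep the workload running there

ORIGIN_LATENCY_MS = 80.0  # assumed baseline latency when served from origin

def placement_score(loc: EdgeLocation) -> float:
    """Higher is better: reward latency saved for the users served, penalize cost."""
    latency_saved = max(0.0, ORIGIN_LATENCY_MS - loc.latency_ms)
    return loc.user_share * latency_saved - loc.hourly_cost

candidates = [
    EdgeLocation("metro-pop-east", 0.40, 15, 3.0),
    EdgeLocation("metro-pop-west", 0.35, 20, 3.0),
    EdgeLocation("telco-5g-site",  0.15, 8,  6.0),
    EdgeLocation("on-prem-rack",   0.10, 5,  9.0),
]

# Light up only locations whose benefit outweighs their cost, not all of them.
active = [loc.name for loc in candidates if placement_score(loc) > 0]
print("Locations worth running right now:", active)
```

In this toy example the on-premises rack is skipped because it serves too little of the traffic to justify its cost, while the metro and telco sites make the cut; rerun the scoring as traffic shifts and the active set shifts with it.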
I agree that Akamai changed the Internet with its early use case for Edge Compute: the CDN. They invested heavily in hardware, have been a major player on the Internet, and will continue to deliver results for large object delivery and file streaming. But applications are a software problem, and orchestrating a dynamic Edge is not about who owns the most boxes, the biggest UPS, or the most organized ethernet cables.