Highlights from the 2021 State of the Edge Report

March 18, 2021

In this blog, we highlight some of the key findings from the 2021 State of the Edge Report, produced by the State of the Edge project within LF Edge. The only collaborative report of its kind in the industry, this year’s edition is organized around four areas of innovation: critical infrastructure, hardware, networks and networking, and software. After a quick overview of key findings, we’ll take a closer look at the Software at the Edge section.

Overview: Key Findings

“The edge, with all of its complexities has become a fast-moving, forceful and demanding industry in its own right.” - Matt Trifiro, CMO, Vapor IO and Jacob Smith, VP Strategy and Marketing, Equinix

Some of the key findings in this year’s highly anticipated report include:

  • Despite COVID, and “in some cases driven by it”, the deployment of new edge infrastructure and applications continued throughout 2020. In fact, seven out of ten areas saw increased forecasts over the last year of lockdown.
  • Between 2019 and 2028, LF Edge predicts that some $800 billion USD will be spent on new and replacement IT server equipment and edge computing facilities. Tolaga Research, which led the market forecasting for the report, says this colossal investment in edge infrastructure will be necessary to support continually growing demand from edge devices and infrastructure.
  • Accordingly, infrastructure edge deployments will drive a hefty increase in the global IT power footprint over the next several years, conservatively forecast to grow from 1 GW in 2019 to over 40 GW by 2028. Thinking about this from a sustainability perspective now is critical.
  • Rather than an Internet of Things, we should be thinking about an “Internet of Systems”, in which devices serving different vertical applications within different systems communicate directly with each other to exchange knowledge. This needs to be achieved “autonomously and securely with no single point of failure”.
  • Traditional security policies are typically put into place using vendor-specific point solutions. However, this is no longer viable at the edge, where deployments must scale far beyond what point solutions can support and where each additional solution introduces potential vulnerabilities.
  • Next-gen Software-Defined Wide Area Networking (SD-WAN) tooling and the Secure Access Service Edge (SASE) framework will amp up, bringing security, resiliency and session-awareness to enterprise connectivity. SASE has a cloud native design and is capable of integrating networking and security capabilities into a single architecture through management of the connections between individual endpoints and service edge nodes.
  • Hybrid IT, “the edge cloud construct”, is increasingly viewed as an essential enabler for the “Fourth Industrial Revolution”. This will involve the increased use of IoT, the rise of the global sharing economy and the growth of zero marginal cost manufacturing to deliver “unprecedented communication-driven opportunities with massive economies of scale.”

Software at the Edge

Software is central to edge computing. It enables application delivery, manages edge hardware, and determines how workloads move around networks.

In some regards, developing and running software at the edge is comparable to other development experiences, but perhaps the greatest difference is the sheer diversity of edge workloads and environments to program for and manage.

According to technology journalist Simon Bisson, “The ecosystem is building a new stack to run at the edge of our networks, taking lessons from the hyperscale cloud, from IoT, from metro data centers, and from content delivery networks and the web, mashing them all together and building something new to suit new hardware, new networks, and a new generation of applications.”

Five key trends outlined in the report that developers working at the edge should take note of are as follows:

1. Open source is driving innovation at the Edge

We are seeing a huge wave of innovation in software development, much of it open source. This is enabling visions of a near-term future in which “hybrid edge clouds will enable API-first microservices with integrated services for authentication, authorization and identity management regardless of device type, operating system and network.” This type of integration across different software systems will enable greater innovation through the development of an open ecosystem, which can deliver scalability, extensibility and interoperability.

2. Code needs to be more portable and scalable

State of the Edge suggests that the “one message” for developers moving to the edge is to explore ways of making code more portable and scalable by:

  • Recognizing that code at the edge needs to run on a diverse mix of hardware in a diverse mix of locations, ranging from servers in remote field locations to microcontrollers in devices on customer premises.
  • Being aware that code also needs to support workloads migrating from edge node to edge node to the hyperscale cloud and back again.
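As a rough sketch of what hardware-aware portability can look like in practice, the snippet below selects runtime settings based on the architecture the code lands on. The profile names and values here are hypothetical, purely for illustration, and not taken from the report:

```python
import platform

# Hypothetical per-architecture profiles -- illustrative values only.
# A real deployment would load these from configuration.
PROFILES = {
    "x86_64":  {"batch_size": 64, "use_accelerator": True},   # edge server
    "aarch64": {"batch_size": 8,  "use_accelerator": False},  # ARM gateway
}

# Most conservative settings, used when the hardware is unrecognized,
# so the same artifact can run anywhere from a device to the cloud.
FALLBACK = {"batch_size": 1, "use_accelerator": False}

def select_profile(machine: str = "") -> dict:
    """Pick runtime settings for the hardware the workload lands on."""
    machine = machine or platform.machine()
    return PROFILES.get(machine, FALLBACK)
```

Because nothing is hard-coded to one piece of hardware, the same artifact can follow a workload as it migrates from edge node to edge node to the hyperscale cloud and back.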

3. Changing patterns are being seen in how users manage systems and software

Management of systems and software at the edge is far from straightforward. State of the Edge breaks the edge stack down into three different layers, each of which requires a different blend of engineering skills:

  • Systems layer - firmware, operating systems and hypervisors: the technologies necessary to work directly with edge hardware.
  • Implementation and management layer - tooling that supports modern applications, such as Kubernetes and OpenStack.
  • Deployment and operation layer - the top layer, which enables effective management of distributed applications at scale using methodologies like GitOps and CI/CD.

All three layers depend on observability, which needs to offer information and insights tailored to different stakeholders.
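To make the top, GitOps-style layer a little more concrete, here is a minimal, hypothetical sketch of the reconciliation loop at its heart: desired state (in a real setup, files in a Git repository) is continually compared against what each edge site is actually running, and the difference becomes a list of actions. The dictionaries and action strings are illustrative, not from the report:

```python
def reconcile(desired: dict, actual: dict) -> list:
    """Compute the actions that converge `actual` onto `desired`.

    Both arguments map application name -> deployed version; in a
    real GitOps setup `desired` would be read from version control
    and the loop would run continuously on every edge site.
    """
    actions = []
    for app, version in desired.items():
        if app not in actual:
            actions.append(f"deploy {app}@{version}")
        elif actual[app] != version:
            actions.append(f"upgrade {app} {actual[app]} -> {version}")
    for app in actual:
        if app not in desired:
            actions.append(f"remove {app}")
    return actions

# e.g. reconcile({"sensor-agent": "1.2"},
#                {"sensor-agent": "1.1", "legacy": "0.9"})
# yields ["upgrade sensor-agent 1.1 -> 1.2", "remove legacy"]
```

The appeal of this pattern at the edge is that operators never push changes to thousands of sites by hand; each site pulls its desired state and converges on it, which is what makes management at scale tractable.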

“In order to accelerate edge computing adoption, edge platforms need to remove the burdens for developers and operations engineers when it comes to managing the complexities associated with infrastructure provisioning, workload orchestration, traffic routing, scaling and monitoring, all while minimizing impact on application design. This is the critical role that edge platforms play in the future of the Internet.” - Stewart McGrath, CEO and Co-Founder, Section

4. The rise of cloud native platforms at the Edge of the network

Another trend State of the Edge highlights is the development of “easy to install, easy to manage cloud-native platform(s) at the edge of the network.” Vendors are bundling software elements necessary for edge deployment into a single platform, the report notes. The two main objectives are: (i) to lower risk for the end user, and (ii) to enable the building and delivery of packaged software using a familiar set of tools and methodologies for the developer.

The critical advantage for end users and service providers utilizing this kind of platform is the removal of the complexities involved in deploying and managing edge hardware. Instead, customers can work directly with the cloud native platform to support their specific edge compute requirements.

5. Running workloads at the Edge requires complex decision-making

Running real-time workloads across distinct and disparate edge infrastructure introduces many complexities for developers and operators. Questions they must continually ask include:

  • Which workloads should run where?
  • How should failovers and geo-redundancy be handled?
  • How can you maintain continuity and service guarantees across devices in motion, e.g. a drone or autonomous vehicle?

Latency-critical workloads such as cloud gaming, real-time IoT analysis, VR, and autonomous vehicles are just a few of the use cases driving the need for edge computing. Putting compute as close to the end user as possible in these cases “ensures sufficiently agile responsiveness and reduces the risk of degrading user experience.”

Orchestration solutions are emerging to tackle the complex scheduling challenges involved at the edge. A custom scheduler for edge computing needs to “take into account increasingly sophisticated levels of edge criteria for workload placement, automating decisions in real-time, abstracting away the complexity from developers and operators.”
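As a rough illustration of what such automated placement decisions involve, a scheduler might first filter nodes on hard constraints (latency, available capacity) and then rank the survivors. The node attributes and keys below are hypothetical, not taken from any particular orchestration product:

```python
def place_workload(nodes, max_latency_ms, cpu_needed):
    """Pick the lowest-latency node meeting latency and CPU constraints.

    `nodes` is a list of dicts with the hypothetical keys "name",
    "latency_ms" and "free_cpu". Returns the chosen node's name,
    or None when no edge node qualifies -- signalling a fallback
    to a regional or hyperscale cloud.
    """
    # Filter step: drop nodes that violate a hard constraint.
    candidates = [
        n for n in nodes
        if n["latency_ms"] <= max_latency_ms and n["free_cpu"] >= cpu_needed
    ]
    if not candidates:
        return None
    # Rank step: among feasible nodes, prefer the lowest latency.
    return min(candidates, key=lambda n: n["latency_ms"])["name"]
```

A production scheduler would weigh many more criteria (cost, data locality, regulatory constraints) and would re-evaluate continuously as devices move, but the filter-then-rank shape stays the same.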


To read the full report, visit the State of the Edge Report 2021.