Container technology derives its name from the shipping industry. Rather than transporting goods as individual units of varying sizes, goods are packed into steel containers of a standardized size, allowing for easier storage and more seamless transportation. The container can be moved as a unit, which saves time and money.
In the tech world, containerization is a method of packaging an application so that the software and its dependencies (including libraries, binaries and configuration files) run together, isolated from other processes. Because they are packaged into a container, they can migrate as a unit, sidestepping differences between machines, such as underlying hardware or OS differences, that can cause incompatibilities and critical errors. Containers also enable smoother deployment of software to a server or network of servers.
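As a concrete illustration of this packaging idea, here is a minimal sketch of a Dockerfile for a hypothetical Python web service; the file names (`requirements.txt`, `app.py`) and base image are assumptions for the sake of the example, not a reference to any specific application:

```dockerfile
# Start from a standard, versioned base image so every environment
# builds on the same OS and runtime.
FROM python:3.12-slim

WORKDIR /app

# Bake the dependency list into the image so the container carries
# its libraries with it, independent of what the host has installed.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy in the application code and configuration files.
COPY . .

# The same command runs identically on a laptop, a server, or an edge node.
CMD ["python", "app.py"]
```

The resulting image is the "shipping container": it bundles the application, its libraries and its configuration into one unit that any Docker-compatible host can run.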
Container technology gathered momentum in the early 2000s with the introduction of FreeBSD Jails. However, it wasn't until the release of Docker in 2013, followed by orchestration systems such as Docker Swarm and Apache Mesos, that containerization really began to take hold within the wider industry. When Kubernetes was open-sourced in 2014, it quickly became the de facto standard for container orchestration because it made managing containers at scale significantly easier and more efficient.
Containers as a Service (CaaS)
In 2020, a survey conducted by the Cloud Native Computing Foundation (CNCF) found that 84% of respondents were running containers in production. With containers now widely considered a standard unit of deployment, many organizations have adopted Containers as a Service (CaaS) solutions to streamline their container orchestration (e.g. Kubernetes) operations.
What is CaaS? It is a service model that enables users to manage and scale containers, applications and clusters through container-based virtualization, an API, or a web portal interface. While different types of implementations are available, all CaaS offerings do essentially the same thing: they help organizations manage their containerized applications in a secure, scalable manner, whether in the cloud, on-prem or (as we'll go into more below) at the Edge.
Benefits of CaaS include:
- Faster time to market - due to a more streamlined development process (it takes just seconds to create, start, replicate or destroy a container).
- Standardization and portability - allowing you to run your containers across different clouds, avoiding vendor lock-in.
- Quicker deployments - CaaS abstracts the details of the underlying infrastructure.
- Cost-effectiveness - Costs are typically lower since users choose and pay for only the CaaS resources they need.
- Security - The isolation between containers provides a baseline of security by default. CaaS also makes it easier to manage your host system and roll out updates and security patches in a timely way.
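To ground the "seconds" claim above, here is a sketch of the basic container lifecycle using the Docker CLI. The image and container names are placeholders, and the commands assume a local Docker daemon is running:

```shell
# Pull a small public image (a one-time download).
docker pull nginx:alpine

# Create and start a container in the background; once the image is
# local, this typically completes in well under a second.
docker run -d --name web nginx:alpine

# Replicate: start a second, identical container from the same image.
docker run -d --name web-2 nginx:alpine

# Destroy: stop and remove both containers.
docker rm -f web web-2
```

Because the image is immutable and starting a container is little more than starting a process, scaling up and tearing down is fast and cheap compared to provisioning virtual machines.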
Containers at the Edge
Amongst other benefits, edge computing offers lower latency, reduced data backhaul, and higher availability for Internet-independent applications.
Containerization has quickly become an important part of edge computing. Since containers are a lightweight, portable virtualization solution, they have various logistical benefits within a distributed compute model.
Edge computing requires time- and cost-efficient deployment and management of an application to many locations and possibly many underlying compute types. Provided a suitable edge orchestration framework is present, containers have two key properties that make them well suited to the Edge:
- The portability of containers makes them suitable for edge applications, as applications can be deployed in a distributed fashion without significant rearchitecting of the underlying application.
- The abstraction containers provide makes them suitable for deployment to the non-homogeneous, federated compute which is often found at the Edge.
Many solution providers have already containerized (e.g. Dockerized) all or portions of their applications, making it easier to migrate them to the Edge.
The Role of Kubernetes in Edge Computing
One of the key strengths of Kubernetes is its ability to offer a common layer of abstraction atop physical resources (compute, storage, networking). This makes it a useful tool for developers and operations teams to deploy applications and services in a standardized fashion on disparate compute, including at the Edge. This abstraction is important in the cloud, but it is critical at the Edge because of the much greater diversity and volume of hardware and software resources. To effectively manage edge nodes, enterprises need a management layer that enables dynamic orchestration and scalability, which Kubernetes provides.
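As a hedged sketch of what that abstraction looks like in practice, the Deployment manifest below asks Kubernetes to keep three replicas of a containerized service running on nodes that an operator has labeled as edge hardware. The image name and the label key are conventions invented for this example, not part of any specific product:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: edge-web
  template:
    metadata:
      labels:
        app: edge-web
    spec:
      # Schedule only onto nodes labeled as edge hardware; the label
      # key/value pair is a convention chosen for this example.
      nodeSelector:
        node-role/edge: "true"
      containers:
        - name: web
          image: example.com/edge-web:1.0   # hypothetical image
          ports:
            - containerPort: 8080
```

The operations team declares the desired state (three replicas, edge nodes only), and Kubernetes reconciles the actual state against it regardless of what hardware sits underneath each node.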
At Section, we migrated to Kubernetes from our custom-built orchestration framework a few years ago. With Kubernetes as our backbone, a few of the benefits we have experienced first-hand include higher availability of services, fewer interruptions during upgrades and patches, and flexible tooling. Additionally, it gives us the ability to remain infrastructure agnostic, which translates to greater flexibility and reach with our global edge network options.
The endpoint orchestration which Kubernetes facilitates is an important part of edge computing. However, it must be augmented with three fundamental layers:
- Overall distributed edge orchestration
- Edge application development lifecycle and deployment framework
- Edge traffic management and routing
Why Edge as a Service?
Deploying and operating applications at the Edge requires a dynamic, cohesive system: one that manages distributed traffic routing and endpoint orchestration, provides a seamless and simple experience for developer and operations teams, and delivers a secure, reliable target deployment network. Expertise, planning, and continuous monitoring are non-negotiables when it comes to having containers spread across different regions.
Edge as a Service (EaaS) handles the complexities of managing multi-cloud/edge deployments, including CaaS. Familiar development workflows make deploying to the edge as easy as developers have become accustomed to with the cloud. One of the primary benefits is the orchestration of workloads to meet real-time traffic demands, maximizing efficiencies so you're not running all workloads in all locations at all times.
Use Case for CaaS at the Edge: SaaS Cloud-Native Deployment Model
Wallarm, a leading WAF solution provider, came to Section with the goal of being able to extend a cloud-native deployment model for their customers. Their traditional deployment model required customers to install the solution in their centralized infrastructure, which introduced operational burden and slowed their time to value.
With a containerized version of their application, Wallarm has been able to leverage Section's Edge as a Service to build out the Wallarm Cloud WAF solution. By building on top of Section, Wallarm doesn't have to think about managing the underlying infrastructure layer and is able to take advantage of Section's global edge network to deliver more value for their customers. On top of that, Wallarm customers can go live with a distributed WAF in minutes via a simple DNS change.
To break down the benefits of the distributed deployment model even further, let's consider a simple example. Imagine a SaaS solution (like Wallarm) that is traditionally deployed in a customer's AWS-East instance, while the application serves a significant portion of its traffic in the APAC region. The distance that traffic has to traverse over the network introduces latency for end users accessing that application. When you distribute the workload across a global footprint (via EaaS), you're instantly able to deploy closer to end users, wherever they may be, resulting in significant performance gains.
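A back-of-envelope calculation makes the latency gap concrete. This is a hedged illustration, not a benchmark: it assumes signals travel through optical fiber at roughly 200,000 km/s (about two-thirds the speed of light in a vacuum), uses an assumed ~15,000 km path between the US East Coast and APAC, and ignores routing, queuing, and processing delays, which add more in practice:

```python
FIBER_SPEED_KM_S = 200_000  # approximate signal speed in optical fiber

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation time over the given distance."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000

# A user in APAC reaching an origin ~15,000 km away in AWS-East pays
# ~150 ms per round trip before the server even starts working:
print(round_trip_ms(15_000))  # 150.0

# The same user reaching an edge node ~100 km away pays ~1 ms:
print(round_trip_ms(100))  # 1.0
```

Since a single page load can involve many round trips (TCP and TLS handshakes, then the request itself), shrinking the per-round-trip floor from ~150 ms to ~1 ms compounds into a dramatically faster experience.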
For SaaS providers to directly build this same edge footprint, it would require a large investment in initial infrastructure, not to mention the ongoing management of operations, which requires a huge amount of specialist expertise. Edge as a Service provides a turnkey solution allowing organizations to focus on their core products.