In this blog, we will look at the journey in application development from monolith to microservices to edge, the drivers behind it, and why using microservices at scale is the first step towards technical evolution at the edge.
From Monolith to Microservices…
Monoliths
In a monolithic application, a single unit of deployment handles multiple types of business activity, with the front end and back end tightly coupled. Monolithic applications are often a good way to get an application off the ground, but they can become unwieldy as businesses become more established and start to scale. IBM describes monoliths in these terms: “The application that grew over time, became unmanageable and difficult to understand, and has low cohesion and high coupling.”
Typically, businesses find that as their user base grows, more innovation, new features and more integrations are required. The monolithic approach then becomes a bottleneck to growth due to:
- An increasingly complex core code base
- Longer time to market
- A steep learning curve for new developers
- Large dependencies between components
- Longer deployment times
Because of these drawbacks, businesses are increasingly breaking monolithic technology stacks down into microservices.
Microservices
Taking a microservices-based approach to application design means that each core business capability can be deployed as its own separate unit, which performs only a single function. Working in this way gives engineering teams the flexibility to organize their code in step with the logic of the business, allowing separate components of the system to be developed and scaled at different times and by different teams.
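To make that concrete, here is a minimal sketch of what such a single-capability unit might look like: a hypothetical pricing service that exposes one HTTP endpoint and does nothing else. The service name, route and port are illustrative assumptions, not a prescription.

```go
// A minimal sketch of a single-capability microservice: a hypothetical
// "pricing" service whose only job is to return a price for a product ID.
// The route, port and response shape are illustrative assumptions.
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

type priceResponse struct {
	ProductID string  `json:"product_id"`
	Price     float64 `json:"price"`
}

func main() {
	http.HandleFunc("/price", func(w http.ResponseWriter, r *http.Request) {
		id := r.URL.Query().Get("product_id")
		if id == "" {
			http.Error(w, "product_id is required", http.StatusBadRequest)
			return
		}
		// A real service would look the price up in its own data store.
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(priceResponse{ProductID: id, Price: 9.99})
	})

	// The service owns its own process, so it can be built, deployed and
	// scaled independently of the rest of the system.
	log.Println("pricing service listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Because the whole capability sits behind a small, explicit interface, another team can rework or scale this service without touching the rest of the application.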
There are numerous drivers of the shift from monolithic applications to microservices-based applications. These include:
- An industry-wide shift from on-premises infrastructure to cloud/edge.
- The evolution of VMs to containers.
- Open-source tools and cloud native services evolving to meet developer need for microservices-oriented architectures.
In addition, industry leaders who have undergone the journey from monolith to microservices have frequently been vocal about sharing their migration stories, challenges and expertise. Industry titans who have shared their monolith-to-microservices stories include:
- Netflix - who experienced significant benefits in performance, development and scalability.
- Google - “Google encouraged engineers to try and do something that was audacious, and that led to a lot of the systems that they created”.
- Amazon - within a year of migrating to the AWS cloud, engineers were deploying code every 11.7 seconds on average.
These transparent accounts of their successes and failures have helped the wider community more easily accept and overcome the challenges around the use and development of microservices.
The benefits of microservices include:
Logic that follows business capabilities.
In his seminal blog post on microservices, Martin Fowler highlights the way in which microservices enable the building of products versus projects. Development teams are organized around business capabilities as opposed to technologies, meaning that services can be adapted for use in different contexts. This level of code reusability offers the flexibility to rearrange services and their functionalities, allowing the same data to be processed by different services and teams.
Applications are easier to build and maintain.
The agility that a microservices-based approach offers is often a key draw for developers. Service boundaries are explicit when each application is split into a series of smaller, composable fragments. Managing code is easier, since each microservice is its own separate chunk of code. Services can be implemented using a wide range of programming languages, paradigms and frameworks, making development effectively language-agnostic. Each service can also be built, deployed, redeployed and maintained independently of the others.
Designed for failure.
Since they are loosely coupled, microservices can be independently tested and deployed. According to Google, “The smaller the unit of deployment, the easier the deployment.” Clearer boundaries mean that if one service fails, only that one function goes down, as opposed to the entire application.
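As a hedged sketch of that failure isolation, the snippet below shows one service calling a hypothetical recommendations service with a short timeout and falling back to an empty result when the dependency is unavailable, so only the recommendations feature degrades rather than the whole application. The service URL and timeout are assumptions for illustration.

```go
// A sketch of failure isolation: the caller puts a short timeout on a
// hypothetical recommendations service and degrades gracefully when that
// one dependency is slow or down, instead of failing the whole request.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"time"
)

var recsClient = &http.Client{Timeout: 500 * time.Millisecond}

// fetchRecommendations returns an empty list when the downstream service
// cannot be reached; the URL is an illustrative assumption.
func fetchRecommendations(userID string) []string {
	resp, err := recsClient.Get("http://recommendations:8080/recs?user=" + userID)
	if err != nil {
		return nil // only the recommendations feature degrades
	}
	defer resp.Body.Close()

	var recs []string
	if err := json.NewDecoder(resp.Body).Decode(&recs); err != nil {
		return nil
	}
	return recs
}

func main() {
	fmt.Println(fetchRecommendations("user-123"))
}
```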
Nonetheless, there are challenges with a microservices-based approach, which include:
Increased complexity of the overall system.
Since a microservices-based application is a network of different services, those services often interact in ways that aren’t predictable. Both the increased number of services and the interactions between them mean that the overall complexity of the system grows.
Communication and potential security challenges.
Whereas a monolith’s components call each other internally, microservices communicate over a network. This can lead to communication challenges and can introduce new security challenges, since every service-to-service call is an additional network hop that must be secured.
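One common mitigation, sketched below under simplified assumptions, is to authenticate every service-to-service call. The snippet adds a shared-token check as HTTP middleware; real deployments more often rely on mTLS or a service mesh, so treat this only as an illustration of the extra work the network hop introduces.

```go
// A simplified sketch of authenticating service-to-service calls with a
// shared token check. The header scheme and environment variable are
// assumptions; production systems typically use mTLS or a service mesh.
package main

import (
	"log"
	"net/http"
	"os"
)

// requireServiceToken rejects any request that does not carry the expected
// token, so every network hop between services must be authenticated.
func requireServiceToken(next http.Handler) http.Handler {
	expected := os.Getenv("SERVICE_TOKEN") // injected per environment (assumed)
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if expected == "" || r.Header.Get("Authorization") != "Bearer "+expected {
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/orders", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte(`{"orders": []}`))
	})
	log.Fatal(http.ListenAndServe(":8081", requireServiceToken(mux)))
}
```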
Potential performance degradation.
Performance can suffer compared to “a monolithic approach because of latencies between services” (Google). This can significantly undermine the benefits of microservices. However, there are solutions to help address these challenges, as we’ll see in the next section.
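To make the latency point concrete before moving on, the toy sketch below simulates a request that has to traverse several services in sequence: every extra hop adds its own round trip to the end-to-end response time. The hop count and per-hop latency are illustrative numbers, not measurements.

```go
// A toy sketch of how per-hop latency compounds across a chain of
// sequential service calls; the figures are illustrative, not measured.
package main

import (
	"fmt"
	"time"
)

// callService stands in for a synchronous call to a downstream service.
func callService(name string, latency time.Duration) {
	time.Sleep(latency)
}

func main() {
	hops := []string{"gateway", "orders", "pricing", "inventory"}
	perHop := 40 * time.Millisecond // assumed round-trip latency per hop

	start := time.Now()
	for _, svc := range hops {
		callService(svc, perHop)
	}
	// Four sequential hops at ~40 ms each add roughly 160 ms to the request,
	// which is why reducing per-hop latency matters so much.
	fmt.Printf("end-to-end latency across %d hops: %v\n", len(hops), time.Since(start))
}
```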
… And From Microservices to Edge
It is becoming increasingly evident that edge computing offers a further opportunity to optimize performance and management for each service within a microservices-based application, and to help overcome the challenges outlined above.
Why the Edge benefits from microservices
Latency and data constraints are driving the growing importance of n-layer, as opposed to flat, architectures. Edge computing brings aggregation layers into play because it involves hierarchical computing, orchestration and placement of workloads. This is very different from today’s standard approach of delivering microservices from flat, centralized hyperscaler infrastructure.
Not every workload in an application is suitable for edge deployment. An application built from smaller, composable units therefore offers much greater flexibility: only the workloads that need the benefits the edge offers are migrated there, while continuing to work in concert with cloud-based microservices, as in the sketch below.
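As a rough sketch of that split, assuming a hypothetical edge endpoint alongside a cloud endpoint, the snippet below sends only the latency-sensitive workloads to the edge and leaves everything else in the cloud. The endpoints and the selection rule are invented for illustration.

```go
// A rough sketch of splitting workloads between edge and cloud deployments:
// only latency- or bandwidth-sensitive work goes to the edge. The endpoints
// and the selection rule are illustrative assumptions.
package main

import "fmt"

const (
	edgeEndpoint  = "https://edge.example.com/v1/ingest"   // hypothetical edge deployment
	cloudEndpoint = "https://cloud.example.com/v1/reports" // hypothetical cloud deployment
)

// endpointFor routes latency-sensitive workloads to the edge service and
// everything else to the cloud-based microservices.
func endpointFor(workload string) string {
	switch workload {
	case "telemetry-ingest", "video-analytics": // assumed to be latency/bandwidth sensitive
		return edgeEndpoint
	default:
		return cloudEndpoint
	}
}

func main() {
	for _, w := range []string{"telemetry-ingest", "monthly-report"} {
		fmt.Printf("workload %q -> %s\n", w, endpointFor(w))
	}
}
```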
Why microservices benefit from edge computing
Compared to monolithic application development, microservices involve far more network communication, since services are linked together by network calls to one another. The benefits of edge computing for microservices therefore include:
- Performance. The lower latency that edge computing provides helps microservices perform better.
- Bandwidth. The amount of bandwidth required by many IoT devices is enormous (particularly for streaming data). Local processing saves time and reduces the strain on Internet infrastructure.
- Improved availability. Edge routing based on user location, client type and other factors eliminates the need for redundant load balancing in availability zones (see the routing sketch after this list).
- Statefulness. Running stateful routing logic for dynamic requests at the edge reduces compute, user-perceived delay and complexity.
- Security and compliance. Processing data at the edge rather than sending it back to the cloud can increase privacy and help with compliance procedures such as local data storage, which is often required by data privacy laws.
- Shift to multi-cloud/hybrid IT infrastructure. Modern applications are increasingly adopting a microservices-based approach to help them more easily span more than one cloud provider or consume services from different clouds and/or edge providers. Kubernetes clusters, for instance, are often used to simplify multi-cloud/hybrid IT management.
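As a simplified sketch of the location-aware routing mentioned in the availability point above, the snippet below directs each incoming request to an assumed nearby edge region based on a client-country header, falling back to a central cloud deployment when no edge region matches. The region map and header name are illustrative assumptions.

```go
// A simplified sketch of location-aware edge routing: each request is
// directed to an assumed nearby edge region based on a client-country
// header. The region map and header name are illustrative assumptions.
package main

import (
	"fmt"
	"log"
	"net/http"
)

// regionFor maps a client's country to an assumed edge region.
var regionFor = map[string]string{
	"DE": "edge-eu-central",
	"FR": "edge-eu-west",
	"US": "edge-us-east",
}

func main() {
	http.HandleFunc("/route", func(w http.ResponseWriter, r *http.Request) {
		country := r.Header.Get("X-Client-Country") // often set by a CDN or load balancer (assumed)
		region, ok := regionFor[country]
		if !ok {
			region = "cloud-default" // fall back to a central cloud deployment
		}
		fmt.Fprintf(w, "serving from %s\n", region)
	})
	log.Fatal(http.ListenAndServe(":8082", nil))
}
```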
The Role of Edge as a Service in the Microservices Journey
With workforces set to continue working remotely for some time to come, if not permanently; Internet infrastructure worldwide under increased strain; and the volume of data that needs processing growing continuously, the game-changing combination of microservices and edge computing looks increasingly essential.
We anticipate more and more organizations looking to leverage microservices at the edge. For many (if not most), building a home-grown edge workload orchestration system is not feasible. This is where Edge as a Service (EaaS) plays a critical role in accelerating edge computing adoption: for organizations that lack the in-house skill set to deploy microservices at the edge, EaaS can provide a turnkey solution. It frees engineers to focus on innovation and developing the core business, while the EaaS provider manages the complexities of deploying microservices at the edge.