Implementing Service Workers at the Edge

TL;DR: Service workers are a web standard most commonly found in the browser. With a small amount of implementation code, service workers can be adapted to run server side, where they can proxy and manipulate requests.

The edge can be divided into two spaces: the device edge and the infrastructure edge. The boundary between them is the telco concept of the last mile (the final leg of the telecommunications network that delivers service to end users). On one side, closest to the users, is what is often referred to as the device edge: things like phones, cars, and cameras. On the other side, in the space between the last mile and the cloud provider, is the infrastructure edge.

The infrastructure edge isn’t new. It’s long been the space of traditional Content Delivery Networks (CDNs). These are generally caching/security solutions globally distributed to improve the performance, availability, and security posture of web applications. Recently, there have been interesting moves in the infrastructure edge, one of which is the notion of applying service workers to the server to create a serverless solution.

Cloudflare Workers® is one such example. A fantastic bit of infrastructure edge tech, Cloudflare Workers allows developers to write JavaScript functions that run on Cloudflare’s global network of points of presence (PoPs). By leveraging the V8 engine (specifically, V8 isolates), Cloudflare Workers lets you implement service workers and run them on Cloudflare’s infrastructure.

The Evolution

While the idea itself is not new, Cloudflare was the first to apply it to the server and create a serverless approach to compute that leverages the Service Worker standard. Serverless, made popular by AWS Lambda, allows the developer of a function to focus on business logic and leave the operation/orchestration of the function to the provider (in this case, the Cloudflare network).

To create Cloudflare Workers, Cloudflare developed a control plane against the Service Worker API originally designed for the browser. The implementation essentially responds to events just as browser-based JavaScript workers would. Popular use cases for Cloudflare Workers include modifying requests, implementing routers, A/B testing, rewriting links in HTML, and more.

Experimenting with the Service Worker API

So the question becomes: if Cloudflare Workers is based on a standard (the Service Worker API), can you run the same service workers on another network and provide your own server-side implementation? It turns out (at least as far as an initial MVP goes) the answer is yes.

This post is a tracer bullet. It’s a simple MVP of the Service Worker API implemented using Node/Express as the control plane. NOTE: This implementation doesn’t provide the full feature set or options needed for implementing service workers on the server. Instead, it asks the question: Is it possible? Further work is needed to move something like this to production.

In this implementation, we take a Node.js/Express application and dispatch to the Service Worker code via a simple JavaScript wrapper. The wrapper dispatches to the Service Worker using event listeners. For the orchestration of these examples, I use the Node.js module available on Section’s Edge Compute Platform. The Section platform allows containerized workloads to be run at the infrastructure edge. The Node.js container is one of dozens of containers that are available and, in particular, allows developers to build JavaScript applications and deploy them to the edge.

Service Workers

Before we dive in, let’s back up a bit and level set on service workers.

A service worker is JavaScript that operates as a type of event-driven web worker (JavaScript that runs in a background browser thread). Service workers essentially act as proxy servers that sit between web applications, the browser, and the network (when available). They are intended, among other things, to enable effective offline experiences, intercept network requests, and take appropriate action based on network availability. Service workers can also be used for push notifications and background sync, and to handle network partitions.

I first ran into service workers in a presentation by Ilya Grigorik of Google at Velocity in 2015 (you can watch a video of the presentation here). It should be no surprise that service workers are used extensively in applications ranging from Gmail to Chrome itself.

The goal of service workers is really to run a bit of JavaScript that responds to browser-initiated events (like an HTTP GET) rather than user-interface events generated in HTML. By sitting between the page and the server, the service worker can more seamlessly handle things like a network partition (or network outage) and give an application an offline experience.

Once a service worker is made available, three functional events are supported: fetch, sync, and push. For this MVP, we take the Cloudflare hello world application and create our own control plane for running it. We will only be implementing “fetch.” I use Node.js/Express because it’s easy to shape the requests and responses and implement the invocation using event listeners.

Hello World

The canonical hello world example in Node.js/Express maps a route to a function that simply returns a “Hello World!” string. The implementation looks something like this:

const express = require('express');
const port = process.env.PORT || 80;
const app = express();

app.get('/', (req, res) => res.send('Hello World!'));

app.listen(port, () => console.log(`Listening on port ${port}`));

We’ll do the same thing for our first Service Worker example, but instead of defining a method that returns a string, we declare and implement the FetchEvent found in the Service Worker standard.

In JavaScript, we just pass along the request object generated in Express and then define an async function required by the spec called respondWith(). The respondWith() method provides an async response to the fetch. It looks like this:

app.get('/', async (req, res) => {
  let ev = {
    request: req,
    respondWith: async function (responsePromise) {
      responsePromise.then(responseObject => {
        // only this one response header supported for PoC
        res.set('content-type', responseObject.init.headers['content-type']);
        res.send(responseObject.body);
      });
    }
  };

  await fetchHandler(ev);
});

You can see we set a header from the service worker and then send along the body.

Unlike a browser (where the default global scope is the window object), Node.js scopes top-level declarations to the module itself. So to make event registration available to the service worker, we define an addEventListener method on the global scope; the service worker we load later calls it to register its fetch handler.

The awaited fetchHandler() call in the app.get() route passes the event we defined (carrying the request and the respondWith() method) to the handler that the service worker registered in that global scope.
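Concretely, the wiring might look like the following sketch. The variable names here are assumptions on my part; the original wrapper’s internals aren’t shown in full in this post.

```javascript
// Sketch of the wrapper's global wiring (names are assumptions).
let fetchHandler = null;

// Mimic the browser's global addEventListener so an unmodified
// service worker script can register its 'fetch' handler.
global.addEventListener = (type, handler) => {
  if (type === 'fetch') {
    fetchHandler = handler;
  }
};

// Loading the worker file runs its top-level addEventListener
// call, which stores the handler for the Express route to invoke:
// require('./worker.js');
```

With this in place, the Express route can simply call the stored handler with the event object it built from the request.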

In the Service Worker code itself, the event listener is created and an async method to respond to the request is defined. We use the shape of the Response object defined by the Web APIs for the response itself.

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

/**
 * Respond with hello worker text
 * @param {Request} request
 */
async function handleRequest(request) {
  return new Response('Hello worker!', {
    headers: { 'content-type': 'text/plain' },
  })
}

NOTE: This is the “hello world” example used when you get started with Cloudflare Workers.

The result is a Node wrapper that is called, executes the service worker, and returns “Hello worker!”.
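One more piece the harness needs: Node.js (prior to v18) has no global Response class, so the wrapper has to supply one. A minimal stub consistent with the wrapper above (which reads back responseObject.init.headers and responseObject.body) might look like this; the real Web API Response has many more members, so this is an assumption for the MVP only.

```javascript
// Minimal Response stub for the harness (an assumption; the real
// Web API Response class is much richer). It keeps only what the
// wrapper reads back: the body and the init object with headers.
class Response {
  constructor(body, init = {}) {
    this.body = body;
    this.init = init;
  }
}
global.Response = Response;

// The worker's return value is then just plain data the wrapper can unpack.
const r = new Response('Hello worker!', {
  headers: { 'content-type': 'text/plain' },
});
```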

[Screenshot: hello worker example]

Proxy and Alter Request

Okay, so we can get a basic hello world response using the pattern defined by the Service Worker, but can we do more?

In the next example, we’ll define a service worker that uses the Fetch API (as implemented by node-fetch) to retrieve a web resource from a backend URL.

The implementation is pretty straightforward at this point. We provide an implementation of node-fetch, npm install it, and then require it in the script. We use async/await to get the content from the backend page using the Fetch API.

The Service Worker implementation looks like this:

const fetch = require('node-fetch');

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  const uri = '';
  console.log("About to call: " + uri);

  return new Response(await getContent(uri), {
    headers: { 'content-type': 'text/html;charset=UTF-8' },
    status: 200,
    statusText: "OK"
  });
}

async function getContent(uri) {
  let response = await fetch(uri);
  let data = await response.text();
  console.log("data: " + data);
  return data;
}

The result is that the contents of the page are returned as if they were requested directly. Here’s the page we’re fetching:

[Screenshot: proxy alter request]

Because this Service Worker code sits in the request chain, we can also do things like manipulate the request, add headers, or rewrite portions of the response. Here is an example using the same code to do a simple replace on the body tag and inject a new CSS class.

async function handleRequest(request) {
  const uri = '';
  console.log("About to call: " + uri);

  let body = await getContent(uri);

  const modified_body = body.replace(
    "<body",
    "<body class=\"dark\"");

  return new Response(modified_body, {
    headers: { 'content-type': 'text/html;charset=UTF-8' },
    status: 200,
    statusText: "OK"
  });
}

[Screenshot: alter request css manipulation]
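Adding a response header follows the same pattern. Here is a small sketch of merging an extra header into the init object before constructing the Response; the helper and the header name/value are illustrative, not from the original code.

```javascript
// Sketch: merge one extra response header into an init object
// without mutating the original (header name/value are illustrative).
function withExtraHeader(init, name, value) {
  return { ...init, headers: { ...init.headers, [name]: value } };
}

const init = {
  headers: { 'content-type': 'text/html;charset=UTF-8' },
  status: 200,
  statusText: 'OK'
};
const modified = withExtraHeader(init, 'x-served-by', 'edge-worker-mvp');
```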


Conclusion

The Service Worker API is a web standard that has been around for a while. It has been a really effective tool in the web developer’s toolkit for creating an intermediary between the webpage, the site, and the network.

Cloudflare has been innovative in building a server-side implementation that runs these workers within their platform using V8. It gives them the ability to provide a serverless solution on top of the Cloudflare platform to do things like manipulate the request and run code at the infrastructure layer. While there are other ways to do the same thing (using Node directly, Lua with OpenResty, or even Nginx with the OpenResty module), it’s an innovative way to bring more control to developers at the infrastructure edge.

Because the Service Worker API is a web standard, software engineers can also create their own harnesses to run these service workers. This MVP demonstrates how Node/Express can be used to model the control plane required to build one. While there’s work to be done before an MVP like this could be called production ready (if such a thing were desired), it shows a rough idea of how a harness might be implemented using Node.

To learn more about Section, the Node.js module that I used to create the harness for the Service Worker, or any of the other BOT/WAF/custom containers that can be deployed on the Section Edge Network, please contact us and speak to one of our engineers about your use case.

Wesley Reisz is VP of Technology at Section, Chair of the LF Edge Landscape Working Group, Chair of QCon San Francisco, & Co-Host of The InfoQ Podcast. Wes enjoys driving innovation in the edge ecosystem through awareness, community, and technology.
