We are excited to announce Section’s Node.js Edge Hosting solution, which offers new opportunities for application developers to deliver more performant applications at the Edge.
Section’s Node.js Edge Hosting empowers DevOps teams to run mission-critical Node.js applications at the network edge for blazingly fast results with enterprise-level AppSec protection. Developers can now build server-side applications with the performance of a CDN.
Node.js is optimally suited to building scalable network applications because it can handle many connections concurrently: upon each connection a callback is fired, and if there is no work to be done, Node.js sleeps.
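This event-driven model can be sketched in a few lines (a generic Node.js illustration, not Section-specific): synchronous handler code runs to completion on the call stack, the registered callback fires only once the stack is empty, and the process sleeps between events.

```javascript
// Minimal sketch of Node.js's event loop: the handler returns immediately,
// leaving the single thread free for other connections, and the queued
// callback fires only after the current call stack empties.
const order = [];

order.push('request received');        // synchronous work on the call stack
setTimeout(() => {
  order.push('callback fired');        // runs once the stack is empty
  console.log(order.join(' -> '));
}, 0);
order.push('handler returned');        // the event loop is now free
```

Running this prints `request received -> handler returned -> callback fired`, showing that the callback never interrupts in-flight synchronous work.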
Section’s Node.js edge module has a straightforward workflow that makes deploying changes easy, alongside a dynamic visualization so you can continually assess how your application is performing.
Some common use cases for Section users include:
- Server Side Rendering at the Edge (improve SEO, reduce load on origin servers, deliver faster load times)
- Single Page Apps (offload the complexities of hosting your SPA and its assets)
- Micro APIs (build small, targeted APIs for specific use cases and enable faster responses on user queries)
- Static Site Hosting (serve your web application from Section’s Edge)
Why Run Node.js Workloads at the Edge?
For developers, there is often a disconnect between writing code and running it, especially when it comes to optimizing how that code is distributed. Few developers have experience building distributed systems. Edge compute platforms, like Section, enable developers to migrate more services out of centralized cloud infrastructure and leverage a wealth of benefits (performance, scalability, security, flexibility) along the edge continuum.
By running Node.js workloads as close to end users as possible and eliminating unnecessary data exchange between the cloud data center and the end user, latency is lowered and hosting costs are reduced. Edge computing is particularly beneficial for applications and microservices that are latency-sensitive and/or latency-critical.
Backed by Section’s Adaptive Edge Engine
Section’s Node.js Edge Hosting solution is backed by the patent-pending Adaptive Edge Engine (AEE), which intelligently and continuously tunes and reconfigures your edge delivery network to deliver optimal compute for your application. In recent tests comparing the performance of a Node.js application on the Section Edge vs. the cloud, the application running on Section’s Edge performed up to 7x faster.
Supported by a Developer-Friendly CLI Experience
Getting started with Node.js Edge Hosting is fast and easy with the Section CLI (command line interface). Once you’ve downloaded sectionctl, a series of simple commands allows you to create and deploy a Node.js app on Section’s Edge in minutes.
sectionctl deploy --account-id 1335 --app-id 7171
| APP INSTANCE NAME | APP STATUS | APP PAYLOAD ID |
|-------------------|------------|--------------------------------------|
| nodejs-flwp6-3b4 | Running | 69be5c29-9f02-41dc-bed0-27cff1cbbbaf |
| nodejs-7xzhf-0d3 | Running | 69be5c29-9f02-41dc-bed0-27cff1cbbbaf |
Section Case Study: Adore Beauty
Adore Beauty is Australia’s longest-running and most successful online beauty store, recently leading the ASX’s largest IPO of the year. The team at Adore Beauty places a high priority on continuous digital innovation in order to deliver exceptional, personalized shopping experiences. They run a customized e-commerce back-end, heavily integrated with Section for optimal caching, with the goal of improving performance and reliability, particularly during the peak holiday sales season.
The team at Adore Beauty is leveraging the Section Node.js edge module to serve their Nuxt app closer to end users. The front-end framework (the Nuxt app) calls a Laravel API, which houses product, category, and pricing information. The origin (BigCommerce) handles all other requests, such as shopping cart functions. Using the Node.js edge module in this way has lowered latency and produced significant cost savings, since requests no longer all travel back to central infrastructure.
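The routing split described above can be sketched as a small decision function (an illustration only: the path prefixes and upstream names are assumptions, not Adore Beauty’s actual configuration): product, category, and pricing traffic is answered near the edge, while everything else falls through to the origin.

```javascript
// Illustrative routing split: hypothetical path prefixes and upstream names.
const API_PREFIXES = ['/products', '/categories', '/pricing']; // assumed paths

function upstreamFor(path) {
  return API_PREFIXES.some((prefix) => path.startsWith(prefix))
    ? 'laravel-api' // catalog data, served near the edge
    : 'origin';     // e.g. shopping cart functions, handled by BigCommerce
}

console.log(upstreamFor('/products/lipstick')); // routed to the edge API
console.log(upstreamFor('/cart'));              // routed to the origin
```

The benefit of this design is that only cacheable, latency-sensitive catalog traffic stays at the edge, while stateful operations keep their single source of truth at the origin.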
Read the full case study here.