Hotjar Saves 456 Person-Years per Month with Section
November 29, 2018
How Hotjar Leverages Edge Compute
When it comes to edge compute services, the bulk of the Hotjar workload is static file delivery. Each visitor interaction on its clients’ sites beacons data back to Hotjar’s central infrastructure, a mix of AWS, Python, and NGINX/Lua (a self-built module).
On average, Hotjar handles 1.5 petabytes of traffic and serves around 80 billion requests per month.
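For a sense of scale, those two figures together imply an average payload of just under 19 KB per request. A quick back-of-envelope check (assuming decimal units, i.e. 1 PB = 10^15 bytes):

```python
# Average bytes per request implied by the article's figures.
# Assumes decimal units: 1 PB = 10**15 bytes.
TRAFFIC_BYTES_PER_MONTH = 1.5e15   # 1.5 PB, from the article
REQUESTS_PER_MONTH = 80e9          # 80 billion, from the article

avg_bytes = TRAFFIC_BYTES_PER_MONTH / REQUESTS_PER_MONTH
print(f"{avg_bytes / 1000:.1f} KB per request")  # ≈ 18.8 KB
```

A payload in that range is consistent with a workload dominated by small static assets such as scripts.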
With such high volumes of data passing through its infrastructure, Hotjar relies on an edge provider to keep latency low and load times fast for its customers. When Hotjar approached Section, it was seeking to improve both outright delivery speeds and the consistency of those speeds.
Hotjar has a small, but mighty DevOps team (three members out of an overall employee count of 70), so automation is vital to their operations. In addition to improving performance and consistency, the team was also looking for ways to automate more of their workflows and better integrate these functions into their DevOps processes.
“With our previous CDN provider, the main problem was that they had very long tail latency. Some of our customers, not many, but some, were occasionally experiencing 30-second load times, which was completely unacceptable. That was what we were primarily looking to get rid of. Our previous CDN ultimately didn’t have the level of insight or the control that we needed,” says Paul Kirby, DevOps Engineer at Hotjar.
Hotjar’s DevOps team started exploring new CDN providers, first assembling a long list of potentials, beginning with those they had worked with in the past, and then researching others previously unknown to them. A pruning process, which weighed latency, customer service, and price, narrowed the field to a handful of CDNs for comparison.
Hotjar then conducted a competitive performance assessment for static asset delivery, using New Relic to benchmark performance across each provider’s platform. At the end of the bake-off period:
- Section’s Edge Compute platform outperformed the legacy CDNs, and
- the performance delivered was much more consistent.
After Hotjar switched its traffic over to Section, load times declined by approximately 50%.
“We calculated how much overall time we’re saving our users (by calculating the difference in average latency) per month, and it works out to approximately 456 person-years per month,” - Paul Kirby, DevOps Engineer, Hotjar
“Further to that,” he adds, “was the level of contact we’ve had from the Section team. Most other companies who I’ve worked with on a vendor relationship side of things have not been very hands-on - you have to get through the Tier 1 support to get to anyone who knows what they’re doing… The Section team, by contrast, has been a great help and extremely responsive. It’s felt much more like working with another team in our company, as opposed to working with another company altogether.”
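The 456 person-years figure quoted above is easy to sanity-check against the article’s 80 billion monthly requests. The per-request saving used below (about 180 ms) is an assumed value chosen to illustrate the arithmetic, not a number reported by Hotjar:

```python
# Back-of-envelope check of the "456 person-years per month" figure.
# Assumption (not from the article): roughly 180 ms of latency
# is saved per request on average.
REQUESTS_PER_MONTH = 80e9          # from the article
SAVED_PER_REQUEST_S = 0.18         # assumed average latency reduction
SECONDS_PER_YEAR = 365.25 * 24 * 3600

saved_seconds = REQUESTS_PER_MONTH * SAVED_PER_REQUEST_S
person_years = saved_seconds / SECONDS_PER_YEAR
print(round(person_years))  # → 456
```

In other words, shaving well under a quarter of a second off each of 80 billion requests is enough to account for the headline number.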
A final critical factor in Hotjar’s move to the Section platform was Section’s developer-led approach. None of the legacy CDNs offers the kind of developer workflows that Section does, workflows that make edge programming a reality.
By offering DevOps control, the Section platform enables developers to make changes in their development environment and test in staging before pushing updates out to production. Section also offers the ability to integrate the Section configuration into clients’ application code.
Containerization was another attractive feature for Hotjar, who hope to take advantage of the flexibility it offers as they look ahead.
“I really like that Section views things from a developer perspective… added to that, having a Docker-based platform means that there are a lot of options for different things you can plug and play into your environment.” says Paul.
“Just the fact that it is a Docker-based platform means we can create our own containers that can be thrown in there. There’s an opportunity to run more advanced workloads at the edge that doesn’t exist elsewhere.” - Paul Kirby