Varnish Cache is a high-performance HTTP accelerator, or HTTP cache. Configured as a reverse proxy in front of an HTTP server, it can speed up a website by a factor of 300 to 1000, depending on your architecture. This is useful in many situations for a modern web site.
At first glance, Varnish Cache can seem quick and easy to install, but in today's DevOps-focused world there is much more to getting a good hit ratio than installing it and writing some VCL (Varnish Configuration Language).
Developing and Testing with Varnish Cache
How are you going to run Varnish Cache locally, where you are writing your application? To get a great result with Varnish Cache you need to tailor your VCL to meet the design of your application. Trying to introduce Varnish Cache in later phases of the development lifecycle, such as only using Varnish Cache in testing and production environments, usually gives a suboptimal result.
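A low-friction way to run Varnish Cache on a development machine is the official Docker image, or a locally installed varnishd pointed at your application. This is a sketch; the ports, paths and the app on port 3000 are assumptions for illustration:

```shell
# Run Varnish Cache locally in Docker; the official "varnish" image
# reads /etc/varnish/default.vcl by default (ports/paths are examples).
docker run --rm --name varnish-dev \
  -p 8080:80 \
  -v "$(pwd)/default.vcl:/etc/varnish/default.vcl:ro" \
  varnish

# Or, with Varnish installed natively, run it in the foreground (-F)
# with your local app as the backend (-b), listening on :8080 (-a):
varnishd -F -a :8080 -b 127.0.0.1:3000
```

Either way, you can iterate on your VCL alongside your application code instead of discovering caching problems later in testing or production.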
Running Varnish Cache
How are you planning to size your cache and manage the resources Varnish Cache requires? On systems with less memory, disk I/O can become a concern. Will you be able to meet the needs of your cache as your application changes?
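Cache size is set when the daemon starts, via the storage backend argument. As a sketch (the 256 MB figure and addresses are example values):

```shell
# Size the in-memory cache with the malloc storage backend (example value):
varnishd -a :6081 -b 127.0.0.1:8080 -s malloc,256m

# Check how much of the cache is in use, and whether objects are being
# evicted (nuked) to make room for new ones:
varnishstat -1 -f SMA.s0.g_bytes -f MAIN.n_lru_nuked
```

A steadily climbing n_lru_nuked counter is a sign the cache is undersized for your working set.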
Monitoring Varnish Cache
How will you know if Varnish Cache is working well? Perhaps you can SSH onto a cache server running Varnish Cache and use varnishtop or varnishstat. Or perhaps you could build a Varnish Cache metrics infrastructure that ships the varnishncsa logs to Logstash, StatsD and Graphite. From there you could build the dashboards you'll need to monitor as part of a continuous delivery pipeline.
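The SSH-based approach looks something like this; the log path is an example:

```shell
# Live views over SSH:
varnishtop -i ReqURL        # the most requested URLs, live
varnishstat                 # hit/miss counters, storage usage, backend health

# For log shipping: run varnishncsa as a daemon (-D), appending (-a) to a
# file that Logstash or a similar shipper can tail:
varnishncsa -D -a -w /var/log/varnish/access.log -P /run/varnishncsa.pid
```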
Debugging Varnish Cache
When your monitoring shows that you're not caching as expected, how will you know what Varnish Cache is doing? Perhaps you'll SSH into a Varnish Cache server and run varnishlog. That's easy enough in your development environment, but you may also need to do it in production, and if you have load-balanced, highly available Varnish Cache servers, that means running varnishlog across several servers at once.
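varnishlog supports the VSL query language for filtering, which keeps the output manageable on a busy server. A sketch, with an example URL pattern:

```shell
# Show complete transactions for one URL pattern ("/products" is an example):
varnishlog -g request -q 'ReqURL ~ "^/products"'

# Show only transactions that missed the cache:
varnishlog -g request -q 'VCL_call eq "MISS"'
```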
Improving Varnish Cache
How will you know what the next big thing to cache is? A Varnish Cache metrics system will give you a console you can use to identify the requests causing the most cache misses in Varnish Cache.
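Without a full metrics system, varnishtop can give a quick live ranking of where misses are coming from; a sketch:

```shell
# Rank the URLs generating the most backend fetches:
varnishtop -i BereqURL

# Rank the client URLs that miss the cache:
varnishtop -q 'VCL_call eq "MISS"' -i ReqURL
```

The top entries are your candidates for the next caching improvement.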
Securing Varnish Cache with HTTPS, TLS and SSL
How are you going to wrap HTTPS around Varnish Cache? Varnish Cache does not terminate TLS itself, so you'll probably need an SSL/TLS termination proxy in front of it, such as Nginx. You'll also need to consider the security of the connection between Varnish Cache and your application; perhaps that needs to be secured too.
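As a sketch, an Nginx TLS-termination block in front of Varnish Cache might look like this; the hostname, certificate paths and Varnish listen address are all assumptions:

```nginx
server {
    listen 443 ssl;
    server_name example.com;                                # assumption

    ssl_certificate     /etc/ssl/certs/example.com.pem;     # assumption
    ssl_certificate_key /etc/ssl/private/example.com.key;   # assumption

    location / {
        proxy_pass http://127.0.0.1:6081;   # Varnish Cache listen address
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Passing X-Forwarded-Proto through lets your VCL and your application distinguish HTTPS traffic from plain HTTP.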
Error Handling and Varnish Cache Guru Meditations: the 503 Service Unavailable Error
How are you going to handle errors in Varnish Cache? You might want custom error pages or other error handling to ensure that your customers get a decent experience when your backend is unavailable. You may also want to show a maintenance page to users while you work on your backend.
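In VCL you can replace the default Guru Meditation page with a synthetic response; this is a minimal sketch for VCL 4.0+, with placeholder page content:

```vcl
sub vcl_backend_error {
    # Serve a friendly page instead of the default Guru Meditation.
    set beresp.http.Content-Type = "text/html; charset=utf-8";
    set beresp.status = 503;
    synthetic({"<html><body>
        <h1>We'll be back shortly</h1>
        <p>Please try again in a few minutes.</p>
    </body></html>"});
    return (deliver);
}
```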
Varnish Cache Logs
Your logs hold a lot of valuable data, but unfortunately they are often discarded by logrotate or something similar. Getting value from your logs requires analysis tooling, such as the ELK stack: Elasticsearch, Logstash and Kibana. That's where you can search your logs to tell your boss or customer why they got that Guru Meditation error.
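Logs are easier to analyse downstream if you emit them in a parse-friendly format up front. A sketch using varnishncsa's format string (the path is an example):

```shell
# Combined log format plus a field recording hit or miss per request:
varnishncsa -D -a -w /var/log/varnish/access.log \
  -F '%h %l %u %t "%r" %s %b "%{Referer}i" "%{User-agent}i" %{Varnish:hitmiss}x'
```

With the hit/miss field in every log line, a Kibana search for misses on a given URL becomes trivial.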
Varnish Cache Alerts
You may need to get alerts when your Varnish Cache performance degrades, or when Varnish Cache starts to throw errors.
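Even without a full alerting stack, a crude periodic check against varnishstat can raise a flag; this is a sketch, and the 80% threshold, mail command and address are assumptions:

```shell
#!/bin/sh
# Cron-friendly hit-ratio check. varnishstat -1 prints "NAME VALUE ..."
# per line, so the counter value is the second field.
hits=$(varnishstat -1 -f MAIN.cache_hit  | awk '{print $2}')
miss=$(varnishstat -1 -f MAIN.cache_miss | awk '{print $2}')
total=$((hits + miss))
if [ "$total" -gt 0 ] && [ $((hits * 100 / total)) -lt 80 ]; then
    echo "Varnish hit ratio below 80%" | mail -s "Varnish alert" ops@example.com
fi
```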
Managing Changes to Varnish Cache
What will you use to manage changes to Varnish Cache? You might use Git to version your VCL files, and then use Jenkins with Puppet or Chef to deploy those changes at the right time.
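A deployment pipeline can validate VCL before it ever reaches a server, and activate it without restarting the cache; a sketch, where the VCL label is an example:

```shell
# In CI: compile-check the VCL; varnishd -C fails on syntax errors:
varnishd -C -f ./default.vcl > /dev/null

# On the server: load and activate the new VCL without dropping the cache:
varnishadm vcl.load deploy_42 /etc/varnish/default.vcl   # "deploy_42" is an example label
varnishadm vcl.use  deploy_42
```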
As you can see, Varnish Cache is an amazing HTTP accelerator, but there's more to getting a great result for your application than just running apt-get install varnish.