Varnish Cache is a high-performance HTTP accelerator, or caching HTTP reverse proxy. Deployed in front of an HTTP server, it can dramatically increase the performance of a website; the Varnish project itself claims speed-ups of 300 - 1000x, depending on the architecture. That makes it useful in a great many situations for a modern web site.
At first glance Varnish seems quick and easy to install, but in today's DevOps-focused world there is much more to a good hit ratio than installing the package and writing some VCL (Varnish Configuration Language).
Developing and Testing with Varnish Cache
How are you going to run Varnish locally, where you are writing your application? To get a great result with Varnish you need to tailor your VCL to the design of your application. Introducing Varnish late in the development lifecycle, for example only in testing and production environments, usually gives a suboptimal result.
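A local setup can be as small as a one-backend VCL file pointing at your development app server. This is a minimal sketch; the port is an assumption, so substitute whatever your application listens on.

```vcl
vcl 4.0;

# Minimal development VCL: send every request to the app server you
# are writing code against. Port 3000 is an assumed example.
backend default {
    .host = "127.0.0.1";
    .port = "3000";
}
```

You can then start Varnish locally with something like varnishd -f default.vcl -a :6081 and browse your application through port 6081, iterating on the VCL alongside the application code.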
Running Varnish Cache
How are you planning to size your cache and manage the resources Varnish requires? Varnish stores objects in memory by default (for example varnishd -s malloc,1G), and on lower-memory systems a file-backed cache or swapping can make disk I/O a concern. Can you keep meeting the needs of your cache as your application changes?
Monitoring Varnish Cache
How will you know whether Varnish is working well? Perhaps you can SSH onto a cache server running Varnish and use varnishtop or varnishstat. Or perhaps you could build a Varnish metrics infrastructure that ships the varnishncsa logs to Logstash, StatsD and Graphite. From there you could build the dashboards you'll need to watch as part of a continuous delivery pipeline.
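One common, low-effort aid to monitoring is to expose the hit/miss outcome as a response header in VCL, so that shipped access logs and dashboards can compute a hit ratio per URL. A sketch, with an assumed backend address:

```vcl
vcl 4.0;

# Assumed backend for the sketch; replace with your application server.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

# obj.hits counts how many times this cached object has been delivered;
# zero means the request went to the backend.
sub vcl_deliver {
    if (obj.hits > 0) {
        set resp.http.X-Cache = "HIT";
    } else {
        set resp.http.X-Cache = "MISS";
    }
}
```

The header then travels with every response, so whatever tooling consumes your logs can aggregate hits and misses without access to the Varnish shared memory log.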
Debugging Varnish Cache
When your monitoring shows that you're not caching as expected, how will you find out what Varnish is actually doing? Perhaps you'll SSH into a Varnish cache server and run varnishlog. That is easy in a development environment, but if you run load-balanced, highly available Varnish servers you may need to run varnishlog across several production nodes to catch the traffic you care about.
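You can also help your future self from inside the VCL: the std VMOD's log() function writes custom records into the shared memory log, where varnishlog can filter on them. A sketch, with an assumed backend and an assumed marker string:

```vcl
vcl 4.0;

import std;

# Assumed backend for the sketch.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

# Emit a VCL_Log record whenever a request arrives with cookies,
# since cookies are a frequent cause of unexpected cache misses.
sub vcl_recv {
    if (req.http.Cookie) {
        std.log("cookie-request: " + req.url);
    }
}
```

On the server, varnishlog -q 'VCL_Log ~ "cookie-request"' then shows only the transactions you flagged, which is far easier to read than the full firehose.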
Improving Varnish Cache
How will you identify the next big win for your cache? A Varnish metrics system gives you a console for spotting the requests that miss the cache most often; varnishtop -i BereqURL, for example, lists the URLs most frequently fetched from the backend.
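A frequent finding from that kind of analysis is that analytics query parameters (utm_source and friends) fragment the cache: they vary per visitor but never change the response. Stripping them in vcl_recv is a classic hit-ratio improvement. This is a sketch with an assumed backend; test the regular expressions against your own URL patterns before relying on them.

```vcl
vcl 4.0;

# Assumed backend for the sketch.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Remove utm_* parameters so that otherwise-identical URLs share
    # one cache object, then tidy up any trailing "?" or "&".
    if (req.url ~ "utm_") {
        set req.url = regsuball(req.url, "utm_[a-z_]+=[^&]+&?", "");
        set req.url = regsub(req.url, "[?&]$", "");
    }
}
```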
Securing Varnish Cache with HTTPS, TLS and SSL
How are you going to wrap HTTPS around Varnish? Varnish Cache itself does not terminate TLS, so you'll need an SSL/TLS termination proxy in front of it, such as Nginx or Hitch. You'll also need to consider the connection between Varnish and your application; perhaps that hop needs to be secured as well.
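A common companion pattern is to redirect plain-HTTP clients to HTTPS from within Varnish, relying on the terminating proxy to mark secure requests. This sketch assumes the proxy sets an X-Forwarded-Proto header and uses an arbitrary synthetic status (750) internally:

```vcl
vcl 4.0;

# Assumed backend for the sketch.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # The TLS terminator in front of Varnish is assumed to set
    # X-Forwarded-Proto. Anything that isn't HTTPS gets redirected.
    if (req.http.X-Forwarded-Proto !~ "https") {
        return (synth(750));
    }
}

sub vcl_synth {
    if (resp.status == 750) {
        set resp.status = 301;
        set resp.http.Location = "https://" + req.http.Host + req.url;
        return (deliver);
    }
}
```

This keeps the cache from storing separate HTTP and HTTPS variants of the same page and gives every client a single canonical scheme.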
Error Handling and the Varnish Guru Meditation (503 Service Unavailable)
How are you going to handle errors in Varnish? You might want custom error pages or other error handling to ensure that your customers get a decent experience when your backend is unavailable. You may also want to show a maintenance page to users while you work on your backend.
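The default Guru Meditation page is not something you want customers to see. In VCL 4.0 you can replace it with your own synthetic response when the backend fetch fails; this is a sketch with an assumed backend and placeholder copy:

```vcl
vcl 4.0;

# Assumed backend for the sketch.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

# When Varnish cannot get a usable response from the backend, serve a
# friendlier page instead of the default Guru Meditation.
sub vcl_backend_error {
    set beresp.status = 503;
    set beresp.http.Content-Type = "text/html; charset=utf-8";
    synthetic( {"<!DOCTYPE html>
<html>
  <body>
    <h1>We'll be right back</h1>
    <p>The site is temporarily unavailable. Please try again shortly.</p>
  </body>
</html>"} );
    return (deliver);
}
```

The same synthetic() mechanism works in vcl_synth for maintenance pages you trigger deliberately, rather than ones caused by backend failure.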
Logging with Varnish Cache
Your logs hold a lot of valuable data, and unfortunately they are often rotated away by logrotate or something similar. Getting value from your logs requires analysis tooling, such as the ELK stack (Elasticsearch, Logstash and Kibana). That's where you can search your logs to tell your boss, or your customer, why they got that Guru Meditation error.
Alerting on Varnish Cache
You may need alerts when Varnish performance degrades, such as when the hit ratio drops, or when Varnish starts to throw errors.
Managing Changes to Varnish Cache
What will you use to manage changes to Varnish? You might use Git to version your VCL files, and then use Jenkins with Puppet or Chef to deploy those changes at the right time; varnishadm vcl.load and vcl.use let you activate a new VCL without restarting Varnish or dropping the cache.
As you can see, Varnish is an amazing HTTP accelerator, but there's more to a great result for your application than running apt-get install varnish.