Managing Varnish Cache Hit Rate During Deployment

As you add new features to your application, it is important to keep your Varnish Cache VCL code in step with your application’s needs.

In a well-built web application, caching is a critical component. Delivering objects to the browser quickly is one of the biggest contributors to page performance and user experience.

Smart use of caching also means you can serve more users from much smaller infrastructure, and smaller infrastructure is usually easier to manage and cheaper to run.

But consider what happens once your application depends on caching to work effectively. You do not want to release your software to production and find that your infrastructure cannot handle a normal workload because the caching rules have fallen out of date.

For example, your site’s home page may be cached in Varnish Cache. If a developer inadvertently changes the cacheability of that page, your application server might suddenly have to generate the home page dynamically for every single request.
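
To make this concrete, here is a minimal sketch, assuming a hypothetical Python/Flask backend sitting behind Varnish Cache. The first route sends a public Cache-Control header that Varnish can honour, while the second shows how a small, inadvertent change, such as marking the response private, causes Varnish’s default behaviour to pass every request through to the application server. The route names and TTL are illustrative only.

```python
# Hypothetical Flask backend sitting behind Varnish (illustrative, not from the article).
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/")
def home():
    # Cacheable: Varnish can store this response for five minutes and serve it
    # to every visitor without touching the application server.
    resp = make_response("<h1>Home</h1>")
    resp.headers["Cache-Control"] = "public, max-age=300"
    return resp

@app.route("/uncacheable-home")
def uncacheable_home():
    # A seemingly small change: marking the response private (or adding a
    # Set-Cookie header) means Varnish's default rules will not cache it,
    # so the page is regenerated by the backend for each request.
    resp = make_response("<h1>Home</h1>")
    resp.headers["Cache-Control"] = "private, no-store"
    return resp

if __name__ == "__main__":
    app.run(port=8080)
```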

One reason this happens is that developers work in isolation from the caching system: they run the application on their local machines without the reverse proxy tier, and test their work without the proxy servers being involved.

Teams often compensate by deploying the proxies in testing and staging environments, which sit much later in a typical development flow. The round-trip time for detecting and fixing caching bugs therefore increases greatly.

It is also difficult to run the reverse proxy stack in the local development environment, due to configuration complexity and compatibility issues such as trying to run Linux-based proxies on Windows.

Tools like Vagrant and Docker make it possible to run your entire reverse proxy stack on your local development machine. In this setup you can make changes to your application, test the application directly, and then test it again on your own computer with the proxies in place.
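
As a rough illustration of that workflow, the sketch below compares a request made directly to the application with the same request made through a locally running Varnish instance. The port numbers, 8080 for the application and 6081 for Varnish, are assumptions for this example; adjust them to match your own local setup.

```python
# Hypothetical smoke check for a local setup where the application listens on
# localhost:8080 and a local Varnish instance proxies it on localhost:6081.
import requests

APP_URL = "http://localhost:8080/"    # hits the application directly
PROXY_URL = "http://localhost:6081/"  # hits the same application through Varnish

def check(url: str) -> None:
    requests.get(url)           # first request warms the cache, if one is in the path
    second = requests.get(url)  # second request should be a hit when cached
    # The Age header is set by the cache; a value above zero on the second
    # request suggests the response came from Varnish, not the backend.
    age = int(second.headers.get("Age", 0))
    verdict = "cache HIT" if age > 0 else "cache MISS or no cache in path"
    print(f"{url}: status={second.status_code}, Age={age}, {verdict}")

if __name__ == "__main__":
    check(APP_URL)    # expect no caching: nothing sits in front of the app
    check(PROXY_URL)  # expect a hit on the second request if the VCL allows caching
```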

Using a configuration management system to ensure that the Varnish Cache configuration is consistent across your local machine, your test environment, and your production environment also increases certainty and reduces stress during deployment.

You can also integrate testing the cacheability of objects into your automated testing and CI/CD workflow. You can run tests against a test environment, and query a metrics or log management platform, to make sure you are heading in the right direction before you take that final leap to production.
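
As one possible sketch, a cacheability check could be written as a pytest test that runs in your CI pipeline against a test environment. The BASE_URL value and the reliance on the Age response header are assumptions made here for illustration, not requirements.

```python
# Hypothetical pytest check, runnable in CI against a test environment that
# fronts the application with Varnish. BASE_URL is an assumed placeholder.
import os

import requests

BASE_URL = os.environ.get("BASE_URL", "http://staging.example.com")

def test_home_page_is_served_from_cache():
    # Warm the cache, then check that the second response came from the cache.
    requests.get(f"{BASE_URL}/")
    response = requests.get(f"{BASE_URL}/")

    assert response.status_code == 200
    # Varnish adds an Age header to cached objects; a non-zero value (or an
    # X-Varnish header containing two transaction IDs) indicates a cache hit.
    assert int(response.headers.get("Age", 0)) > 0, (
        "Home page was not served from cache; check the VCL and Cache-Control headers"
    )
```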
