Traffic Throttling Mechanisms

Using Varnish Cache to keep customers happy when there is too much traffic on your site

In website performance optimisation it is easy to focus on improving the experience of a single visitor, but it is just as important to consider the speed and availability of your website as the number of concurrent users grows.

Whether your website has generally high traffic or experiences occasional spikes of very high traffic, the problem is the same: if your infrastructure cannot handle the load, the experience of every visitor will suffer. The site may become slow for everyone, or it may even become completely unavailable.

Typically you will want to invest in the necessary infrastructure changes so that you can give every visitor the best experience under high load conditions, but this can take time. If you need an interim solution, you may want to consider traffic throttling.

What is throttling

Throttling, at least in the context of this article, is concerned with limiting traffic coming from legitimate visitors as opposed to dealing with denial-of-service attacks.

The idea is that, as the number of concurrent visitors grows, instead of the experience of all visitors degrading, you give the best experience to a portion of the visitors and show the remainder a ‘Sorry, please try again soon’ message. You may have seen something similar if you have ever tried to purchase event tickets online on the day the tickets are released.

This way, you are able to provide a quality service to the maximum number of visitors that your current infrastructure can handle, and as those visitors complete their transactions and leave, you can allow the next batch of waiting visitors access to your site.

It is true that visitors who fall into the ‘try again soon’ bucket may be annoyed, choose not to wait, and never come back. But given that many other visitors are still having a great experience, for many site owners this is a reasonable alternative to upsetting all visitors.

It is also possible for the ‘try again soon’ experience to offer simple features like subscribing to a mailing list, or browsing a static catalogue of products.

[Image: example of the page shown to a visitor when throttling is engaged]

Choosing what to throttle

Typically, the threshold for throttling is directly connected to the number of concurrent HTTP requests that your website infrastructure can handle at a good level of performance. As such, the simplest mechanism for throttling is to limit the number of incoming HTTP requests. Unfortunately, because any given page load involves multiple requests for HTML, stylesheet, script, and image resources, request throttling could engage in the middle of a visitor’s page load and result in broken images, an unstyled page, or broken behaviour.
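
As a concrete illustration, here is a minimal sketch of request-level throttling in Varnish, assuming the vsthrottle vmod from the varnish-modules collection is installed; the key name, the 200-requests-per-second limit, and the backend address are placeholder values, not recommendations.

```vcl
vcl 4.0;

import vsthrottle;

# Placeholder backend; host and port are assumptions for this sketch.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Deny requests once the overall rate exceeds ~200 requests per second.
    # Because this counts every request, a single visitor's page load
    # (HTML, CSS, JS, images) can be cut off part-way through.
    if (vsthrottle.is_denied("all-traffic", 200, 1s)) {
        return (synth(429, "Too Many Requests"));
    }
}
```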

The layer of infrastructure used to implement throttling would typically be a load balancer or reverse-proxy positioned between the user-agents and the origin web servers. At this layer, the next simplest throttling mechanism is to throttle by connection. This gives a slightly better chance that a single visitor will not trigger throttling in the middle of their page load, but it still isn’t perfect, because modern browsers open several connections per visitor to download multiple resources in parallel.

Additionally, if the idle browser connection times out while the visitor decides whether to proceed, it is possible that the next connection they make (e.g. to add a product to their cart) will be subject to throttling.

One more possible throttling mechanism worth mentioning is to throttle by IP address. Often visitors inside an enterprise environment or on a cellular network are proxied through one or more shared IP addresses. Allowing or throttling a single IP address can therefore allow or throttle a large portion of your visitors in one go, so this is not a great option.
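
If you did want to experiment with it anyway, the per-IP variant is only a small change to the earlier vsthrottle sketch; this fragment assumes it replaces the vcl_recv body shown above, and the limits are again arbitrary examples.

```vcl
import vsthrottle;

sub vcl_recv {
    # Key the limiter on the client address: at most 50 requests per 10s
    # per source IP (arbitrary numbers). A corporate or carrier-grade NAT
    # proxy can hide many real visitors behind a single address.
    if (vsthrottle.is_denied("ip:" + client.ip, 50, 10s)) {
        return (synth(429, "Too Many Requests"));
    }
}
```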

To give your visitors the most consistent experience, albeit at the cost of a weaker correlation with the number of requests reaching your origin, a very effective option is to throttle by cookie.

By giving each unique visitor a cookie and choosing which cookies are allowed and which are throttled, you can ensure that an allowed visitor will not be subjected to broken resources on a page load, or become blocked midway through their transaction.

Using cookies, however, can require more capable infrastructure that is able to parse and manipulate HTTP data, something that many simple load balancers cannot do. A capable reverse-proxy such as Varnish Cache is a good tool for this job.
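
To make this concrete, here is a minimal VCL sketch of cookie-based throttling, assuming a simple percentage-based admission decision (discussed further in the next section); the 30% admission rate, the cookie name, the lifetimes, and the backend address are all placeholder values.

```vcl
vcl 4.0;

import std;

# Placeholder backend; host and port are assumptions for this sketch.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Visitors who already have a verdict keep it for the whole session.
    if (req.http.Cookie ~ "throttle=deny") {
        return (synth(429));
    }
    if (req.http.Cookie !~ "throttle=allow") {
        # New visitor: admit a fixed percentage (30% here, purely an
        # example) and remember the decision via a cookie so that every
        # later request in the session is treated the same way.
        if (std.random(0.0, 100.0) < 30.0) {
            set req.http.X-Throttle = "allow";
        } else {
            return (synth(429));
        }
    }
    # Allowed visitors fall through to Varnish's normal request handling.
}

sub vcl_deliver {
    # Hand a newly admitted visitor their 'allow' cookie on the first
    # response. (A real setup should append to, not overwrite, Set-Cookie.)
    if (req.http.X-Throttle == "allow") {
        set resp.http.Set-Cookie = "throttle=allow; Max-Age=1800; Path=/";
    }
}

sub vcl_synth {
    if (resp.status == 429) {
        # Remember the 'deny' verdict briefly so the visitor is not
        # re-evaluated on every request, and hint at when to retry.
        set resp.http.Set-Cookie = "throttle=deny; Max-Age=60; Path=/";
        set resp.http.Retry-After = "60";
        set resp.http.Content-Type = "text/html; charset=utf-8";
        synthetic({"<html><body><h1>Sorry, please try again soon</h1></body></html>"});
        return (deliver);
    }
}
```

Because the verdict travels with the visitor rather than with individual requests or connections, an admitted visitor keeps a consistent experience for their whole session, which is exactly the property the earlier mechanisms lack.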

Server state versus client state

If you decide to use cookies for throttling, there is another decision to be made: where to keep track of which cookies are throttled and which are allowed.

A simple mechanism is to specify in the cookie value itself whether to throttle or allow. This has very low resource requirements on your infrastructure (which is already under load at this point) but carries the risk that a savvy visitor may manipulate the cookie value to grant themselves access when they would otherwise be throttled. This can be mitigated by signing or encrypting the cookie value, but that comes at the cost of CPU time on your infrastructure.
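
One way to make a client-held verdict tamper-resistant is to sign the cookie value. The fragment below extends the vcl_recv and vcl_deliver subroutines from the earlier sketch and assumes the third-party digest vmod (libvmod-digest) is available; the hmac_sha256 call, the "s3cret" key, and the cookie format are assumptions used for illustration.

```vcl
import digest;   # third-party libvmod-digest, assumed to be installed

sub vcl_recv {
    if (req.http.Cookie ~ "throttle=allow:") {
        # Extract the signature presented by the client and recompute it.
        # "s3cret" is a placeholder signing key.
        set req.http.X-Sig = regsub(req.http.Cookie,
            ".*throttle=allow:([^;]*).*", "\1");
        if (req.http.X-Sig != digest.hmac_sha256("s3cret", "allow")) {
            # Tampered or stale cookie: strip it and treat the visitor as new.
            set req.http.Cookie = regsub(req.http.Cookie,
                "throttle=allow:[^;]*(; )?", "");
        }
    }
    # ...the admission logic from the earlier sketch follows here...
}

sub vcl_deliver {
    if (req.http.X-Throttle == "allow") {
        set resp.http.Set-Cookie = "throttle=allow:"
            + digest.hmac_sha256("s3cret", "allow")
            + "; Max-Age=1800; Path=/";
    }
}
```

A real deployment would sign a per-visitor identifier together with the verdict rather than a constant string, so that a single captured cookie cannot simply be shared around.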

The other downside of this simple approach is that it is difficult to know how many visitors are allowed or throttled at any given time, which means you will probably end up throttling a percentage of visitors rather than capping an absolute number.

The alternative is to track the state of a given visitor cookie on your infrastructure. Doing this well typically requires a shared, fast, and highly-available data store accessible by all the servers responsible for implementing the throttling. You may prefer to invest the time and money required for that setup in your primary website infrastructure instead.

However, if you can do server-side state tracking for throttling, it can enable features like:

  • Throttling to an absolute number of visitors instead of a ratio.
  • Providing a guide to throttled visitors about how long the wait may be.
  • Allowing the throttled visitors waiting longest to access the site first when traffic subsides.
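
As a rough illustration of how that might hang together, here is a deliberately hypothetical VCL fragment: the vstate vmod and its functions do not exist and merely stand in for whatever client your shared data store (Redis, for example) provides, and the limit of 500 concurrent visitors is likewise an arbitrary example.

```vcl
# NOTE: "vstate" is a hypothetical vmod used purely for illustration; its
# functions stand in for whatever shared-data-store client you deploy.
import vstate;

sub vcl_recv {
    if (req.http.Cookie !~ "throttle=allow") {
        # Admit new visitors only while fewer than 500 sessions are active
        # (an absolute number rather than a percentage).
        if (vstate.active_sessions() < 500) {
            set req.http.X-Throttle = "allow";   # vcl_deliver sets the cookie
            vstate.register_session();           # count this visitor in the shared store
        } else {
            # Queue the visitor; the position can be shown on the 'try again
            # soon' page ("roughly N visitors ahead of you") and used to let
            # the longest-waiting visitors in first once traffic subsides.
            set req.http.X-Queue-Position = vstate.queue_position();
            return (synth(429));
        }
    }
    # Admitted visitors fall through to normal request handling.
}
```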

Ultimately, whatever mechanism you implement, throttling should be a short-term solution until you can improve your website to handle larger numbers of concurrent visitors.
