Two of the questions we hear most often at Section are “What counts as a page view?” and “How does bot traffic impact my website?” At Section, as at most other Content Delivery Networks, a page view is counted as any page that serves an HTTP response with a status code of 200 (meaning the page was delivered correctly) and a content type of text/html. Importantly, this includes bot traffic, which is typically not counted in the page view statistics of Google Analytics or other metrics services.
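The rule above can be sketched in a few lines of Python. The function name and inputs here are illustrative, not Section's actual counting logic; the point is simply that the decision rests on the status code and the Content-Type header:

```python
def counts_as_page_view(status_code: int, content_type: str) -> bool:
    """Return True if a response would be counted as a page view:
    status 200 and a Content-Type of text/html (ignoring any
    charset parameter such as '; charset=utf-8')."""
    is_ok = status_code == 200
    is_html = content_type.split(";")[0].strip().lower() == "text/html"
    return is_ok and is_html

# A normal HTML page is counted; an image or a 404 error page is not.
print(counts_as_page_view(200, "text/html; charset=utf-8"))  # True
print(counts_as_page_view(200, "image/png"))                 # False
print(counts_as_page_view(404, "text/html"))                 # False
```

Note that nothing in this rule asks *who* made the request, which is why a bot fetching your homepage counts exactly like a person doing so.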
A 2014 study found that bots account for 56% of all Internet traffic, and at Section we have found that, especially for smaller sites, bots can account for 50-75% of total traffic.
Google Analytics and bot traffic
So why the discrepancy in page view numbers? It comes down to how Google Analytics counts a page view versus how your website server and Section’s CDN see traffic. Google Analytics works by inserting a JavaScript snippet into the head of your pages. That snippet records a page view whenever a visitor’s browser executes it, and most bots do not execute JavaScript.
“Good bots,” such as those Google itself uses to crawl and index your site for SEO purposes, will follow the directions you give them in your robots.txt file when crawling pages, and will not send data to Google Analytics or any other tracking that relies on JavaScript.
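As a reminder of what those directions look like, here is a minimal robots.txt sketch; the paths and sitemap URL are placeholders, not recommendations for any particular site:

```text
# Served at https://www.example.com/robots.txt
# Apply to all crawlers: skip pages that should not be indexed.
User-agent: *
Disallow: /cart/
Disallow: /search

# Point crawlers at the sitemap so they find every product page.
Sitemap: https://www.example.com/sitemap.xml
```

Well-behaved crawlers fetch this file before crawling and honor the Disallow rules, which is one lever for reducing the number of pages a bot requests.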
However, even bots that do not trigger the JavaScript on your pages are still served the same HTML document and subsequent assets a real user gets. Because your website server or CDN does the same amount of work to generate that HTML document, bots are counted in page views by Section and other CDNs.
How to manage bot traffic with headers and caching
These good search engine bots are beneficial for your site: site speed has been shown to be a factor in SEO, so serving bots content quickly is good, and you want them to index your site correctly so you rank for the keywords on your pages. However, small websites with a large number of pages (for example, ecommerce sites with many individual product pages) may balk at the percentage of their traffic that comes from bots when it affects the price of hosting or Content Delivery Network services. In the example site below, the top 2 browsers are both bots, and 6 of the top 10 browsers are bots.

To manage the amount of bot traffic your website receives, we recommend some of the strategies outlined in this article on using headers to improve crawl efficiency. With cache-control and validation headers, you can tell bots whether a document has been modified since their last request, and you can set long expiry times on resources that rarely change.
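The validation idea can be sketched as follows: the server (or CDN) compares the crawler’s If-Modified-Since header against the resource’s Last-Modified time and answers 304 Not Modified when nothing has changed, so the full HTML body is neither regenerated nor re-sent. This is a simplified model of standard HTTP conditional requests, with illustrative function names:

```python
from email.utils import parsedate_to_datetime
from typing import Optional

def respond(last_modified: str, if_modified_since: Optional[str]) -> int:
    """Return the status code for a conditional GET.

    last_modified:     the resource's Last-Modified timestamp (HTTP-date)
    if_modified_since: the If-Modified-Since header sent by the crawler,
                       or None if it has no cached copy
    """
    if if_modified_since is None:
        return 200  # first visit: send the full document
    resource_time = parsedate_to_datetime(last_modified)
    client_time = parsedate_to_datetime(if_modified_since)
    # Unchanged since the crawler's copy: a bodyless 304 is enough.
    return 304 if resource_time <= client_time else 200

stamp = "Wed, 01 May 2024 10:00:00 GMT"
print(respond(stamp, None))   # 200: crawler has no cached copy
print(respond(stamp, stamp))  # 304: unchanged, body not re-sent
```

Combined with a long Cache-Control max-age on static assets, this lets well-behaved bots revalidate cheaply instead of re-downloading every page.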
Questions about how many page views you have, or about how to speed up and secure your site for better SEO? Contact a member of our team today; we’d be happy to help.