Have you ever found it difficult to determine exactly where your traffic is coming from? For many website owners, working out which visits come from actual humans and which come from bots, spiders or scrapers is a difficult and tedious task.

What are Bots?

According to an annual report published on CNET, more than 61% of all web traffic in 2013 was generated by non-humans. Below is a breakdown of where that traffic came from:


[Image: breakdown of 2013 web traffic by human visitors, good bots and malicious bots]

Image Source: cnet.com

Even though this data can seem scary, it is important to remember that there are also good bots. These are usually just spiders crawling your site in order to organise and index your content.

However, with good always comes evil! Malicious bots, otherwise known as spammers, hackers, scrapers or impersonators, are something every site owner should be looking out for. These bots are out to cause chaos, and they can lull web publishers into a false sense of security: a site may appear to be attracting a healthy number of visitors when, in actual fact, the vast majority of those visits come from nasty little spam bots.

Spam bots tend to scan online forms so that the information can be used for targeted adverts. Hacker bots, meanwhile, can bypass URL security and encryption measures, download confidential information and generate fraudulent content such as fake form submissions and spam comments.

Google’s New Filter

Luckily Google, being the superhero it is, has brought out a brand new filter to combat the effects of malicious bots. On July 30th, Google announced a new Analytics filter that helps sites sift through their traffic and separate human visits from non-human ones, meaning there should now be far less confusion over the amount of unknown traffic a website receives.

Once you have checked the box (I will demonstrate below), this new feature will automatically filter out all spiders and bots listed on the IAB/ABC International Spiders and Bots List. The list comprises thousands of bots, has been continuously added to since 2006, and membership typically costs around £9,000 per year. With the new filter feature, however, Google is giving you the benefit of this list for free.
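To give a rough idea of the principle at work, the sketch below compares each hit's user agent string against a list of known bot and spider signatures and drops the hits that match. This is only an illustration: the signatures shown are made up for the example and are not taken from the IAB/ABC list, and it is not Google's actual implementation.

```python
# Toy illustration of bot filtering by user agent matching.
# The signature list below is illustrative only.
KNOWN_BOT_SIGNATURES = ["googlebot", "bingbot", "ahrefsbot", "spider"]

def is_known_bot(user_agent: str) -> bool:
    """Return True if the user agent matches a known bot or spider signature."""
    ua = user_agent.lower()
    return any(signature in ua for signature in KNOWN_BOT_SIGNATURES)

hits = [
    {"page": "/pricing", "user_agent": "Mozilla/5.0 (Windows NT 6.1) Chrome/36.0"},
    {"page": "/pricing", "user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
]

# Keep only the hits that do not come from a known bot or spider.
human_hits = [hit for hit in hits if not is_known_bot(hit["user_agent"])]
print(human_hits)  # only the Chrome visit remains
```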

So How Do We Use This New Filter?

To make use of this new filter, follow these steps:

– Click through to your Admin settings in Google Analytics

– Under the “View” panel, click “View Settings”

– Near the bottom of the page you will find “Bot Filtering”

– Check the box that reads “Exclude all hits from known bots and spiders”

– Your data should now exclude hits from known bots and spiders

[Screenshot: the Bot Filtering checkbox in Google Analytics View Settings]
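If you prefer to manage settings programmatically, the same option is exposed as a botFilteringEnabled field on each view (profile) in the Google Analytics Management API. Below is a minimal sketch in Python, with placeholder account, property and view IDs and a hypothetical service-account key file; treat it as an outline rather than an official Google example.

```python
# Minimal sketch: enable bot filtering on a view via the
# Google Analytics Management API (v3).
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/analytics.edit"]
KEY_FILE = "service-account.json"    # hypothetical credentials file
ACCOUNT_ID = "12345678"              # placeholder account ID
WEB_PROPERTY_ID = "UA-12345678-1"    # placeholder property ID
VIEW_ID = "98765432"                 # placeholder view (profile) ID

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES)
analytics = build("analytics", "v3", credentials=credentials)

# Patch the view so hits from known bots and spiders are excluded,
# mirroring the "Exclude all hits from known bots and spiders" checkbox.
analytics.management().profiles().patch(
    accountId=ACCOUNT_ID,
    webPropertyId=WEB_PROPERTY_ID,
    profileId=VIEW_ID,
    body={"botFilteringEnabled": True},
).execute()
```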

Google has not yet perfected the new filter, and due to the sheer number of bad bots out there, you will not be able to see exactly which bots are visiting your site. Nor will you be able to filter out specific bots.

As well as this, it is important to note that new bots appear all the time, and although the IAB/ABC list is constantly updated, a few may still slip through the net. Even so, you should see a great deal of difference in your figures from using this new tool.