
Bots have been a constant thorn in the side of most companies operating online for several years now. With the recent drastic change in how we sell and interact with customers (online channels are, in many cases, now the only avenues through which to operate), bots have become more noticeable and can significantly affect your KPIs if not mitigated.

Our teams have developed a number of tactics to manage bot traffic and bad behaviour, and to mitigate the impact of these bots – this post draws on that knowledge to give you a solid starting point.


What are bots?

A bot (short for “robot”) is a program that operates as an agent for a user or another program, or that simulates human activity. There are many different types of bots, some malicious and some not.

Good Bots come from well-known companies like Google, Pinterest, Yahoo, and Bing. These bots collect information from your website to power their services. They are crucial to your online presence and search rankings.

Bad Bots, on the other hand, include comment spammers, SQL injection worms, vulnerability scanners, content scrapers, and more.


But why are bots a problem?

Bot traffic (including scrapers, hackers, spammers, and impersonators) has been estimated to be as high as 61% of all internet traffic, and we have seen levels from 30% to 90% in some of our client environments.

‘Bad’ bots can steal data or even take a site down; in fact, more than 95% of all website attacks are carried out by malicious bots. Even benign bots can cause problems by consuming precious system resources, and if bot traffic is not taken into account when projecting traffic patterns, the environment may end up under-scoped.

Because of the COVID-19 health crisis, many organizations are relying on their online channels as their only source of revenue. If those channels are impaired in any way, whether by overloaded system resources, security risks that compromise user data, or other forms of attack, revenue stops.


So what can be done about them?

There are several ways to manage bot traffic and its potential impact:

  • Monitor bot activity – if you don’t know it is happening, you can’t mitigate it. There are numerous tools on the market to monitor bot activity over time; the best ones identify bots using many different rules and models. (A simple log-analysis sketch follows this list.)
  • Leverage a WAF (Web Application Firewall) to block bot traffic. If you use a CDN, most providers offer this functionality.
  • Serve different content to bots; you can display a less resource-intensive version of the site and protect your assets (see the routing sketch below).
  • Deploy a separate server that handles only bot traffic, keeping it away from user traffic. If bot traffic starts to negatively impact the site, your users are unaffected because they are on another server.
  • Lower the web server session limits for bots. For example, if the normal session timeout is 30 minutes, set it to 5 minutes for bots so their sessions are terminated faster (a sketch follows below).
  • Scale back Google’s crawl rate – for Googlebot this is configured in Google Search Console, and many other well-behaved crawlers honour a Crawl-delay directive in robots.txt.
  • “Tarpitting” – deploying clever bits of code against a bot to force it to spend more time and CPU per request, making its operation unprofitable for its creator (a toy example closes out the sketches below). Our partner Cloudflare has a great post outlining why they tarpit, as well as how they counteract the environmental impact of this clever technique.
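
As a starting point for the monitoring bullet, here is a minimal log-analysis sketch in Python. The log path and the “combined” log format (user agent as the last quoted field) are assumptions, and dedicated tools go far beyond this, since bad bots routinely forge their user agents:

```python
# Minimal sketch: summarise user agents in a web server access log to get a
# first read on bot traffic. Assumes the combined log format, where the user
# agent is the last quoted field, and a hypothetical log path.
import re
from collections import Counter

BOT_HINTS = ("bot", "crawl", "spider", "slurp", "scrape")  # self-identified bots
UA_PATTERN = re.compile(r'"([^"]*)"$')  # last quoted field on the line

counts = Counter()
bot_requests = 0
total = 0

with open("/var/log/nginx/access.log") as log:
    for line in log:
        match = UA_PATTERN.search(line.strip())
        if not match:
            continue
        user_agent = match.group(1)
        total += 1
        counts[user_agent] += 1
        if any(hint in user_agent.lower() for hint in BOT_HINTS):
            bot_requests += 1

print(f"{bot_requests}/{total} requests ({bot_requests / max(total, 1):.0%}) "
      f"identified themselves as bots")
for user_agent, n in counts.most_common(10):
    print(f"{n:8d}  {user_agent}")
```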
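
For the “serve different content to bots” tactic, here is a sketch using Flask; the framework choice, route, file name, and user-agent heuristic are all illustrative assumptions:

```python
# Minimal sketch: route self-identified bots to a lightweight, cache-friendly
# page while humans get the full dynamic experience.
from flask import Flask, request

app = Flask(__name__)

BOT_HINTS = ("bot", "crawl", "spider", "slurp")

def is_bot(user_agent: str) -> bool:
    # Crude check: well-behaved crawlers identify themselves in the UA string.
    return any(hint in user_agent.lower() for hint in BOT_HINTS)

@app.route("/products")
def products():
    if is_bot(request.headers.get("User-Agent", "")):
        # Bots get a pre-rendered static page: no personalisation, no
        # inventory lookups, no recommendation calls.
        return app.send_static_file("products_lite.html")
    # Placeholder for the real, resource-intensive page.
    return "<html><body>Full, personalised product page</body></html>"
```

The same split can also be made at the CDN or load balancer, which is how you would route bots to the dedicated server described in the list above.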
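
The shorter-session idea is normally configured in the web or application server’s session store rather than hand-rolled; this in-memory sketch, using the 30-minute and 5-minute numbers from the list above, just illustrates the mechanics:

```python
# Minimal sketch: give bots a shorter session TTL so their sessions are
# reclaimed faster. A real deployment would set this in the session store
# (e.g. Redis key expiry) or in the application server's session config.
import time
import uuid

HUMAN_TTL = 30 * 60  # normal 30-minute timeout
BOT_TTL = 5 * 60     # bots get 5 minutes

sessions = {}  # session_id -> expiry timestamp

def create_session(user_agent: str) -> str:
    ttl = BOT_TTL if "bot" in user_agent.lower() else HUMAN_TTL
    session_id = uuid.uuid4().hex
    sessions[session_id] = time.time() + ttl
    return session_id

def session_is_valid(session_id: str) -> bool:
    expiry = sessions.get(session_id)
    return expiry is not None and time.time() < expiry
```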
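
Finally, a toy version of tarpitting: drip-feed the response to a suspected bot so each request costs it far more time than it costs you. The “sqlmap” check is a stand-in for whatever signals your bot-monitoring tooling provides:

```python
# Toy tarpit sketch: stream a tiny page one character at a time to a
# suspected bot, stretching the response out over ~45 seconds.
import time
from flask import Flask, Response, request

app = Flask(__name__)

def looks_malicious(req) -> bool:
    # Stand-in heuristic; real systems use the rules and models from
    # your bot-monitoring tooling.
    return "sqlmap" in req.headers.get("User-Agent", "").lower()

@app.route("/")
def index():
    if looks_malicious(request):
        def drip():
            for char in "<html>loading...</html>":
                time.sleep(2)  # make the bot wait for every byte
                yield char
        return Response(drip(), mimetype="text/html")
    return "<html><body>Normal page</body></html>"
```

Note that a synchronous server holds a worker for each tarpitted connection, so production tarpits (like Cloudflare’s) are engineered to impose cost on the bot without tying up your own capacity.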


If your site is experiencing bot traffic, we hope the tips above will help.

Right now, the actions we take to support online commerce channels are crucial. If you need help managing bot traffic, contact us today. We specialize in supporting ecommerce platforms and work with our clients to ensure the performance, security, and scalability of their online channels – no matter the challenges we face.