Bots are not inherently good or bad, but they can be used with good or bad intent. Working with web engineers and data scientists who can reveal patterns of traffic behaviour on your website will give you clear insight into what distinguishes good bots from bad ones.
What is a good bot and what is a bad bot?
Here are some common examples of good bots and their identifying characteristics:
- Web scrapers, for instance, can be extremely helpful to online businesses and play a vital role in driving highly relevant traffic to the organisation’s website. They helpfully gather large amounts of data from websites, combing through a site’s source code in their hunt for the information they’ve been scripted to locate.
- Search engine spiders are a useful example of a commonly used web scraper with good intent. Search engine spiders crawl websites, pulling together all sorts of relevant information such as copy, headlines, alt tags and product pricing to determine where that site should be indexed in the search engine results pages (SERPs).
- Analytics tracking software which monitors visitors to websites and records their actions. This information is used to help guide the owner of that site in terms of future functionality and content improvements.
- Chatbots and AI/machine learning software, for example Facebook’s Messenger bot or Google Assistant. These bots automate routine processes and free up valuable time for the organisations that use them, whether a large brand, a small business or an individual user.
Without these clever good bots, no one would be able to find your website using words and phrases that are relevant to your product or service.
However, it’s important to remember that, unlike bad bots, good bots can be managed: you can control how they crawl your site and which pages you want indexed by defining rules in your website’s robots.txt file.
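As a minimal sketch (the paths and sitemap URL below are hypothetical examples, not recommendations for any particular site), a robots.txt file placed at the root of your website might look like this:

```
# Apply these rules to every crawler that honours robots.txt
User-agent: *
# Keep crawlers out of private or low-value sections (hypothetical paths)
Disallow: /admin/
Disallow: /checkout/
# Help good bots find the pages you do want indexed (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

Bear in mind that compliance with robots.txt is voluntary: reputable crawlers such as search engine spiders respect it, but bad bots simply ignore it.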
Bad bots, on the other hand, cannot be regulated this way. By their nature, they are programmed to cause harm in one way or another, which is why it’s important to detect bot traffic and behaviour quickly, determine its intent and mitigate bad bots.
Let’s return to web scrapers. These bots can be very useful to a business, but they can also be extremely harmful.
For example, a competitor might use scraper bots to monitor your prices, undercut them and drive your potential customers to their own site.
There is a range of bad bot use cases that your business must be aware of:
- Defaming your organisation, for instance by publishing embarrassing information on social media.
- Taking over your email accounts and spamming everyone you know, a classic spambot tactic.
- Installing viruses and malware onto your site so that it can be used to perform other illegal activities or infect visitors in return for money.
- Hijacking your bandwidth to carry out DDoS attacks. This is when bad bot traffic is used to flood your site with junk requests, overloading the server and ultimately bringing it down.
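One common mitigation for this kind of junk-request flood is rate limiting. The sketch below is a minimal, illustrative token-bucket limiter in Python (the rate and capacity values are invented for the example); real mitigation happens at the network or CDN layer, not in a few lines of application code:

```python
import time

class TokenBucket:
    """Toy token-bucket rate limiter: allows roughly `rate` requests per
    second, with bursts of up to `capacity` requests (illustrative only)."""

    def __init__(self, rate, capacity):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens in proportion to the time elapsed since the last call.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        # Request exceeds the allowed rate: likely part of a junk flood.
        return False

# Simulate a burst of 20 rapid requests against a 5 req/s, burst-10 limit.
bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(20)]
# Roughly the first 10 (the bucket capacity) pass; the rest are rejected.
print(results.count(True), results.count(False))
```

The same idea, applied per client IP at the edge, lets legitimate visitors through while throttling the abnormally high request rates typical of bot floods.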
- Stealing your content or reselling it without your permission.
- Posting fake reviews of your business, publishing malicious links to malware sites etc.
Some bad bots are very sophisticated and difficult to detect. These include:
- Content scrapers, which steal your copy, such as product descriptions, and publish it on their own site with links pointing back to their content. They can use this content themselves or sell it on to other websites for financial gain. Websites that steal from you like this are known as ‘content farms’, while the requests these scrapers generate are known as ‘crawler traffic’.
- Black hat link-building tools, which attempt to manipulate search engine rankings using unscrupulous tactics, for instance by creating thousands of new links to your competitors’ website. The more links a site has pointing at it, the higher it is likely to rank for the relevant keywords.
- Fake traffic generation tools, which create fake pages on your website or on other websites and send visitors there, fooling search engines into thinking these are visits from real people. This can cause you to rank highly on Google for keywords with little value (low cost per click) that never convert into real business leads.
These are just some of the most common bad bot use cases. There are many more terrifying things they can do if not identified and mitigated quickly. Removing bad bots involves identifying them first, which is where machine learning comes into play.
Machine learning allows you to train an algorithm to recognise what good (or bad) bot traffic looks like, so you can take action against attacks before they cause damage. Make sure you train your algorithm on genuine traffic so that you can identify fake bot traffic when it appears.
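As a minimal illustration of the idea (the features, traffic values and labels below are entirely invented, and real detection systems use far richer signals and models), a simple nearest-centroid classifier trained on labelled traffic might look like this:

```python
# Toy sketch: separate human from bot traffic using two invented features,
# requests per minute and average seconds between clicks.
labelled = [
    # (requests_per_min, avg_secs_between_clicks, label) -- made-up data
    (3,   20.0, "human"),
    (5,   12.0, "human"),
    (4,   18.0, "human"),
    (90,   0.4, "bot"),
    (120,  0.2, "bot"),
    (75,   0.6, "bot"),
]

def centroid(label):
    """Average feature profile of all examples with the given label."""
    rows = [(r, s) for r, s, lbl in labelled if lbl == label]
    return tuple(sum(col) / len(rows) for col in zip(*rows))

centroids = {label: centroid(label) for label in ("human", "bot")}

def classify(requests_per_min, avg_secs):
    # Nearest-centroid: assign the label whose average profile is closest
    # (squared Euclidean distance in feature space).
    def dist(c):
        return (requests_per_min - c[0]) ** 2 + (avg_secs - c[1]) ** 2
    return min(centroids, key=lambda label: dist(centroids[label]))

print(classify(100, 0.3))  # a rapid-fire visitor: classified as "bot"
print(classify(6, 15.0))   # a slow, human-paced visitor: "human"
```

The "training" here is just averaging labelled examples, which is why the quality of the labels matters: if your genuine-traffic examples are contaminated with bot sessions, the learned profile of "human" drifts and fake traffic slips through.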
Differences between good bots and bad bots
The main difference between good and bad bots is their intent. If a bot is being used for malicious reasons, it’s classified as a ‘bad bot’, and it can also serve as an ‘attack vector’: a route for delivering cyber attacks such as phishing, DDoS and other computer network exploitation techniques.
Bad bots can often be identified by their behaviour. For example, if your competition has employed web scraping software to extract information about you without your knowledge, they are running a form of bad bot. Another example would be negative comments about your business posted on social media platforms such as Facebook which, in reality, were generated by fake accounts set up purely to post malicious content.
On the other hand, good bots are often used to improve your website’s search engine performance by indexing its content. They can also help distribute links around a site and therefore aid SEO efforts. Some good bots are even designed to prevent malicious attacks, for example stopping spam emails from ever reaching an inbox through anti-spam and anti-phishing filters.