Part Four: A Gap in Understanding and Responsibility
By Sabrina / 14th Sep 2020
So far in our blog series, we have seen that awareness of the threats bots pose is high across all industries. We recently surveyed 200 UK enterprises across e-Commerce, financial services, entertainment and travel, and in part three of the series we discussed how bots are affecting each of these industries.
There is little need to explain to businesses that bots can be a problem. They know bots exist, they know a threat is out there, and they know the harm that can be done. However, our research shows that many businesses don't fully understand the risks, the true level of bot activity, or how that activity can be properly mitigated.
Are bots only a threat to websites?
To further gauge respondents' understanding of bot activity and where that traffic exists, we asked businesses which parts of their online estate they perceive to be most at risk. Websites are seen as the main target, with application programming interfaces (APIs) perceived as least at risk across all industries surveyed. Travel, e-Commerce and financial services businesses all rank their website as the most likely to suffer an attack; entertainment businesses, however, identify their mobile app as the most likely target.
Many businesses rank mobile and website as roughly equally likely to suffer a bot attack, with APIs coming in third. This could be due to a lack of available APIs, but it is far more likely to reflect a lack of awareness, visibility or consideration of bots hitting a business's API.
Our own internal research has seen a rise in the targeting of APIs and mobile apps. If bot mitigation on a website is excellent but lacking for the mobile app and APIs, cybercriminals will simply switch their attacks to the weaker channels.
The survey results are suggestive of businesses’ current understanding of how bots use their web application resources.
Businesses underestimate the quantity of bot traffic
Businesses believe that around 15% of their web application resources are taken up by bots. This is a low estimate, and there are several likely reasons for it: they may be counting only self-identifying bots, or relying on ineffective detection tools and methods.
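To illustrate why counting only self-identifying bots undercounts, here is a minimal sketch (our own illustration, not a method described in the survey) that estimates bot share purely from User-Agent strings. Sophisticated bots spoof browser user agents, so any method like this systematically misses them:

```python
# Minimal sketch: estimating bot share from server logs by matching
# self-identifying User-Agent strings. Sophisticated bots spoof browser
# user agents, so this approach systematically undercounts bot traffic.
SELF_IDENTIFYING = ("googlebot", "bingbot", "curl", "python-requests", "scrapy")

def naive_bot_share(user_agents):
    """Fraction of requests whose User-Agent admits to being a bot."""
    if not user_agents:
        return 0.0
    bots = sum(1 for ua in user_agents
               if any(marker in ua.lower() for marker in SELF_IDENTIFYING))
    return bots / len(user_agents)

# Example log: the scraper hiding behind a Chrome user agent is missed entirely.
log = [
    "Mozilla/5.0 (compatible; Googlebot/2.1)",
    "python-requests/2.28.1",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/114.0",  # spoofed scraper
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) Safari/605.1.15",
]
print(naive_bot_share(log))  # 0.5 — the spoofed bot slips through uncounted
```

The spoofed scraper on line three is counted as human, so the reported bot share (50%) understates the true figure (75%), mirroring the gap our survey suggests.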
A Netacea client – a global sportsbook organisation – was able to drastically reduce the load on its web-facing infrastructure when Netacea successfully stopped scraper bots on its website.
A lack of visibility and ownership
Overall, our findings show the gap between what businesses think they know and what they actually know.
Businesses are confident in their bot mitigation, but the statistics don't support this confidence: they reveal that current measures are not enough. A lack of visibility, understanding and ownership means businesses are losing sight of attacks and are therefore exposed to greater threats.
It's critical that businesses understand the problem of sophisticated bots. Without a deeper understanding of how bots work, they risk significant financial losses and the erosion of their customers' loyalty.
Organisations need fast and accurate visibility of all traffic to their website, mobile apps and APIs to effectively identify and stop malicious bot activity. This can be achieved with server-side bot detection. Netacea’s pioneering approach to server-side bot management analyses millions of requests to produce a constant stream of real-time recommendations.
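As a rough illustration of one server-side signal such systems can use, the sketch below flags clients whose request rate exceeds a plausible human threshold over a sliding window. This is a hypothetical example with assumed thresholds, not Netacea's actual detection logic, which analyses far richer signals across website, mobile and API traffic:

```python
# Hypothetical server-side signal: flag a client as likely automated when
# its request rate over a sliding window exceeds a human-plausible limit.
# Thresholds are assumptions for illustration, not real product settings.
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_HUMAN_REQUESTS = 20  # assumed ceiling for a human in a 10s window

class RateDetector:
    def __init__(self):
        # client id -> deque of request timestamps inside the window
        self.history = defaultdict(deque)

    def record(self, client_id, timestamp):
        """Record a request; return True if the client now looks automated."""
        window = self.history[client_id]
        window.append(timestamp)
        # Evict timestamps that have aged out of the sliding window.
        while window and timestamp - window[0] > WINDOW_SECONDS:
            window.popleft()
        return len(window) > MAX_HUMAN_REQUESTS

detector = RateDetector()
# A scripted client firing ~30 requests per second is flagged quickly...
fast = [detector.record("scraper-1", 0.033 * i) for i in range(30)]
# ...while a human browsing at one request every few seconds never is.
slow = [detector.record("human-1", 3.0 * i) for i in range(10)]
print(any(fast), any(slow))  # True False
```

Because the check runs server-side against raw request logs, it sees the same traffic regardless of whether it arrived via the website, a mobile app or an API, which is exactly the cross-channel visibility the survey suggests businesses lack.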