As a security professional, it's your job to be aware of the kind of threats that your website is vulnerable to. But do you know how much of your traffic is made up of bots and the impact they are having on your web infrastructure and content?
The chances are your site has a bot problem. Unwanted bots impair performance and increase bandwidth and data centre costs, whilst web scrapers are likely to be compromising the security and competitiveness of your site. Analysis of traffic across the Akamai Intelligent Platform™ shows that bots typically comprise between 40 and 60 percent of an organisation's total web traffic, and in some cases less than 10 percent of site traffic comes from legitimate users.
Bots are not all made equal
Of course, not all bots are bad; indeed, some are essential. Those from search engines, for example, are a key part of making sure your website ranks well in search results. Others may come from suppliers and business partners who need to access your site as a normal part of doing business. An effective bot management solution should offer the ability to prioritise these ‘good bots' rather than risk blocking them and damaging your competitive advantage.
A common response to the bot problem is to blindly block them all, but the solution is never that binary. Having access to a portfolio of customised responses to each variety of bot is the only effective way to maintain competitive advantage and reduce costs. For example, you may want to delay some bot responses to prioritise human traffic, respond with a stealth-like ‘silent deny' so as not to alert the bot operator, or even deliberately serve incorrect pricing information to throw off your competition. This is what managing bots is all about: making them work to your advantage, not simply suffering in silence and accepting them as a cost of doing business online. Managing bots effectively will help maintain the integrity of your valuable data and content, reduce bandwidth and infrastructure costs, enhance user experience and strengthen your competitive positioning.
Scraping the bottom of the barrel
We've mentioned website scraping, but bots can also be designed to ‘grab inventory' of limited supply items, such as tickets for concerts and sporting events. The bot operator can then re-sell them at a premium on their own site, resulting in the original seller losing their relationship and upsell opportunities with the consumer.
Simply denying bots is a mistake, as the bot operator will typically adapt the bot's signatures and re-emerge under a different guise. To manage them to your benefit you have to accurately identify them, categorise them and allocate an appropriate response for each type: silently deny, delay the response to prioritise legitimate users during peak times, serve alternative content, or allow. Solutions that simply expect you to allow or deny dramatically underestimate the complexity of the bot problem and often leave security professionals with a huge headache.
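The categorise-and-respond approach above can be sketched as a simple policy table. This is an illustrative example only; the category names, actions and default behaviour are assumptions for the sketch, not any specific vendor's API.

```python
# Hypothetical sketch of a tiered bot-response policy.
# Categories and actions are illustrative assumptions.
from enum import Enum

class Action(Enum):
    ALLOW = "allow"                      # e.g. search-engine crawlers, partners
    DELAY = "delay"                      # deprioritise during peak traffic
    SILENT_DENY = "silent_deny"          # drop without alerting the operator
    SERVE_ALTERNATE = "serve_alternate"  # e.g. decoy pricing content

# Policy table mapping identified bot categories to responses.
POLICY = {
    "search_engine": Action.ALLOW,
    "business_partner": Action.ALLOW,
    "aggressive_scraper": Action.SILENT_DENY,
    "price_scraper": Action.SERVE_ALTERNATE,
}

def decide(category: str) -> Action:
    """Return the configured response for a bot category.

    Unrecognised categories fall back to a delay rather than a hard
    deny, so legitimate-but-unknown clients are slowed, not blocked.
    """
    return POLICY.get(category, Action.DELAY)
```

The key design point is that the default is neither allow nor deny: unknown traffic is degraded gracefully while it is being classified, which matches the article's warning against binary allow/deny solutions.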
The bots are attacking you – protect yourself
The bad bots are attacking your site, so effective security strategies need to be employed to stop them robbing your customers of a great user experience, and you of content, revenue and competitive advantage.
The evolving nature of ‘bad bots' requires a sophisticated, dynamic solution that combines real-time statistical and behavioural analysis to identify their changing deception methods and any malicious intent. Static lists of identified bad bots, on the other hand, rapidly become stale and ineffective.
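To illustrate the contrast with a static list, here is a minimal sketch of one behavioural signal: flagging clients whose request rate over a sliding window exceeds a threshold. The window length and threshold are assumed values for the example; real systems tune these per site and combine many such signals.

```python
# Illustrative behavioural check (not a production detector): flags a
# client when its recent request rate exceeds a threshold, adapting to
# behaviour instead of relying on a static denylist.
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # assumed sliding-window length
MAX_REQUESTS = 50     # assumed per-window request budget

# client_id -> timestamps of that client's recent requests
_history: dict[str, deque] = defaultdict(deque)

def looks_like_bot(client_id: str, now: float) -> bool:
    """Record a request at time `now` and report whether this client's
    behaviour over the sliding window exceeds the allowed rate."""
    window = _history[client_id]
    window.append(now)
    # Discard timestamps that have aged out of the window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_REQUESTS
```

Because the judgement is recomputed on every request, a bot that changes its signature but keeps hammering the site is still caught, whereas a static list would have to be manually updated each time the operator re-emerges under a different guise.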
Bad bots are not going to go away, and why would they? Bot operators are highly motivated by the financial gain their software reaps from exploiting your valuable web assets. If you want to win this ‘battle against the bots', it's time to take back control of your web traffic.
Contributed by Alistair Tooth, cloud security director, Akamai Technologies