
Botnets create up to 80,000 daily queries on search engines


Botnets are being used to send huge numbers of queries to search engines.

With almost 80,000 daily queries detected on some subjects, attackers can build a filtered list of potentially exploitable sites in a very short time with minimal effort. And because the searches are conducted through botnets rather than from the hacker's own IP address, the attacker's identity remains concealed.

Amichai Shulman, CTO of Imperva, said that hackers have become experts at using Google to create a map of hackable targets on the web, which allows them to be more productive when targeting attacks.

"These attacks highlight that search engine providers need to do more to prevent attackers from taking advantage of their platforms,” he said.

Imperva said that search engines deploy detection mechanisms based on the IP address of the originating request in order to block automated search campaigns. However, hackers easily overcome these mechanisms by distributing their queries across botnets.
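To illustrate why purely IP-based detection falls short, here is a minimal sketch of a per-IP rate limiter of the kind described above. It is hypothetical (the threshold, addresses and counts are illustrative, not taken from Imperva or any search engine): a campaign spread across a botnet keeps each individual address well under the per-IP limit, so no single source ever trips it.

```python
from collections import defaultdict

# Hypothetical per-IP limit a search engine might enforce per day
# to block automated query campaigns (illustration only).
QUERIES_PER_DAY_LIMIT = 100

query_counts = defaultdict(int)  # source IP -> queries seen today

def allow_query(source_ip: str) -> bool:
    """Allow the query unless this single IP has exceeded the daily limit."""
    query_counts[source_ip] += 1
    return query_counts[source_ip] <= QUERIES_PER_DAY_LIMIT

# 22,000 queries a day from one host is blocked almost immediately...
allowed_single = sum(allow_query("203.0.113.7") for _ in range(22_000))
print(allowed_single)  # only ~100 allowed

# ...but the same volume spread across a 500-host botnet stays under the
# per-IP threshold (44 queries per bot), so every query gets through.
query_counts.clear()
bots = [f"198.51.100.{i}" for i in range(1, 501)]  # hypothetical bot addresses
allowed_botnet = sum(allow_query(bots[i % len(bots)]) for i in range(22_000))
print(allowed_botnet)  # all 22,000 allowed
```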

According to Imperva, during May and June its Application Defense Center observed a specific botnet examining dozens, and even hundreds, of returned results by using paging parameters in its queries.

This resulted in almost 550,000 queries being made during the observation period, peaking at 81,000 queries a day with an average of 22,000 a day. The attacker was able to take advantage of the bandwidth available to the dozens of controlled hosts in the botnet to seek out and examine vulnerable applications.

In terms of recommendations, Imperva said that search engine providers should start looking for unusual or suspicious queries, such as those that look for known sensitive files or database data files. It also recommended blacklisting internet service providers suspected of being part of a botnet, applying strict anti-automation policies (such as CAPTCHA), and identifying additional hosts that exhibit the same suspicious behaviour pattern in order to keep the IP blacklist up to date. A sketch of the first recommendation follows below.
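One simple way to act on that first recommendation is to screen incoming queries against known "dork" patterns. The sketch below is a hypothetical heuristic (the file names and patterns are illustrative, not a list published by Imperva or used by any search engine) that flags queries hunting for sensitive files or database dumps.

```python
import re

# Hypothetical patterns for queries that hunt for sensitive or database files
# (illustrative only; a real filter would be far more extensive).
SUSPICIOUS_PATTERNS = [
    r"filetype:(sql|bak|env|mdb)",              # database dumps and backups
    r"intitle:\"index of\".*(passwd|backup)",   # exposed directory listings
    r"inurl:(wp-config\.php|\.git)",            # config files and repositories
]

def is_suspicious_query(query: str) -> bool:
    """Flag a search query that matches a known sensitive-file 'dork' pattern."""
    return any(re.search(p, query, re.IGNORECASE) for p in SUSPICIOUS_PATTERNS)

# Flagged queries could then trigger a CAPTCHA challenge, and the requesting
# host could be added to the shared blacklist of suspected botnet members.
print(is_suspicious_query("site:example.com filetype:sql"))  # True
print(is_suspicious_query("best pizza near me"))              # False
```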
