
Botnets create up to 80,000 daily queries on search engines


Botnets are being used to run automated queries against search engines in huge numbers.

With almost 80,000 daily queries detected on some subjects, a filtered list of potentially exploitable sites can be assembled in a very short time and with minimal effort. And because the searches are conducted by botnets rather than from the hacker's own IP address, the attacker's identity remains concealed.

Amichai Shulman, CTO of Imperva, said that hackers have become experts at using Google to build a map of hackable targets on the web, and that this allows them to be far more productive when targeting attacks.

"These attacks highlight that search engine providers need to do more to prevent attackers from taking advantage of their platforms,” he said.

Imperva said that search engines deploy detection mechanisms based on the IP address of the originating request in order to block automated search campaigns. However, hackers easily overcome these mechanisms by distributing their queries across botnets.
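To illustrate why per-IP throttling is so easy to sidestep, here is a minimal sketch of a sliding-window rate limiter keyed on source IP. The threshold values and names are illustrative assumptions, not a description of any search engine's actual controls.

```python
import time
from collections import defaultdict, deque

# Assumed threshold: flag any single IP that issues more than
# MAX_QUERIES_PER_WINDOW queries within WINDOW_SECONDS.
MAX_QUERIES_PER_WINDOW = 30
WINDOW_SECONDS = 60

_history = defaultdict(deque)  # source IP -> timestamps of recent queries


def is_rate_limited(source_ip, now=None):
    """Return True if this query from source_ip should be blocked."""
    now = time.time() if now is None else now
    window = _history[source_ip]
    # Drop timestamps that have fallen outside the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    window.append(now)
    return len(window) > MAX_QUERIES_PER_WINDOW
```

A single host sending tens of thousands of queries a day trips a check like this immediately, but the same volume spread across a few thousand bots works out at only a handful of queries per IP per day, comfortably below any per-IP threshold - which is exactly the evasion Imperva describes.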

According to Imperva, its Application Defense Center observed a specific botnet examining dozens, and in some cases hundreds, of returned results by using paging parameters in its queries during May and June.
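That paging behaviour is itself a useful signal: ordinary users rarely page far beyond the first few screens of results. The following rough sketch flags clients that repeatedly request deep result pages from a query log. The "start" parameter name and the depth threshold are assumptions based on common search URL conventions, not Imperva's detection logic.

```python
from urllib.parse import urlparse, parse_qs

DEEP_PAGE_OFFSET = 100  # assumed threshold, roughly ten pages of results


def deep_paging_requests(log_lines):
    """Yield (client_ip, offset) for requests that page unusually deep."""
    for line in log_lines:
        client_ip, url = line.split(" ", 1)
        params = parse_qs(urlparse(url.strip()).query)
        offset = int(params.get("start", ["0"])[0])
        if offset >= DEEP_PAGE_OFFSET:
            yield client_ip, offset


# Hypothetical log entries in "<ip> <request URL>" form
sample_log = [
    "203.0.113.7 /search?q=inurl%3Aadmin.php&start=0",
    "203.0.113.7 /search?q=inurl%3Aadmin.php&start=110",
    "203.0.113.7 /search?q=inurl%3Aadmin.php&start=220",
]
for ip, offset in deep_paging_requests(sample_log):
    print(f"{ip} requested results from offset {offset}")
```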

This resulted in almost 550,000 queries being made during the observation period, peaking at 81,000 queries in a single day with a daily average of 22,000. The attacker was able to take advantage of the bandwidth available across the dozens of controlled hosts in the botnet to seek out and examine vulnerable applications.

In terms of recommendations, Imperva said that search engine providers should start looking for unusual, suspicious queries, such as those that hunt for known sensitive files or database data files. It also recommended blacklisting IP addresses suspected of being part of a botnet, applying strict anti-automation policies (such as CAPTCHA), and identifying additional hosts that exhibit the same suspicious behaviour pattern in order to keep the IP blacklist up to date, as sketched below.
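A simplified sketch of those last recommendations follows. The suspicious-query patterns and the blacklist threshold are illustrative assumptions rather than Imperva's actual rules: queries that hunt for known sensitive files are flagged, and source IPs that repeatedly issue them are added to a blacklist.

```python
import re
from collections import Counter

# Illustrative patterns for queries that hunt for sensitive files or
# database dumps; a production list would be far longer and curated.
SUSPICIOUS_QUERY_PATTERNS = [
    re.compile(r"filetype:(sql|bak|env|log)", re.IGNORECASE),
    re.compile(r"inurl:(wp-config|phpmyadmin|admin)", re.IGNORECASE),
    re.compile(r'intitle:"index of".*(backup|dump|passwd)', re.IGNORECASE),
]

BLACKLIST_THRESHOLD = 5  # assumed: flagged queries before an IP is blacklisted


def is_suspicious(query):
    """Return True if the search query matches a known sensitive-file pattern."""
    return any(p.search(query) for p in SUSPICIOUS_QUERY_PATTERNS)


def update_blacklist(query_log, blacklist):
    """query_log is an iterable of (source_ip, query) pairs.

    IPs that repeatedly issue suspicious queries are added to the blacklist,
    so hosts exhibiting the same behaviour pattern are caught together.
    """
    hits = Counter(ip for ip, query in query_log if is_suspicious(query))
    for ip, count in hits.items():
        if count >= BLACKLIST_THRESHOLD:
            blacklist.add(ip)
    return blacklist
```

In practice a search provider would combine a content filter like this with the rate and paging signals above, and would more likely respond with a CAPTCHA challenge than an outright block, but the sketch shows how a query filter and a shared-behaviour blacklist fit together.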

