Hackers said to target and wipe data on insecure Hadoop installations

News by Rene Millman

Big data on Hadoop Distributed File Systems appears to be in big trouble as attackers wipe poorly configured databases.

Hadoop Distributed File System (HDFS) installations have been attacked by hackers, leaving many systems with all their data wiped.

According to a blog post by Fidelis Cybersecurity Threat Research, the firm observed attacks on Hadoop infrastructure in a similar manner to internet-exposed MongoDB and Elasticsearch databases, which were held to ransom by criminals.

The firm said that there were no attempts to claim a ransom or any other communication – the data was simply deleted and a directory name was left as a calling card. It estimated the potential exposure of this attack at around 8,000 to 10,000 HDFS installations worldwide, although precise numbers are difficult to determine.

“A core issue is similar to MongoDB, namely the default configuration can allow ‘access without authentication.' This means an attacker with basic proficiency in (Hadoop Distributed File System) can start deleting files,” said the firm in a statement.
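To illustrate why so little proficiency is needed: in Hadoop's default "simple" authentication mode, the WebHDFS REST API on the namenode honours whatever `user.name` a client supplies in the query string, with no credentials checked. A minimal Python sketch of the kind of request URL an attacker could construct (the hostname here is a placeholder, and the request is deliberately not sent):

```python
from urllib.parse import urlencode

def webhdfs_delete_url(namenode_host: str, path: str,
                       user: str = "hdfs", port: int = 50070) -> str:
    """Build the WebHDFS REST URL that recursively deletes `path`.

    In the default 'simple' auth mode, the server trusts the
    client-supplied user.name parameter -- no password is required.
    """
    query = urlencode({"op": "DELETE", "recursive": "true", "user.name": user})
    return f"http://{namenode_host}:{port}/webhdfs/v1{path}?{query}"

# 'example.invalid' is a placeholder host; an HTTP DELETE to a URL like
# this against an exposed cluster would wipe the directory.
print(webhdfs_delete_url("example.invalid", "/user/data"))
```

The point is that this is plain HTTP against the same port 50070 discussed below, which is why unauthenticated clusters are trivially destructible once found.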

On or around 5-6 January, traffic to port 50070 soared as attackers scanned for open HDFS installations to target, it said. Port statistics from the SANS Internet Storm Center and Qihoo 360's Netlab show a significant spike in traffic when the attack occurred. Qihoo's data shows this traffic coming almost exclusively from a single Chinese IP address.
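The scanning itself amounts to little more than TCP connection attempts against port 50070 across address ranges. A minimal sketch of such a probe (using localhost as a stand-in target, not an actual sweep):

```python
import socket

def port_open(host: str, port: int = 50070, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

# 127.0.0.1 is a stand-in; a real scan iterates over many addresses
print(port_open("127.0.0.1"))
```

Scanners like the ones observed simply repeat this check at scale and record responsive hosts for later attack.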

“However, it's important not to jump to conclusions about the attacker's location simply by looking at an IP address. Attackers use infrastructure all over the world to hide their identities. Coincidentally, the second highest scanner is adjacent to our suspect,” said the firm.

A scan using Shodan showed that many installations also lack authentication. Fidelis said that it is unclear what the motivation of the attacker is, “but it seems like this was an intentional ‘security awareness training' exercise, albeit a criminal one”.

Javvad Malik, security advocate at AlienVault, told SC Media UK that organisations should look at their public-facing infrastructure and not be drawn into a game of whack-a-mole.

“Otherwise they'll be trying to secure MongoDB one week, Elasticsearch another, and Hadoop the next,” he said.

“For any public-facing system, enterprises should look to harden security. While most systems don't have strong security controls enabled by default, there are usually many built-in features that can be turned on. Limiting access to the admin console from the internet, changing default passwords, using encryption, and having monitoring and threat detection controls, among other measures, can all help to reduce the likelihood and impact of an attack.”

Mark James, IT security specialist at ESET, told SC that as with any database that's storing information, security should be of utmost importance.

“So many breaches or attacks happen because basic security measures are not in place. In many cases these are fairly simple and only require knowledge of the attack methods being used and then working through those processes, combating each one in turn; it is usually just down to awareness and time,” he said.

“For this particular attack, avoid having HDFS on public-facing connections, and if that's not possible then ensure authentication is enabled using the built-in methods. Also, remember that no authentication is required by default, so if it is not properly secured, anything connecting from the internet could gain complete access to your data. You should also regularly monitor your software and look for suspicious or “out of the ordinary” occurrences. When dealing with any attack it's always better to be proactive rather than reactive.”
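The built-in method Hadoop provides for strong authentication is Kerberos, enabled in `core-site.xml`. A sketch of the relevant settings is below; a full rollout also requires Kerberos principals and keytabs for each daemon, which are omitted here:

```xml
<!-- core-site.xml: switch away from the default 'simple' mode,
     which trusts any client-supplied username -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<!-- also enable service-level authorisation checks -->
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```

Combined with keeping the namenode off the public internet, this removes the unauthenticated access path the attackers exploited.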
