Some 5,307 Hadoop clusters are exposed to the internet with their security settings switched off, leaving them open to attack by hackers.
That's according to GDI Foundation security researchers Victor Gevers, Niall Merrigan and Matt Bromiley, who say the clusters are reachable online with both security and safemode settings disabled.
“The default installation for HDFS Admin binds to the IP address 0.0.0.0 and allows any unauthenticated user to perform super user functions to a Hadoop cluster,” said the researchers in a blog post.
“These functions can be performed via a web browser, and do not prevent an attacker from destructive actions. This may include destroying data nodes, data volumes, or snapshots with TBs of data in seconds.”
As reported by SC Media UK last month, Hadoop Distributed File System installations have been attacked by hackers resulting in many systems having all data wiped from them. Attacks appeared to target traffic to port 50070.
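The exposure the researchers describe can be illustrated with a short sketch. The snippet below builds the kind of unauthenticated WebHDFS request an attacker could aim at port 50070, the NameNode's default web port in Hadoop 2.x; the hostname is a placeholder, and the `user.name` parameter exploits HDFS "simple" authentication, which trusts whatever username the client supplies.

```python
# Sketch: constructing an unauthenticated WebHDFS probe against an HDFS
# NameNode's default web port (50070 in Hadoop 2.x). The host below is a
# placeholder for illustration only.
from urllib.parse import urlencode

def webhdfs_url(host: str, path: str = "/", user: str = "hdfs",
                port: int = 50070) -> str:
    """Build a WebHDFS LISTSTATUS URL; with "simple" auth, HDFS trusts
    the caller-supplied user.name with no credentials at all."""
    query = urlencode({"op": "LISTSTATUS", "user.name": user})
    return f"http://{host}:{port}/webhdfs/v1{path}?{query}"

# Example (placeholder host): on an exposed cluster, a 200 response with a
# FileStatuses JSON body would confirm unauthenticated access.
url = webhdfs_url("namenode.example.com")
```

A cluster that answers such a request as the `hdfs` superuser lets the caller list, delete or overwrite data, which is consistent with the mass data-wiping attacks described above.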
The researchers have recently identified cases of vandalism and destruction that abuse unauthenticated databases.
“Our observations have shown attackers moving through various database types, such as MongoDB and Elasticsearch,” said the researchers. “We predict it will not be long before weaknesses in HDFS will be exploited by the same types of attackers that are launching ransom attacks and wiping data.”
They observed that while attackers demand ransoms, they do not back up data prior to wiping it. “Victims who have paid ransom prices have not received data in return, and are often left without a means to recover,” they warned.
The researchers advised Hadoop administrators to ensure security and safe mode are switched on, to enable service-level authorisation and Kerberos authentication, and to apply network filtering or firewall rules that block port 50070 to untrusted IPs.
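As a rough illustration of the first part of that advice, the properties below are the standard Hadoop keys in `core-site.xml` that switch authentication from the default "simple" mode to Kerberos and turn on service-level authorisation; this is a minimal sketch only, as a real deployment also needs keytabs, principal mappings and per-daemon settings.

```xml
<!-- Sketch: core-site.xml hardening (minimal, illustrative only) -->
<configuration>
  <!-- Replace the default "simple" auth (trusts client-supplied usernames)
       with Kerberos -->
  <property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>
  </property>
  <!-- Enable service-level authorisation checks -->
  <property>
    <name>hadoop.security.authorization</name>
    <value>true</value>
  </property>
</configuration>
```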
Administrators should also add identity and access management (IAM) controls, segment the network with a VPN solution such as OpenVPN, and implement a reverse proxy such as Apache Knox to help prevent unauthorised access and manage connectivity to Hadoop.
Alex Mathews, lead security evangelist of Positive Technologies, told SC that the main advice is that organisations shouldn't allow internet connections to systems running on Hadoop Distributed File System (HDFS).
“This is because its interface allows unauthenticated access by default. If you do need to allow this type of connection, it's advised to first implement built-in authentication mechanisms,” he said.
He added that the main issue is that ‘usually’ a vendor would issue security warnings and advice detailing any product vulnerabilities. However, in the case of Hadoop, there isn't a clearly written list of security recommendations – neither from the vendor nor from well-known security organisations.
CIS is in the process of creating such standards, and SANS last issued recommendations for Hadoop in 2013. The only way to know about current Hadoop vulnerabilities is to ‘hang out’ in professional web forums constantly, but few have time to do that.
“There are a few things that can be done to minimise risks, such as conducting regular vulnerability analysis across the entire corporate information system, as this will identify not only database vulnerabilities but also any other attack vectors that may result in a DB breach – for example, some web server vulnerabilities can lead to unauthorised DB access. Another recommendation is to perform regular backups, as ransomware is proving a popular way to monetise DB attacks,” he said.
John Farebrother, regional manager UK and Ireland at Stormshield, told SC that MongoDB, Elasticsearch and Hadoop share a common trait: they operate on big data, typically as clusters or distributed implementations.
“Communication is often over well-known, public ports, and many implementations are simply installed by wizard using defaults. Designing an implementation with security in mind from the start – implementing custom ports and strong passwords – is the simplest and most effective solution to 90 percent of the issues,” he said.