A long-term study of ransomware by researchers from Northeastern University and Lastline Labs, both in the US, and Institut Eurecom and Symantec Research Labs, both in France, has found that most of the threats spreading in the wild are not as scary or sophisticated as many of us may have feared.
The researchers collected and analysed 1,359 samples from 15 different 'families' of ransomware dating from 2006 to 2014. The study stated: “Our results show that, despite a continuous improvement in the encryption, deletion and communications techniques in the main ransomware families, the number of families with sophisticated destructive capabilities remains quite small.”
Researchers Amin Kharraz, William Robertson, Davide Balzarotti, Leyla Bilge and Engin Kirda presented the paper, “Cutting the Gordian Knot: A Look Under the Hood of Ransomware Attacks” at the 12th Conference on Detection of Intrusions and Malware & Vulnerability Assessment last week in Milan.
In the vast majority of cases, ransomware merely attempts to lock the victim's computer or to delete files using superficial techniques. Only 5.3 percent of the samples studied used file encryption as part of their attack; the balance took the easier route of locking the victim's computer and deleting key files.
As a consequence, stopping ransomware attacks is not as complex as has been reported: by monitoring abnormal file system activity, it would be possible to stop a large number of ransomware attacks, “even those using sophisticated encryption capabilities”.
Even zero-day attacks can be detected and prevented by looking at I/O requests and protecting the Master File Table (MFT) in the NTFS file system, the researchers claimed.
For instance, one mitigation strategy would be to monitor Windows API calls. This is based on the observation that many ransomware samples generate characteristic sequences of Windows API calls. “Those API calls can be used to model the application behaviour and train a classifier to detect suspicious sequences of Windows API calls. This approach is not necessarily novel, but it would allow us to stop a large number of ransomware attacks that are produced with little technical effort,” the paper said.
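The idea of scoring API-call sequences can be sketched as follows. This is a minimal illustration, not the paper's classifier: the trigram set below is hypothetical (a real system would be trained on labelled traces of benign and malicious programs), though the API names themselves (FindFirstFileW, CryptEncrypt, WriteFile) are genuine Windows functions that a file-encrypting sample might chain together.

```python
# Illustrative sketch: flag a process trace whose Windows API call
# sequence contains many "enumerate file -> encrypt -> write" patterns.

# Hypothetical suspicious trigrams -- in practice these would be learned
# from labelled traces, not hand-picked.
SUSPICIOUS_TRIGRAMS = {
    ("FindFirstFileW", "CryptEncrypt", "WriteFile"),
    ("FindNextFileW", "CryptEncrypt", "WriteFile"),
    ("CreateFileW", "CryptEncrypt", "MoveFileExW"),
}

def trigrams(calls):
    """Sliding windows of three consecutive API call names."""
    return list(zip(calls, calls[1:], calls[2:]))

def suspicion_score(calls):
    """Fraction of trigrams in the trace matching a suspicious pattern."""
    grams = trigrams(calls)
    if not grams:
        return 0.0
    hits = sum(1 for g in grams if g in SUSPICIOUS_TRIGRAMS)
    return hits / len(grams)

# A trace that repeatedly enumerates files and encrypts them scores high.
trace = ["FindFirstFileW", "CryptEncrypt", "WriteFile",
         "FindNextFileW", "CryptEncrypt", "WriteFile"]
print(suspicion_score(trace))  # -> 0.5
```

A deployed detector would threshold such a score (or feed richer features to a trained classifier) before alerting, to keep false positives down.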
Another mitigation strategy takes advantage of the fact that ransomware causes significant changes in file system activities such as a large number of similar encryption and deletion requests. “By closely monitor[ing] the MFT table, one can detect the creation, encryption or deletion of files. For example, when the system is under a ransomware attack, a significant number of status changes occur in a very short period of time in MFT entries of the deleted files,” they said.
“Encrypted files create a large number of MFT entries with encrypted content in the $DATA attribute of files that do not share the same path,” they added, making it possible to train a classifier to recognise this activity.
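The burst of MFT status changes the researchers describe lends itself to a simple sliding-window check. The sketch below is an assumption-laden illustration, not the paper's method: the window length and threshold are made-up parameters, and it takes pre-extracted timestamps of MFT entry changes rather than reading the MFT itself.

```python
# Illustrative sketch: detect a ransomware-like burst of MFT entry
# status changes (many deletions/encryptions in a short period).
from collections import deque

def detect_burst(change_times, window=5.0, threshold=100):
    """change_times: timestamps (seconds) of MFT entry status changes.

    Returns True if any sliding window of `window` seconds contains at
    least `threshold` changes. Window and threshold are assumed values;
    a real monitor would tune them against benign baseline activity.
    """
    recent = deque()
    for t in sorted(change_times):
        recent.append(t)
        # Drop changes that fell out of the current window.
        while recent and t - recent[0] > window:
            recent.popleft()
        if len(recent) >= threshold:
            return True
    return False

# 150 deletions within 1.5 seconds: flagged as a burst.
print(detect_burst([i * 0.01 for i in range(150)]))  # -> True
# One change every 10 seconds: normal background activity.
print(detect_burst([i * 10.0 for i in range(150)]))  # -> False
```

As several experts quoted below point out, the hard part is the threshold: backup tools and full-disk encryption utilities can legitimately touch thousands of files quickly, so any such monitor risks false positives.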
“Unlike recent discussions in [the] security community about ransomware attacks, our analysis suggests that implementing practical defence mechanisms is still possible, if we effectively monitor the file system activity, for example the changes in Master File Table (MFT) or the types of I/O Request Packets (IRP) generated on behalf of processes to access the file system. We propose a general methodology that allow[s] us to detect a significant number of ransomware attacks without making any assumptions on how samples attack users' files,” the researchers concluded.
Many experts within the security community were not surprised to hear that many ransomware packages were simple in action but questioned how easy it would be to implement the researchers' proposals.
“While in theory ransomware may be simple to reverse, the reality is very different, and organisations should be wary of letting their guard down. Businesses simply can't afford to be complacent or allow themselves to fall into the trap of thinking that ransomware is not an aggressive form of attack,” said Thierry Karsenti, technical director at Check Point.
However, another expert said the average computer ecosystem was all-too-vulnerable to attack. “The ease of infection just goes to show how vulnerable the average client computer ecosystem is and how antivirus typically isn't up to the task of detecting all of the malware variants,” said Chris Wysopal, CTO and CISO at Veracode. “The ideas in the paper for developing counter-measures that look at anomalous behaviour instead of detecting the behaviour of specific malware families is a good one. It is easy for attackers to keep varying their technique until they bypass the malware scanners.”
Ben Johnson, chief security strategist at Bit9 + Carbon Black observed that ransomware varies widely in its effectiveness. “There are some quite sophisticated ones out there but there are also some very poorly designed ones as well,” he said. “For example, the ransomware Rombertik, which made headlines a few months ago, is quite rare in that it can detect when it is being analysed by security systems and works to cover itself as it gets into the system. Rombertik attempts to generate huge amounts of noise as distractions for cyber-defence teams and access the system in the meantime.”
Stuart Hatto, EMEA field product manager, Cisco Security, said he is becoming increasingly concerned about the amount of malware that is managing to bypass mainstream anti-virus products. “The fact is that even relatively unsophisticated malware is evading layers of defences and still making its way to the end user, and then evading the last line of defence, antivirus, the very software people rely on to protect them against this attack,” he said.
He also felt that the researchers should have used a more modern operating system for their tests. “Windows XP SP3 has been available since May 2008. Modern malware creators have to work much harder to bypass OS mitigation techniques in more up-to-date operating systems. But many users and many enterprises have still not upgraded from XP and so the research experiment has validity,” he said.
Michael Sutton, chief information security officer at Zscaler, warned that contrary to the researchers' proposal, there is no silver bullet for dealing with ransomware. “Doing so could interfere with legitimate applications with similar features as deleting and encrypting bulk files on the hard drive is not uncommon,” he said. “As security vendors begin to implement host-based technologies to identify and prevent ransomware behaviour on a PC, the attackers will evolve their techniques to bypass the controls, perhaps by encrypting or deleting only small batches of files at any given time.”
Adam Tyler, chief innovation officer at CSID, said he was surprised that quite a few of the samples being analysed were not being actively used. In all, only six of the variants listed were used in 2014, he said – specifically CryptoLocker, CryptoWall, Reveton, Tobfy, Urausy and Filecoder. Of these, three used file encryption, or 50 percent of the active sample group.
He also disagreed with the researchers' view that ransomware is not very sophisticated. “As with most parts of the underground world, there is a huge focus on implementing new functionality on an ongoing basis, to both bypass the increasing growth of security applications as well as increase its ease of use and accessibility to new markets,” he said, adding that “Ransomware is continuing to evolve at an astonishing rate”.
Kevin O'Reilly, senior consultant at Context Information Security, also questioned the sample set. “I think that the conclusions drawn from the study are highly dependent on the sample set, and I was not convinced that the set used for the study was representative of ransomware that is recent or notable,” he said.
O'Reilly also said the focus on monitoring the file system to detect attacks wasn't novel or useful. “The reason this idea hasn't made it into actual security products is that it is too vague, intrusive and prone to false-positives. There are many applications for file encryption or other file-intensive activities that might trigger such a detection that are legitimate. At the enterprise level, this would cause potential headaches for system administrators, and for the consumer the ability to successfully differentiate between benign and malicious file system activity that might be alerted upon is fraught with complication,” he said.
He added: “Of course, security products have room to improve on their ability to detect and prevent such malware, but the way in which they will do so will be far more involved and nuanced than this paper might suggest.”
Kyriakos Economou, vulnerability researcher at Nettitude, warns that the proposed monitoring system might have unintended consequences: “The main challenge in providing countermeasures, and hence early detection of some types of malware infection, is to have a reliable distinction between benign and malicious activity. Having in mind specific behavioural patterns to look for is a good start. However, depending on how your monitoring system works, a false positive might cause a serious denial of service issue by blocking legitimate programs from accessing the necessary resources.”
Gavin Reid, VP of threat intelligence at Lancope, said even sophisticated malware had been shown to cut corners. “Cryptowall 2 was meant to employ unbreakable asymmetric RSA encryption but instead used symmetric AES which security researchers were able to decrypt,” he said. “The newer Cryptowall 3 has addressed that and uses the unbreakable asymmetric encryption. Other Cryptowall lookalikes, like TeslaCrypt, still use symmetric encryption and have been broken.”
Tom Court, cyber crime researcher at Alert Logic, said cyber-criminals rarely go after hard targets. “Sadly, there are enough potential victims making themselves easy targets by running outdated systems and software that a mass-market, lowest-common-denominator approach is more than adequate for running a profitable criminal enterprise,” he said.
Amichai Shulman, CTO of Imperva, made the point that simple attack methods often work. “I do think that most of the attacks described as ‘advanced' and ‘sophisticated' are actually very simple, and the adjective is used as an excuse for failure to mitigate them. We must remember that while an attack might not be sophisticated, it is not necessarily easily detectable. In fact, the simpler the attack is, the harder it is to detect,” Shulman said.
Pat Clawson, CEO of the Blancco Technology Group, said there are good solutions being developed by the cyber-security industry, but consumers have little visibility or knowledge of them. “That's the big issue – there's a lack of education and awareness. And that oftentimes turns into apathy and indifference, which is where most consumers and businesses go horribly wrong,” he said.
Wieland Alge, VP and GM of EMEA at Barracuda Networks, said it might appear to be easy to develop protective measures but it's unlikely to be that simple. “If my business was ransomware then I would have a strategy to exploit easier methods first and switch to more sophisticated ones in my pipeline. However, a techno-driven blackhat guy of old would reveal his masterpiece immediately. A purely business-driven organisation would exploit the simpler methods as long as they work and increase the sophistication level step by step,” Alge said.