Why can't Google Play prevent Dubsmash porn clicking malware?

News by Davey Winder

Researchers at cloud-based web security outfit Zscaler have reported that yet another malware-infected version of the hugely popular Android video selfie app 'Dubsmash' has found its way into the Google Play Store.

The free app allows users to create selfie videos mashed up with music, and its name has been hijacked by malware authors on numerous occasions over the last few weeks to disguise a hidden 'porn clicker' app. The latest comes in the form of an app claiming to be 'Dubsmash V3', which contains the same malware family and has already been removed by Google.

The app deletes its own icon after it has been run and the user quits the application, but it continues to run as a hidden background process, loading web pages and clicking on advertising in order to generate revenue for the people behind it. The latest iteration of the malware app was downloaded around 5,000 times before Google spotted it. This raises the question: why can't Google deal effectively with a malicious developer who keeps uploading variants of the same malware family under similar application names and different developer accounts? It's hardly the act of a criminal mastermind, yet it seems to be having quite some success.
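To put the icon-deleting trick in concrete terms, the sketch below shows the standard, documented Android technique for an app disabling its own launcher entry and handing off to a background service. It is an illustration of the general approach only, not code from the actual samples; the class names MainActivity and BackgroundService are hypothetical, and the ad-clicking logic itself is deliberately omitted.

```kotlin
// Illustrative sketch only, not code from the real samples: an app disables its
// own launcher component so the icon disappears, then keeps running as a
// background service with no visible user interface.
import android.app.Activity
import android.app.Service
import android.content.ComponentName
import android.content.Intent
import android.content.pm.PackageManager
import android.os.Bundle
import android.os.IBinder

class MainActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Disable this activity's launcher component; DONT_KILL_APP keeps the
        // process alive, so the app keeps running with no visible icon.
        packageManager.setComponentEnabledSetting(
            ComponentName(this, MainActivity::class.java),
            PackageManager.COMPONENT_ENABLED_STATE_DISABLED,
            PackageManager.DONT_KILL_APP
        )
        // Hand off to a hidden background service and close the visible screen.
        startService(Intent(this, BackgroundService::class.java))
        finish()
    }
}

class BackgroundService : Service() {
    override fun onBind(intent: Intent?): IBinder? = null
    // A clicker would load web pages and trigger ad clicks from here;
    // that behaviour is omitted from this sketch.
}
```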

Deepen Desai, a security expert at Zscaler and one of the researchers on the ThreatLabz team which identified the latest variant of the malware, told SCMagazineUK.com that he is "not sure of the exact weakness in the Google Play Store's vetting process that these malware authors are exploiting here" but the fact that these apps are not stealing any sensitive information from the infected device and are purely engaged in click fraud activity "may be the reason why Google has not been able to flag it."

Desai also told us that the authors of this malware have "moved away from hard-coded porn URLs inside the APK and instead they are dynamically retrieving them from a remote location at run time", so while the content of the remote location may be completely innocuous at the time of the app vetting process, it can be changed by the malware authors once the app has been approved and posted.
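As a rough illustration of that pattern, the sketch below fetches its list of target URLs from a remote server at run time instead of shipping them in the APK. The server address and function name are placeholders for illustration, not details taken from the real samples.

```kotlin
// Sketch of the pattern Desai describes: no URLs are hard-coded in the APK;
// the target list is pulled from a remote server when the app runs.
import java.net.HttpURLConnection
import java.net.URL

fun fetchTargetUrls(configUrl: String = "https://example.com/targets.txt"): List<String> {
    val connection = URL(configUrl).openConnection() as HttpURLConnection
    return try {
        connection.connectTimeout = 10_000
        connection.readTimeout = 10_000
        // Whatever the server returns today is what the app loads today, so the
        // list can be benign during review and swapped for porn URLs afterwards.
        connection.inputStream.bufferedReader().readLines().filter { it.isNotBlank() }
    } finally {
        connection.disconnect()
    }
}
```

The APK itself contains nothing obviously suspicious to flag, which is exactly why a static review of the package can come up clean.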

We asked Fabian Libeau, technical director (EMEA) at RiskIQ, if there was anything else special about this particular porn clicker that might account for it being so persistent.

"The main focus of this malware is about monetising on adverts, in this case porn ads" he said "as the clicks are coming from all these different devices they won't be seen as click fraud and it won't be detected so easily." Libeau also pointed out that it's a pretty good example of how this area of cyber-crime is commoditising. "It is not just the big bad breach which swipes information in an overt attack that's the only focus for cyber criminals but also stealing small amounts here and there" Libeau told SC . "This is certainly a trend we are seeing and which will increase dramatically."

But none of the above really gets to the fundamentals of just why Google appears to be having such a hard time dealing with the Dubsmash porn clicker variants in particular. It's almost as if the company has adopted a chase-your-own-tail methodology, waiting for the author to upload yet another variant under yet another developer account and then taking it down as soon as possible, which isn't soon enough to prevent users from being infected, of course. Surely there has to be a better way for Google to deal with a malicious developer using multiple accounts and multiple malware variants?

Wim Remes, the strategic services manager (EMEA) for Rapid7, and something of an industry stalwart, points out that hosting an application store actually isn't as easy as it seems.

"You have to take care of multiple stake holders, including your users, developers, and advertisers" Remes told SC, continuing "implementing strict security checks for applications makes it extremely difficult to publish applications within a reasonable timeframe so most application stores implement a looser set of security checks." 

Which is why any number of third-party services exist, often integrating with mobile device management solutions, that focus on application reputation and can block these malware carriers. In fact, many IT security professionals are a lot more sympathetic towards Google on this issue than you might imagine. Take Chris Boyd, malware intelligence analyst at Malwarebytes, who suggests it is "to their credit that such a popular service isn't buried under a landfill of rogue apps and dodgy clickers." Boyd says that no matter how rigorously they police the store, "something will always slip through the cracks." That's the nature of the beast, with mobile being a space where empowering developers to be creative can bring obvious benefits but also leads to the almost inevitable presence of the odd malicious download. "Harnessing the talent of legitimate developers without restricting their freedom is an uphill battle for anybody," Boyd concludes, "regardless of who is watching the storefront."

Which prompts the question: who or what is watching the storefront, then, and can they do better? Fabian Libeau revealed that his company analyses 150 app stores around the world, of which 127 are Android stores, looking for malicious applications. "Google currently has a marketplace of 1.26 million applications," he told us, and that figure is growing all the time. "Of all the apps in the Google store, 42,000 applications are blacklisted, which means they are either not legitimate applications or contain malicious content, similar to the Dubsmash porn clicker app."

So, can Google add more security measures (and there are plenty out there) to try and solve this problem? "Certainly, but what would be the impact on the business model?" asks Libeau. "Will it mean that there are still free apps going forward?" And there's your real answer: the challenge for Google is one faced by organisations dealing with cyber-crime all over the web, not just in app stores. Getting ahead of those with malicious intent isn't easy without impacting your own business model, and when that business model is irrefutably successful, making changes that could affect it is less likely to get executive approval.

Not that this means Google is standing still here, nor that it's doing enough. Let's not forget that it is very uncommon for an app with such similar malicious code to get published in the Google Play Store under different names and accounts like this. Catalin Cosoi, chief security strategist at Bitdefender, thinks that the problem lies with Google's app review process which, despite announcements that a human verification element has been added, still relies upon automated scanning. This looks for "obvious developer policy violations through permissions and known malware," Cosoi explains, and "the system's limited capabilities and specs have been disclosed by researchers, making it easier to bypass." So, for instance, if the apps target zero-day vulnerabilities or come designed with anti-emulation techniques, they may appear safe to Google.
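Anti-emulation checks of the kind Cosoi mentions are usually simple. The sketch below shows a common, publicly documented heuristic for spotting an emulator from the Android Build properties; it illustrates the general technique, and the specific properties checked are not confirmed for these samples.

```kotlin
// A minimal sketch of a widely known emulator-detection heuristic: inspect the
// device's Build properties for values typical of Android emulators.
import android.os.Build

fun looksLikeEmulator(): Boolean =
    Build.FINGERPRINT.startsWith("generic") ||
    Build.FINGERPRINT.contains("vbox") ||
    Build.MODEL.contains("Emulator") ||
    Build.MODEL.contains("Android SDK built for x86") ||
    Build.MANUFACTURER.contains("Genymotion") ||
    Build.HARDWARE == "goldfish" ||
    Build.PRODUCT == "sdk"
```

An app that takes a benign code path whenever this returns true will look clean inside an automated review sandbox and only misbehave on a real handset.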

Where Google is moving forward is in the newly announced Android M operating system, which will ask users for permissions only when the app needs them, not all at once. So a camera app, for example, will ask for camera or microphone access when it's first launched. "Granular permissions will help users understand why and what information is disclosed, thus diminishing the powers of abusive apps," Cosoi concludes. That won't stop malicious apps such as the Dubsmash porn clicker variants finding their way into the Google Play Store, but it will make it harder for them to monetise their maliciousness. Which is something. The question remains: will it be enough?
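As a footnote for developers, the runtime-permission flow Cosoi is referring to looks roughly like the sketch below, assuming a hypothetical camera screen; the class and constant names are illustrative, not taken from any particular app.

```kotlin
// A minimal sketch of the Android M runtime-permission flow: the camera
// permission is requested at the moment the feature is first used, not at install.
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import android.os.Build

class CameraActivity : Activity() {

    private fun openCameraWithPermissionCheck() {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M &&
            checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
        ) {
            // The user is prompted only now, in context, rather than agreeing to a
            // long permission list up front.
            requestPermissions(arrayOf(Manifest.permission.CAMERA), REQUEST_CAMERA)
        } else {
            openCamera()
        }
    }

    override fun onRequestPermissionsResult(
        requestCode: Int,
        permissions: Array<out String>,
        grantResults: IntArray
    ) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults)
        if (requestCode == REQUEST_CAMERA &&
            grantResults.firstOrNull() == PackageManager.PERMISSION_GRANTED
        ) {
            openCamera()
        }
    }

    private fun openCamera() {
        // Launch the camera feature here.
    }

    companion object {
        private const val REQUEST_CAMERA = 1
    }
}
```

Because the prompt arrives in context, an app that suddenly asks for access it has no obvious use for becomes much easier for the user to spot and refuse.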
