Poking around with Pokémon: why app developers need to address permission abuse

Ken Munro discusses why Pokémon Go is symptomatic of a wider problem - permissions abuse by mobile apps - and the implications of this for the user and app developer

Ken Munro, partner, Pen Test Partners

Pokémon Go is the ultimate bandwagon for malware. The global phenomenon has proved highly popular, using nostalgia to cross the generation gap and appealing to a younger demographic through the clever use of augmented reality, and it's that popularity that has seen unofficial versions spring up on third-party app sites. But the take-up of those doctored apps might have been minimal were it not for the fact that the game's developer, Niantic, inadvertently fuelled demand by pulling the official version just a week after launch.

It turned out the legitimate app gave full access to Google accounts, allowing the app to access the user's email credentials, photos, search history and Google Drive files. One could argue that gamers should have changed those access permissions… except that's not an option. Apple's iOS prevents the user from doing so, while the app doesn't appear in the Google account security permissions. After being alerted to the access issue, Niantic sought to restrict those permissions to the basic account information needed to play the game. But in pulling the app it created a void that malicious versions were quick to exploit.

What does this tell us about app development? Firstly, that the developers did not prioritise the information they actually required: there was a blanket approach to the permissions requested upon sign-up. Niantic is not unique in this respect. From a marketing perspective, many providers believe it makes perfect sense to request more permissions than they need, ensuring access to data that can be monetised in the future as technology advances. But more data means more responsibility, and app developers could compromise user privacy and/or fall foul of the regulator if they fail to take the precautions needed to protect that data.

Secondly, the provider's disclosure process was not as well oiled as its marketing machine. It was unable to fulfil demand, but it was also unable to reassure its user base or warn them about bogus versions. A speedy response, not just in restricting permissions but in reaching out via the media to alert users to a new release date, would have made them more wary of attempting to obtain the app via other means. So poor was communication that even those who disclosed the vulnerability were not informed about the fix, with Adam Reeve reportedly saying “the Pokémon site is for some reason not accepting new sign-ups right now” on Tumblr.

Thirdly, and again Niantic is not alone here, no-one seems to be looking at the long game. Application permissions are effectively terms and conditions that give the app owner carte blanche access to the user's device and its data. It's not just malware such as the DroidJack Remote Administration Tool (RAT) found in the illicit app that can give full access to device controls; there are plenty of perfectly legitimate permissions that involve the user voluntarily giving up access to device controls. These requests range from 'modify audio settings' to 'record audio' and 'prevent phone from sleeping', and effectively give away information on everything from the user's location to their conversations.
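
For developers or curious users who want to see exactly what an installed app asks for, Android's PackageManager can list every permission an app declares in its manifest. The following is a minimal Kotlin sketch, not a tool published by Google or Niantic; the package name and function name are placeholders for illustration.

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Print every permission a given app declares in its manifest.
// The package name is a placeholder; substitute the app you want to audit.
fun listRequestedPermissions(context: Context, packageName: String = "com.example.someapp") {
    try {
        val info = context.packageManager.getPackageInfo(packageName, PackageManager.GET_PERMISSIONS)
        info.requestedPermissions?.forEach { permission ->
            println(permission) // e.g. android.permission.RECORD_AUDIO, android.permission.WAKE_LOCK
        }
    } catch (e: PackageManager.NameNotFoundException) {
        println("Package not installed: $packageName")
    }
}
```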

Such overly broad permissions could easily expose the user to abuse. It's technically feasible to record an audio feed in real time today. A proof-of-concept Android app has been shown to do precisely that, capturing the audio feed, converting it to text and alerting its creator to key words that the user had spoken. From a marketing perspective, this may sound highly compelling. But without transparency, user opt-in/out and regulation to ensure data is protected, it opens a Pandora's box.
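
To illustrate how little stands between a granted 'record audio' permission and that kind of keyword alerting, here is a hedged Kotlin sketch using Android's stock SpeechRecognizer API. This is not the proof-of-concept referenced above; the class name, keyword list and callback are hypothetical.

```kotlin
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Once RECORD_AUDIO has been granted, the platform's own speech APIs make
// keyword spotting trivial. Keyword list and callback are illustrative only.
class KeywordSpotter(
    private val recognizer: SpeechRecognizer,
    private val keywords: List<String> = listOf("password", "account number"),
    private val onKeyword: (String) -> Unit
) : RecognitionListener {

    fun start() {
        recognizer.setRecognitionListener(this)
        recognizer.startListening(Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH))
    }

    override fun onResults(results: Bundle?) {
        val spoken = results?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION) ?: return
        keywords
            .filter { keyword -> spoken.any { it.contains(keyword, ignoreCase = true) } }
            .forEach(onKeyword)
    }

    // The remaining listener callbacks are not needed for this sketch.
    override fun onReadyForSpeech(params: Bundle?) {}
    override fun onBeginningOfSpeech() {}
    override fun onRmsChanged(rmsdB: Float) {}
    override fun onBufferReceived(buffer: ByteArray?) {}
    override fun onEndOfSpeech() {}
    override fun onError(error: Int) {}
    override fun onPartialResults(partialResults: Bundle?) {}
    override fun onEvent(eventType: Int, params: Bundle?) {}
}
```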

Part of the problem with mobile apps is that the user has been disenfranchised, and this in itself is dangerous. It's often difficult or impossible to alter these permissions retrospectively, but it hasn't always been this way. The facility to control permissions was in the Android OS as far back as Jelly Bean, but it was removed in KitKat and Lollipop. Google really dropped the ball here, effectively paving the way for mass permissions. For instance, when Facebook split messaging out into its standalone Messenger app, it made no fewer than 34 permission requests. Opting out, just as with Pokémon Go, wasn't an option unless you refused the app altogether. However, with Marshmallow, Android now has a granular permissions feature: users can select which permissions an app is allowed, so Calendar, Camera, Contacts, Location and so on can all be customised individually.
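
By way of illustration, here is a minimal Kotlin sketch of that Marshmallow (API 23) runtime model, using the framework's own checkSelfPermission and requestPermissions calls; the function name and request code are arbitrary choices for this example.

```kotlin
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager

// From Android 6.0 (Marshmallow, API 23), 'dangerous' permissions are granted at
// runtime and can be revoked individually in Settings, rather than all-or-nothing at install.
const val REQUEST_RECORD_AUDIO = 42 // arbitrary request code

fun ensureRecordAudioPermission(activity: Activity) {
    val alreadyGranted = activity.checkSelfPermission(Manifest.permission.RECORD_AUDIO) ==
        PackageManager.PERMISSION_GRANTED

    if (!alreadyGranted) {
        // The system shows a prompt; the user can refuse without giving up the app entirely.
        activity.requestPermissions(arrayOf(Manifest.permission.RECORD_AUDIO), REQUEST_RECORD_AUDIO)
    }
}
```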

It's this kind of grey area that forthcoming legislation, namely the EU General Data Protection Regulation or its equivalent (there's talk of the Data Protection Act being revised, for instance), will seek to address. The legislation will tighten data protection requirements, making app developers and marketers more responsible for how data is used, transmitted and redacted, and the penalties for failing to protect data adequately will be substantial.

For Pokémon Go, this has been far more than a coding error. It's been a lesson in putting security and the user first, and in evaluating how you deal with disclosure. It may well have cost Niantic a few users, particularly if we now begin to see angry malware-infected players vent their spleen, but it could also have been a highly costly mistake had it been made in two years' time. Perhaps the best advice is for app developers to actually think about permissions and how they could put potential users off, and for users to make a judgement call on apps that demand too much access. If you can set individual permissions, great. If you can't, think about what you're actually allowing and weigh up whether a kids' game is worth the risk.

Contributed by Ken Munro, partner, Pen Test Partners
