Industry experts agree the backdoors are a serious issue, and corporate and individual users are being urged to risk-assess their storage of personal data on Apple devices.
The undocumented features were revealed by forensic scientist and hacker Jonathan Zdziarski, known as ‘NerveGas', at the HOPE X security conference in New York last Friday.
Zdziarski said that Apple's packet-sniffing and other forensic services affect all 600 million iOS users. These services can dump “mass amounts of personal data” without informing the user, and can even bypass the ‘backup encryption' facility Apple offers users.
The features, which have been developed over the past five years, include DROPOUTJEEP, which captures iPhone data including the user's SMS messages, contact list, voicemail and geolocation data; com.apple.pcapd, packet-sniffing software that dumps network traffic and HTTP request/response data travelling in and out of the device; and com.apple.mobile.file_relay, which “transmits large swathes of raw file data in a compressed CPIO archive” and “completely bypasses Apple's backup encryption”.
This file data includes the user's address book, calendar, call history, SMS database, email metadata, voicemail data and audio files, a list of all their social media accounts, caches, GPS logs and all their photos.
Zdziarski summed up: “Apple is dishing out a lot of data behind our backs. It's a violation of the customer's trust and privacy to bypass backup encryption. There is no valid excuse to leak personal data or allow packet sniffing without the user's knowledge and permission.
“Much of this data simply should never come off the phone, even during a backup. Apple has added many conveniences for enterprises that make tasty attack points for .gov and criminals. Overall, the otherwise great security of iOS has been compromised…by Apple…by design.”
Following his talk, Zdziarski said Apple insisted that the features were there for ‘diagnostic' and enterprise IT purposes.
But in a 21 July blog post he retorted: “I don't buy for a minute that these services are intended solely for diagnostics. There is no notification to the user. A real diagnostic tool would have been engineered to respect the user, prompt them like applications do for access to data, and respect backup encryption. Tell me, what is the point in promising the user encryption if there is a back door to bypass it?”
Zdziarski stopped short of accusing Apple of deliberately helping the US spy agencies, but said: “I suspect, based on released documents, that some of these services may have been used by NSA to collect data on potential targets.
“At the very least, this warrants an explanation and disclosure to the some 600 million customers out there running iOS devices. My hope is that Apple will correct the problem.”
Industry expert and consultant Brian Honan also urged Apple users to take action in light of this research: “Corporates, and individual users, should conduct a risk assessment regarding the storing of sensitive data on Apple devices and determine what additional steps, if any, they should take to protect that data until Apple address the vulnerabilities,” he told SCMagazineUK.com.
Honan pointed out that the backdoors can't be exploited automatically, but still present a serious threat: “In order for them to be fully exploited a number of dependencies have to be in place, such as the device should not have been rebooted since the last time the user entered the pin for the device. This reduces the likelihood of these undisclosed backdoors being maliciously exploited. However, hidden backdoors in any platform are a serious issue and hopefully Apple will address them quickly.”
Analysing the findings, Jon Butler, chief security researcher at UK-based MWR InfoSecurity, emphasised that the data can usually only be obtained via a computer the user has agreed to ‘pair' with.
“Access to these services is controlled by lockdownd, a service which manages trust between the device and computers it is connected to,” Butler told SCMagazineUK.com by email. “Before any of these services can be accessed, a user is prompted to trust the computer their device is connected to. Computers that have not been specifically trusted by the user's device are not able to access these services as a result.”
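The pairing trust Butler describes is recorded in a ‘pair record', a property-list file the computer keeps after the user taps ‘Trust'. The sketch below is illustrative only: the field names follow those seen in publicly documented lockdown pair records (for example, as handled by the open-source libimobiledevice project), and the values are placeholders, not real key material.

```python
import plistlib

# Illustrative sketch of a lockdown "pair record" -- the artifact a
# computer stores after the user taps "Trust". Field names follow
# publicly documented pair records; values are placeholders.
pair_record = {
    "HostID": "6F1C8A2E-0000-0000-0000-000000000000",   # identifies the paired computer
    "SystemBUID": "AAAAAAAA-0000-0000-0000-000000000000",
    "HostCertificate": b"-----BEGIN CERTIFICATE-----...placeholder...",
    "DeviceCertificate": b"-----BEGIN CERTIFICATE-----...placeholder...",
    "RootCertificate": b"-----BEGIN CERTIFICATE-----...placeholder...",
}

# Serialise and re-read the record, as a host persisting it to disk would.
blob = plistlib.dumps(pair_record)
restored = plistlib.loads(blob)

# A service like lockdownd only grants access to hosts that can present
# (and cryptographically prove ownership of) a matching record -- which
# is why an untrusted computer cannot simply query file_relay or pcapd.
assert restored["HostID"] == pair_record["HostID"]
```

This is why the attack scenarios Butler goes on to outline all centre on obtaining or forging a valid pair record rather than on talking to the services directly.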
But Butler pointed out: “Scenarios where this information might be obtainable from computers that have not been explicitly trusted by a user's device include Apple (upon being requested by law enforcement officials) generating a valid PairRecord for a device based on knowledge of some cryptographic key material (likely a private key), or law enforcement officials asking a user to unlock their device during a search so that it can be paired with a computer and the data can be acquired.
“There has also been some speculation that intelligence services may be able to obtain a valid PairRecord for a device by compromising trusted computers used by a target user.”
SC contacted Apple for comment but had received no reply at the time of writing.
* Zdziarski's research emerged at the same time as Apple began to use Transport Layer Security (TLS) to encrypt emails between users of its iCloud service and other third-party providers, in a bid to reassure users after the Snowden revelations prompted concerns that the company co-operated with the US Government. Other providers, including Microsoft, have also begun using TLS in their email services.
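For readers curious what TLS protection involves on the client side, a minimal sketch using only Python's standard library is shown below. It illustrates the certificate-verifying defaults a TLS client would use in general; it says nothing about Apple's specific iCloud configuration.

```python
import ssl

# Build a client-side TLS context with certificate verification enabled,
# as a mail client connecting to a server over TLS would.
context = ssl.create_default_context()

# create_default_context() turns on hostname checking and requires a
# valid certificate chain -- the properties that make encrypted email
# transport meaningful rather than merely cosmetic.
print(context.check_hostname)                     # True
print(context.verify_mode == ssl.CERT_REQUIRED)   # True
```

Note that TLS here protects email only in transit between providers; it does not encrypt messages at rest.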
Recent reports of Chinese fears about iPhones posing a security risk now appear to have a stronger basis in reality.