In a much-publicized open letter last week, Apple CEO Tim Cook pledged to protect user privacy with improved encryption on iPhones and iPads and a hard line toward government agents. It was a huge and welcome step toward thwarting the surveillance state, but it also seriously oversold Apple’s commitment to privacy.
Yes, Apple launched a tough-talking new privacy site and detailed a big improvement to encryption in its mobile operating system iOS 8: Text messages, photos, contacts, and call history are now encrypted with the user’s passcode, whereas previously they were not. This follows encryption improvements by Apple’s competitors Google and Yahoo.
But despite these nods to privacy-conscious consumers, Apple still strongly encourages all its users to sign up for and use iCloud, the internet syncing and storage service where Apple has the capability to unlock key data like backups, documents, contacts, and calendar information in response to a government demand. iCloud is also used to sync photos, as a slew of celebrities learned in recent weeks when hackers reaped nude photos from the Apple service. (Celebrity iCloud accounts were compromised when hackers answered security questions correctly or tricked victims into giving up their credentials via “phishing” links, Cook has said.)
While Apple’s harder line on privacy is a welcome change, it’s important to put it in context. Yes, a leading maker of smartphones, tablets, and laptops is now giving users better tools to lock down some of their most sensitive data. But those users have to know what they’re doing to reap the benefits of the new software and hardware — and in particular it helps if they ignore Apple’s own entreaties to share their data more widely.
Although Apple was listed as an October 2012 addition to NSA’s PRISM program in documents leaked by former NSA contractor Edward Snowden, Cook denied that his company has ever worked with any government to provide special ways to circumvent its security systems.
“I want to be absolutely clear that we have never worked with any government agency from any country to create a backdoor in any of our products or services,” Cook wrote in his open letter. “We have also never allowed access to our servers. And we never will.”
The most prominent privacy improvement Apple made to its products last week is a new encryption feature built into iOS 8.
Since the iPhone 3GS, all iOS devices have supported encrypting personal data such as text messages, photos, emails, contacts, and call history. If you set a passcode it would be used to encrypt some, but not all, of the data on your device. Apple was still able to decrypt some of the data without knowing your passcode.
If law enforcement confiscated your phone and wanted to snoop on certain types of its data, all they had to do was serve Apple a warrant and get a copy of that data. A version of Apple’s Legal Process Guidelines for U.S. Law Enforcement dated May 7, 2014 explains:
Please note the only categories of user generated active files that can be provided to law enforcement, pursuant to a valid search warrant, are: SMS, photos, videos, contacts, audio recording, and call history. Apple cannot provide: email, calendar entries, or any third-party App data.
But if you’re using iOS 8, at least some of that personal data is encrypted using your passcode, namely SMS, photos, contacts, and call history. On its Government Information Requests page, Apple boasts that this new feature makes it technically infeasible to comply with government requests to retrieve such data from an Apple device:
On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.
The improved encryption in iOS 8 is a great move towards protecting consumer privacy and security. But users should be aware that in most cases it doesn’t protect their iOS devices from government snoops.
While Apple does not have the crypto keys that can unlock the data on iOS 8 devices, it does have access to your iCloud backup data. Apple encrypts your iCloud data in storage, but with its own key, not your passcode key, which means it can decrypt that data to comply with government requests.
In order to fully enjoy the benefits of keeping your crypto key private, you should also turn off iCloud syncing for any data that you consider private. For example, if you’d like to keep your contact list safe from prying eyes, you might turn off iCloud syncing of contacts. However in this case privacy comes at a price: if your iPhone breaks and you get a new one, Apple won’t be able to restore all of your contacts for you.
But there’s another risk that comes with relying on your passcode for the security of your device: 4-digit PINs are not that hard to guess. Even if you don’t use one of the most commonly used passcodes, weak passcodes are vulnerable to brute force attacks (when the attacker guesses every possible passcode until she finds one that works). For iOS 8’s encryption to really work, you’re better off using a longer passcode that includes letters, numbers, and symbols. And that, of course, makes unlocking your phone a pain. On some devices, you can use Apple’s fingerprint reader to ameliorate that pain, but security researchers have repeatedly shown that the reader can be defeated.
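The gap between a 4-digit PIN and a longer mixed passcode is easy to quantify. A quick sketch (the 8-character length and the 94-symbol printable-ASCII alphabet are illustrative assumptions, not anything Apple specifies):

```python
import string

# Keyspace of a 4-digit PIN vs. an 8-character passcode drawn from
# letters, digits, and punctuation (94 printable ASCII symbols).
alphabet = string.ascii_letters + string.digits + string.punctuation
pin_space = 10 ** 4                 # 10,000 possible PINs
complex_space = len(alphabet) ** 8  # roughly 6.1e15 possible passcodes

print(f"4-digit PIN keyspace:    {pin_space:,}")
print(f"8-char complex keyspace: {complex_space:,}")
print(f"ratio: about {complex_space // pin_space:.1e}")
```

Every extra character multiplies the search space by the alphabet size, which is why length plus variety beats either alone.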
Then there’s the question of video and audio recordings. Apple did not indicate if that personal data is now encrypted with the user passcode (it was not previously). The company did not respond to a request for comment on that question.
This isn’t the first time that Apple has oversold the security of its products. Shortly after the PRISM revelations were published in The Washington Post and The Guardian, Apple denied that it was part of the program and issued a statement claiming that “conversations which take place over iMessage and FaceTime are protected by end-to-end encryption so no one but the sender and receiver can see or read them. Apple cannot decrypt that data.” But security researchers showed that Apple could indeed eavesdrop on iMessage conversations without the user knowing.
Regardless, Apple’s attempts to bring end-to-end encryption to iMessage are leaps ahead of some other popular messaging services, such as Facebook messaging, which doesn’t do anything to prevent Facebook itself from reading your conversations. At least not yet. At the moment, privacy seems to be enjoying a resurgence; since the Snowden revelations began, both Google and Yahoo have started to build end-to-end encryption into their email services, making them less vulnerable to government requests for data as well. What remains to be seen, for Apple and all the others, is how deep beneath the surface, and into their own infrastructure, tech giants will be willing to go. For Apple, as for its competitors, there is still plenty of work to be done.
Photo: Vincent Yu/AP
One thing the author and every other commenter are ignoring is the fact that even though an iCloud account may be able to be decrypted by Apple, all Mac users can simply use PGP or GnuPG to encrypt the files on their Mac that they want to sync or back up to their iCloud account, thus preventing Apple or anyone else from decrypting them.
Even better, you could use the MEGA cloud service (mega.co.nz), which encrypts your files with a key that only you have, so nobody else can decrypt them. They give you 50 GB of storage free, with more available for paying customers. Because they and their servers are in New Zealand, they are beyond the reach of U.S. law enforcement or the alphabet soup agencies.
This is why we should all lie about the data we give to companies. If the data they are using is false, then its of no use to them or the government. Setting up a fake email / name, using TOR and signing up for an iCloud account is very simple to do. You can always do a pay plan, under a different name, and pay for it online using pre-paid debit cards you can get from Wal-Mart using cash.
If everyone did this, there’d be no way they’d be able to monitor us all.
QUOTE: “We have also never allowed access to our servers. And we never will.”
Does Apple mean to say it has never been served with a PATRIOT Act letter?
Apple’s goal is to lock people into their technology by touting a “seamless experience” across “all of your devices” so they can keep selling you stuff. You don’t need a corporation to have that. You can do it yourself, it’s not expensive, it’s not that difficult and it’s more secure.
I dumped iCloud nearly four years ago because of Apple’s ongoing operational glitches. I bought the hardware to set up my own personal, encrypted cloud in my home office. I run it and my Macs only when I’m actively using them. Only one machine is hooked up to the web and I turn off the modem when I’m not using that machine. I turn on my cloud to selectively sync at the end of each day. When I travel, I backup to a data stick, keep my laptop clean and sync when I get home. I use ProtonMail and *never* store critical email or docs on my devices but on a data stick (like Edward Snowden did) tucked away in a secure location.
Disabled the Mac and iPhone microphones. A piece of electrical tape works just fine for the cameras.
My iPhone 4 is slowly dying and will be replaced by an even older Razr which only does phone calls. I use it on the road but not at home because there’s a mountain between me and the cell tower at home/work. Just as well.
Nothing that the companies say to the public can be trusted, since they are required by superseding secret laws to lie to the public whenever ordered to. And that is likely their inclination already.
Step 1: Buy a used Samsung Galaxy S 3.
Step 2: Install Replicant.
Step 3: Encrypt your internal storage and SD card.
Step 4: Use apps like Redphone, TextSecure, ChatSecure, Orbot + Orweb w/ transparent proxying, and Courier.
As someone who doesn’t store any critical data on my iPad or in the cloud, my main concerns are the cameras and microphones in the iPad. I would have liked Tim Cook to have said something about this spying vulnerability. Until Apple offers a feature allowing me to disable these in the Settings, I’ll continue to use my front and back camera covers.
One thing to add about the fingerprint function – according to this article (http://www.wired.com/2013/09/the-unexpected-result-of-fingerprint-authentication-that-you-cant-take-the-fifth/) you may not be able to invoke the 5th Amendment as protection against law enforcement compelling you to give your fingerprint. You may have that protection with a passcode.
If you turn off your iPhone before the police get it, then you’ll be good: the phone can’t be unlocked via fingerprint after a restart (you have to use your passcode the first time after a restart, and only then is fingerprint unlocking re-enabled).
Thank you for this article Micah Lee.
This is exactly the type of information that potential consumers and users of IOS devices need.
The truth is that Apple and Microsoft OS devices are government friendly by design. This is accomplished through services installed and updated on the devices.
In other words, what good is encryption when your data can be extracted by Apple or Microsoft background OS services?
See: “Forensic scientist identifies suspicious ‘back doors’ running on every iOS device”
http://www.zdnet.com/forensic-scientist-identifies-suspicious-back-doors-running-on-every-ios-device-7000031795/
In short, these devices cannot be secured from software services and/or hardware installed by equipment manufacturers that specifically allow for government data collection. Buyer beware.
The coverage of the passcode feature leaves out the most important option. You can choose to wipe your phone after a specified number of invalid passcode entries. So, for example, if you lose your phone and it’s passcode protected, it can be set to wipe all data after the 10th failed login. Now, they don’t volunteer that it’s really wiped (as in overwriting all areas with 1, 0, random values, repeat…), but it does add more value to the feature.
I also agree that “Cloud” isn’t secure. “Just say no to any cloud if you want privacy/security.”
“Cook denied that his company has ever worked with any government to provide special ways to circumvent its security systems”
Does passively looking the other way while a government screws a democracy movement in the Arab world count? Freedom House and ‘friends’ (like the National Democratic Institute and other CIA fronts) trained the secular ‘spring’ student movement in Egypt with Apple products for revolutionary coordinating purposes, before the secular movement’s leadership was rounded up and shipped off to Omar Sulieman’s jails. How interesting that this event became an endeavor where Morsi and the generals were ultimately on the same page:
http://www.theepochtimes.com/n3/90887-egypt-convicts-ngo-workers-including-16-americans/
What do you suppose any of this might have to do with Apple’s history of dragging its feet to fix known security flaws?
Gee, I hadn’t penned a relevant satire to attach to this article, BUT Sarkozy did announce his comeback on Friday, the French resemble us in at least one regard:
http://ronaldthomaswest.com/2013/05/19/maison-de-lhistoire-de-france/
They threw the criminally corrupt out of office only to replace Sarkozy with criminal incompetence (Hollande), resulting in, you might have guessed it, opening the door to return the criminally corrupt to office. But hey, Sarko is a survivor; like Berlusconi, he loves the immunity from prosecution that comes with the job (as prosecutors look into whether he took millions in illegal cash from Gaddafi – nothing quite like murdering the witnesses against you, eh?)
The snooping is one thing. But it’s how the Feds are using the data that is really a big problem. Does anyone here really think that secret data-mining of every American’s entire digital existence is not already being used to sabotage and disrupt those individuals whom the Establishment deems a threat? Remember, Cointelpro was used to illegally and extrajudicially punish many people who were simply fighting for social justice and equal treatment under the law. Shortly before he was assassinated, MLK was declared a threat of the highest magnitude by our lovely FBI. They attempted in every way possible to disrupt and discredit the man. They even sought to blackmail him into killing himself.
Now, almost 50 years later, is there anyone who doubts that the alphabet soup agencies are at their Cointelpro and MkUltra games again? Anyone doubt that they’ve gotten much better at disrupting these “enemies” of the state? Greenwald scratched the surface in late February when he wrote about GCHQ’s efforts to destroy reputations online.
Make no mistake, illegal covert ops are being run against American citizens at the present time. And online disruption is only one of many avenues of attack. Career sabotage, disruption of personal, professional and business relationships, framing for crimes, slander, harassment, etc, etc. As before, those being most vigorously targeted are those fighting for justice, accountability, etc.
Welcome to Amerika!
“Now, almost 50 years later, is there anyone who doubts that the alphabet soup agencies are at their Cointelpro and MkUltra games again?”
The key word here is “again”. They only focused on different groups and individuals in the interim. They just moved south of the border for a while in order to prop up banana republic dictators. Who cares about them spanglish speaking folks anyway? They got no clout, so oppress away.
Thank you for posting, Dr. Each honest remark and discussion about the issue has the potential to help innocent targets in some way, and I simply appreciate your acknowledgement the problem exists.
I’m neither doubter nor spectator, but a long-time target, so I have had plenty of time to form opinions about this peculiar institution and the social environment it thrives in. I’ll refrain from using four-letter words when I express them (this time).
I think most Americans have no clue, and a clue is prerequisite to doubt. Others, who I sometimes refer to as COINTELPRO deniers, have no doubts — they enthusiastically approve, gave consent, and are willing to collaborate with any sufficiently shiny badge — real or not — asking for a little gang-stalking help. I think the most vociferous COINTELPRO deniers are salaried surveillance role players. Others don’t care about anything on their radar which does not personally affect themselves. I would classify them as inconvenience averse doubters. Lots of labels for sure… But the ones I was tagged with — traitor, terrorist, communist, etc., along with the death-threats, stalking, financial / medical interference, and digital monitoring which accompanied those labels — did not leave me sensitive to Americans’ tender sensibilities about their self-image.
Responsibility for COINTELPRO — and all the crap politicians Americans repeatedly promote — lies at the feet of the clueless and the apathetic. I do look forward to the day they are permanently, publicly stained by their totalitarian predilections. They are already on the hook for Bush’s and Obama’s malicious stupidity. Why would anyone with two brain cells to rub together give them the benefit of the doubt?
Indeed, See Dr. Bruce Ivins, Gary Webb, Aaron Swartz, etc. The list of inexplicable suicides grows.
As Micah Lee says, it’s a good start but there is much more to be done. Users shouldn’t have to be tech gurus to figure out which functions are covered by encryption and which are not, or which services will, if used, effectively negate the encryption that the user thinks s/he has put in place. At least several of the tech companies have started competing on user privacy and security. It’s a start.
If one chooses “turn off simple passcode” then enters an N digit numerical code, one still has the numerical keypad for passcode entry, and anyone attempting brute-force will not know the value of N. Combine this with “erase device after 10 attempts” and brute-forcing is essentially impossible, and code entry is easy.
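The advantage of hiding N can be quantified. A quick sketch (the 4-to-10-digit range and the 5-second rate limit mentioned elsewhere in these comments are illustrative assumptions):

```python
# If the attacker doesn't know N, they must search every plausible length.
# Assuming numeric codes of 4 to 10 digits:
total = sum(10 ** n for n in range(4, 11))  # 11,111,110,000 candidates
known_4 = 10 ** 4                           # vs. 10,000 if length is known to be 4

# At one attempt per 5 seconds (the on-device rate limit), exhausting the
# unknown-length space would take centuries:
years = total * 5 / (365.25 * 24 * 3600)
print(f"{total:,} candidates, about {years:,.0f} years at 5 s per attempt")
```

Even without the erase-after-10 setting, an all-digit code of unknown length is far harder to exhaust than a known 4-digit PIN.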
As for the “faith” in iOS 8 encryption, you think there might be a news article or two if someone successfully hacks an iOS 8 device?
I’m not so sure about “erase device after 10 attempts”. That would be scary, especially if the user has no backup of his/her current data. How about just waiting a certain amount of time after 3 failed attempts, and more time after each subsequent failed attempt? It would also have to be alphanumeric in this case.
Just do local backups on your PC/Mac via iTunes (which you can encrypt by checking a checkbox next to the button to backup) – this also syncs the device so your contacts/data etc. are always ready even if you had a new iPhone and lost your backups for whatever reason.
Of course you should be backed up. That’s a separate issue.
If you’re “not sure” about erase after 10, then you don’t turn it on. Brute-forcing is still essentially impossible because of the escalating time delays between attempts.
Couple of comments… Bit of a pessimistic tone to the article – it’s important to keep in perspective that Apple is the only major smartphone OS manufacturer whose product is not your information (as it is for Google and Microsoft) – i.e. the honeypot that the Govt. wants in the end. Apple has the best business case for taking privacy seriously out of the big guys (Apple, Google, Microsoft).
With this announcement Apple is the first smartphone vendor making your iPhone data encrypted by default – and here is the big deal, this will put the kibosh on casual/easy snooping of iPhone contents without warrants – that’s the hurdle surmounted here. Many police departments were doing this previously, and that door will now be shut. Google has announced it will do something similar with their next Android release (they had encryption available previously but didn’t have it turned on) – Microsoft hasn’t said anything, which isn’t surprising (as they were called out as an “enthusiastic partner” to the NSA in the NSA’s internal documents).
“This follows encryption improvements by Apple’s competitors Google and Yahoo.” Um, but that wasn’t for smartphones though, this applied to Google’s/Yahoo!’s internal e-mail systems, after it was revealed the NSA was actively harvesting/stealing data from them without the companies’ knowledge (that wasn’t the case with individual iPhone users). For smartphones, which is what this article was talking about, Google announced that it would follow Apple’s lead with their next Android release.
“However in this case privacy comes at a price: if your iPhone breaks and you get a new one, Apple won’t be able to restore all of your contacts for you.” Bit of a red herring here… If you have iTunes and are not backing up to iCloud, you can do easy, quick encrypted backups to your PC/Mac, and even if you didn’t have a backup – if you lost your iPhone and got another, you’d just connect it to iTunes and it’d sync your contact lists etc. from your PC/Mac right into the new iPhone… easy.
I agree 100%, and should have read your post before I duplicated it with mine.
It seems like you’re assuming somebody has to use the user interface on the phone to do their brute force attack. Is that true? Couldn’t the encrypted data be copied off the device and the brute forcing be done off line?
@Dave
No, the brute force attack would *have* to take place on the device itself. The filesystem encryption is seeded with a globally unique ID that not even Apple knows, embedded at manufacture time in the Secure Enclave portion of the A7/A8 chips the phone runs on. This entanglement is also what enforces the rate limiting on password attempts (a 5-second delay between failed attempts). It would take approximately 14 hours to try all 4-digit PINs, and if you chose a complex password consisting of 10 upper- and lower-case English alphabet characters and the numbers 0-9, it would take about 9x the age of the universe to try all possible keys (no, really).
Brute force attack becomes impossible with either passcode type if you set the security setting to erase the device after ten failed attempts.
“Brute force attack becomes impossible with either passcode type if you set the security setting to erase the device after ten failed attempts.”
… unless you’re *really* lucky! ;)
Thanks, Glenn!
Apple, of course, has to do enough to persuade people to keep giving it their data. The encryption measures they’ve announced are more than adequate to ensure the continued fealty of the Appletariat. They accept Apple’s proclamations based on faith, since they lack the technical acumen to form their own judgments.
Somewhere, at this very moment, a celebrity is uploading their nude photos to the iCloud.
Cook wrote in his open letter. “We have also never allowed access to our servers. And we never will.”
“And we never will.”
Which in itself proves the statement is a lie (there is no need to elaborate on why – if you don’t already know…well, sorry).
Maybe Apple didn’t have to give the government any special access (a back door) like Yahoo or Google did, since its security was such a joke that no back door was needed? Although the fact that the Snowden documents listed Apple as a PRISM partner makes me seriously question Tim Cook’s statement. I think he should take a lesson from James Clapper and realize that it is better to say nothing at all than to lie (if he is in fact lying). If Clapper had not lied to Congress, Snowden might still be working as an NSA contractor, having not leaked any documents.
I, for one, don’t really believe what the tech giants are telling me. I only half believe Yahoo because I’ve seen documents. NSA/GCHQ et al., it seems to me, are not just sitting around while these guys encrypt the data they demand. Especially Google, which has worked hand in glove with the NSA on industrial espionage for its own benefit, as well as giving them everything they have asked for – they will need to go a long way further than this to regain my trust. Even then, I really don’t need to use their services, and I don’t.
If anyone out there is tech savvy enough I’d like to know if verifiable encryption is possible so no trust in any corporation is necessary.
I think your idea of verifiable encryption can be easily integrated, although the process would be more time consuming for both backup and restoration. I’m not an iCloud user, but I would assume iCloud is like an internet hard drive where users can store and retrieve their data. If that’s the case, users should be able to upload their own password-encrypted files. If Apple was really serious about their claim, they should implement an ability to extract the data that users want to back up (I don’t know if this can already be done). For example, if I wanted to back up my contacts and pictures, I should be able to extract them separately into one directory. An encryption app can then be used to encrypt and compress the files into one file that can then be uploaded to iCloud. Gosh, that almost sounds like a passworded zip file. Sounds good to me. Being able to do this can at least assure users that their files ARE encrypted (verifiable encryption). The users should know, because they would have to do it themselves. To restore, users just download the backup file, use the password to decrypt/unzip, and run restore from the iPhone. Of course, these passworded backup files would be vulnerable to brute force attacks. The more sensitive the data, the longer and more complex the password should be.
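The encrypt-before-upload workflow described here can be sketched in a few lines. This is a toy illustration only: the SHA-256 counter-mode keystream below is homemade and unauthenticated, and real use should go through GnuPG or a vetted crypto library. But it shows the property the commenter wants, namely that only the password holder can decrypt, no matter where the blob is stored:

```python
import hashlib
import os

def derive_key(password: bytes, salt: bytes) -> bytes:
    # Slow key derivation (PBKDF2) so brute-forcing the password is expensive
    return hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy SHA-256 counter-mode keystream -- for illustration only
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(password: bytes, plaintext: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(password, salt)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return salt + nonce + ct  # this opaque blob is what gets uploaded

def decrypt(password: bytes, blob: bytes) -> bytes:
    salt, nonce, ct = blob[:16], blob[16:32], blob[32:]
    key = derive_key(password, salt)
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))
```

The cloud provider only ever sees the salt, nonce, and ciphertext; without the password it cannot recover the plaintext, which is exactly the “verifiable” property: you performed the encryption yourself before anything left your machine.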
Hackers did it, huh? NSA has hackers. And those nude photos are already available to them. No need to hack. They just want to make some extra dough.
The whole problem with NSA surveillance is that they make the systems vulnerable to anyone, including themselves. They are more efficient, besides the fact that the US has most of the global internet traffic directed to or passing through it. But that doesn’t mean that other governments or criminals can’t do what they do, especially breaking into systems intentionally made vulnerable by the NSA’s undermining of protocols and systems. Apple’s security track record isn’t the best; their iTunes Connect problem last year didn’t get much end-user attention, but they had to shut down their app publication systems for almost 2 months. I use Apple products because I have to for my work. But I don’t trust them with even a single bit of any relevant information.
You referred to them as criminals. Is it possible that the ones behind these nude photos could be current or former NSA employees themselves? News says the hacker is on the run from the FBI. Pffft! Like they know who did it. That’s why another one popped up.