YESTERDAY, APPLE CEO TIM COOK published an open letter opposing a court order to build the FBI a “backdoor” for the iPhone.
Cook wrote that the backdoor, which would remove limits on how often an attacker can incorrectly guess an iPhone passcode, would set a dangerous precedent and “would have the potential to unlock any iPhone in someone’s physical possession.” In this instance, the FBI is seeking to unlock a single iPhone belonging to one of the killers in a 14-victim mass shooting in San Bernardino, California, in December.
It’s true that ordering Apple to develop the backdoor will fundamentally undermine iPhone security, as Cook and other digital security advocates have argued. But it’s possible for individual iPhone users to protect themselves from government snooping by setting strong passcodes on their phones — passcodes the FBI would not be able to unlock even if it gets its iPhone backdoor.
The technical details of how the iPhone encrypts data, and how the FBI might circumvent this protection, are complex and convoluted, and are being thoroughly explored elsewhere on the internet. What I’m going to focus on here is how ordinary iPhone users can protect themselves.
The short version: If you’re worried about governments trying to access your phone, set your iPhone up with a random, 11-digit numeric passcode. What follows is an explanation of why that will protect you and how to actually do it.
If it sounds outlandish to worry about government agents trying to crack into your phone, consider that when you travel internationally, agents at the airport or other border crossings can seize, search, and temporarily retain your digital devices — even without any grounds for suspicion. And while a local police officer can’t search your iPhone without a warrant, cops have used their own digital devices to get search warrants within 15 minutes, as a Supreme Court opinion recently noted.
The most obvious way to try and crack into your iPhone, and what the FBI is trying to do in the San Bernardino case, is to simply run through every possible passcode until the correct one is discovered and the phone is unlocked. This is known as a “brute force” attack.
For example, let’s say you set a six-digit passcode on your iPhone. There are 10 possibilities for each digit in a numbers-based passcode, and so there are 10^6, or 1 million, possible combinations for a six-digit passcode as a whole. It is trivial for a computer to generate all of these possible codes. The difficulty comes in trying to test them.
One obstacle to testing all possible passcodes is that the iPhone intentionally slows down after you guess wrong a few times. An attacker can try four incorrect passcodes before she’s forced to wait one minute. If she continues to guess wrong, the time delay increases to five minutes, 15 minutes, and finally one hour. There’s even a setting to erase all data on the iPhone after 10 wrong guesses.
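To make the cost of those delays concrete, here is a small Python model. The delay schedule follows the paragraph above; everything else is purely illustrative.

```python
# Illustrative model (not an attack tool) of how iOS's escalating
# delays stretch out a brute-force sweep of six-digit passcodes.

def lockout_delay(wrong_guess_number):
    """Seconds of forced waiting after the Nth consecutive wrong guess."""
    if wrong_guess_number <= 3:
        return 0            # first few misses: no delay yet
    if wrong_guess_number == 4:
        return 60           # one minute
    if wrong_guess_number == 5:
        return 5 * 60
    if wrong_guess_number == 6:
        return 15 * 60
    return 60 * 60          # one hour for every miss after that

def total_delay_seconds(wrong_guesses):
    """Cumulative forced waiting across a run of consecutive wrong guesses."""
    return sum(lockout_delay(n) for n in range(1, wrong_guesses + 1))

# Worst case for a six-digit code: 999,999 wrong guesses before the last try.
years = total_delay_seconds(10**6 - 1) / (3600 * 24 * 365)
print(f"about {years:.0f} years of forced delays")
```

With the delays in place, the waiting alone adds up to more than a century for a full six-digit sweep.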
This is where the FBI’s requested backdoor comes into play. The FBI is demanding that Apple create a special version of the iPhone’s operating system, iOS, that removes the time delays and ignores the data erasure setting. The FBI could install this malicious software on the San Bernardino killer’s iPhone, brute force the passcode, unlock the phone, and access all of its data. And that process could hypothetically be repeated on anyone else’s iPhone.
(There’s also speculation that the government could make Apple alter the operation of a piece of iPhone hardware known as the Secure Enclave; for the purposes of this article, I assume the protections offered by this hardware, which would slow an attacker down even more, are not in place.)
Even if the FBI gets its way and can clear away iPhone safeguards against passcode guessing, it faces another obstacle, one that should help keep it from cracking passcodes of, say, 11 digits: It can only test potential passcodes for your iPhone using the iPhone itself; the FBI can’t use a supercomputer or a cluster of iPhones to speed up the guessing process. That’s because iPhone models, at least as far back as May 2012, have come with a Unique ID (UID) embedded in the device hardware. Each iPhone has a different UID fused to the phone, and, by design, no one can read it and copy it to another computer. The iPhone can only be unlocked when the owner’s passcode is combined with the UID to derive an encryption key.
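This “entangling” of the passcode with a hardware secret can be sketched with a standard key-derivation function. To be clear, this is not Apple’s actual algorithm (the UID value and iteration count below are invented for illustration), but it shows why a guess can only be tested on the phone that holds the secret.

```python
import hashlib

# Hypothetical sketch of UID entanglement; NOT Apple's real algorithm.
# The derived key depends on a secret fused into the hardware, so a
# passcode guess is useless without the device that holds that secret.

DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")  # made-up value

def derive_key(passcode):
    # Slow, salted key derivation; the unreadable device UID acts as the salt.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

key = derive_key("48295716302")
print(len(key), key.hex()[:16])
```

Change DEVICE_UID and the same passcode produces an entirely different key, which is why copying guesses off to a faster computer gets an attacker nowhere.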
So the FBI is stuck using your iPhone to test passcodes. And it turns out that your iPhone is kind of slow at that: iPhones intentionally encrypt data in such a way that they must spend about 80 milliseconds doing the math needed to test a passcode, according to Apple. That limits them to testing 12.5 passcode guesses per second, which means that guessing a six-digit passcode would take, at most, just over 22 hours.
You can calculate the time for that task simply by dividing the 1 million possible six-digit passcodes by 12.5 guesses per second. That’s 80,000 seconds, or 1,333 minutes, or 22 hours. But the attacker doesn’t have to try each passcode; she can stop when she finds one that successfully unlocks the device. On average, it will only take 11 hours for that to happen.
But the FBI would be happy to spend mere hours cracking your iPhone. What if you use a longer passcode? Here’s how long the FBI would need:
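The arithmetic behind those estimates is simple. This sketch computes the worst-case time for each passcode length at 12.5 guesses per second, assuming the backdoor has removed the delays:

```python
# Worst-case brute-force time at 12.5 guesses per second (80 ms each).
# On average the attacker finds the code in half this time.
GUESSES_PER_SECOND = 12.5
SECONDS_PER_YEAR = 3600 * 24 * 365

for digits in range(6, 12):
    worst_seconds = 10 ** digits / GUESSES_PER_SECOND
    if worst_seconds < SECONDS_PER_YEAR:
        print(f"{digits} digits: up to {worst_seconds / 3600:,.1f} hours")
    else:
        print(f"{digits} digits: up to {worst_seconds / SECONDS_PER_YEAR:,.1f} years")
```

An 11-digit code comes out to roughly 254 years in the worst case, or about 127 years on average.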
It’s important to note that these estimates only apply to truly random passcodes. If you choose a passcode by stringing together dates, phone numbers, social security numbers, or anything else that’s at all predictable, the attacker might try guessing those first, and might crack your 11-digit passcode in a very short amount of time. So make sure your passcode is random, even if this means it takes extra time to memorize it. (Memorizing that many digits might seem daunting, but if you’re older than, say, 29, there was probably a time when you memorized several phone numbers that you dialed on a regular basis.)
Nerd tip: If you’re using a Mac or Linux, you can securely generate a random 11-digit passcode by opening the Terminal app and typing this command (the %011d format keeps any leading zeros, so the result is always 11 digits):
python -c 'from random import SystemRandom as r; print("%011d" % r().randint(0, 10**11 - 1))'
It’s also important to note that we’re assuming the FBI, or some other government agency, has not found a flaw in Apple’s security architecture that would allow them to test passcodes on their own computers or at a rate faster than 80 milliseconds per passcode.
Once you’ve created a new 11-digit passcode, you can start using it by opening the Settings app, selecting “Touch ID & Passcode,” and entering your old passcode if prompted. Then, if you have an existing passcode, select “Change passcode” and enter your old passcode again. If you are setting a passcode for the first time, tap “Turn passcode on.”
Then, in all cases, tap “Passcode options,” select “Custom numeric code,” and enter your new passcode.
Here are a few final tips to make this long-passcode thing work better:
By choosing a strong passcode, the FBI shouldn’t be able to unlock your encrypted phone, even if it installs a backdoored version of iOS on it. Not unless it has hundreds of years to spare.
Many thanks for your article – it should help many to protect themselves from brute-force intrusions into their iPhones.
I believe your math respecting an “average” number of guesses needed to crack the passcode is incorrect. You assume that 50% is that number of tries. Statistically you have to compute the number of failed tries that would be necessary to get to 50%. For example, in a six digit numbers-only passcode, the chance of failure is one minus the reciprocal of 10^6, or 1-1E-6, or 0.999999. How many consecutive failed tries have a 50% chance of happening without a success, i.e., 0.999999^x = 0.5? To compute that you have to divide the log of 0.5 by the log of the probability of failure for each trial (0.999999 in this example). So the solution is: x = log(0.5)/log(0.999999) = 693,147 tries, not 500,000 tries as you would have it. When you do this for various significantly long passcode lengths and character choices, you will find that the 50% chance point is consistently around 69-70% of the total possibility number.
Again, thanks for the great article!
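For what it’s worth, both figures are easy to check numerically. The 693,147 result assumes independent random guesses with replacement, while a sequential sweep that never repeats a guess expects to hit the right code about halfway through:

```python
import math

N = 10 ** 6  # possible six-digit passcodes

# Sequential sweep (no repeats): on average the right code sits about
# halfway through the list.
sequential_avg = N // 2

# Independent random guesses with replacement: the median number of tries
# solves (1 - 1/N) ** x = 0.5, as computed in the comment above.
replacement_median = math.log(0.5) / math.log(1 - 1 / N)

print(sequential_avg, round(replacement_median))
```

The two models answer slightly different questions; a brute-force attacker who works through the list in order never guesses the same code twice.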
I wonder how long an iPhone will keep working in such a brute force attack scenario. I doubt the phone will be functional after a few decades, maybe even less. Anyone have a guess as to the MTBF?
Micah, your article is spot on! Not sure it is so practical, but an iPhone with a fingerprint reader would be a good solution. However, I would like to point out that by common usage of language, Apple’s iPhone design inherently has a backdoor built into it, because the architecture is such that the security can be bypassed by modifying the OS and/or firmware. A design without a backdoor would not allow this kind of attack. The FBI is not asking Apple to build a backdoor. They are asking Apple to write an EXPLOIT which will in fact exploit the existing backdoor.
I’d say more they’re asking Apple to write an exploit that would exploit the existing security bug. One that, according to today’s news, Apple is planning on patching: http://www.nytimes.com/2016/02/25/technology/apple-is-said-to-be-working-on-an-iphone-even-it-cant-hack.html
what about android devices?
Great info. But may I suggest using an alphanumeric passcode? As far as my knowledge goes, it’s more secure than using just numbers as a passcode. Use alphabets (a few caps and a few small), numbers, and some special symbols.
iPhone lock screens only have 0 1 2 3 4 5 6 7 8 9 as options for the combination.
No longer true. I use upper/lower case letters, special characters, and numbers on my iPhone 6.
The effort by the FBI to force Apple to crack open phones should be compared to FBI use of Stingray cell-site simulators, which allow any phone, including an iPhone, to be tracked without a warrant – a good overview of that issue is available from the Electronic Privacy Information Center.
As of Sep 2015, the Justice Department says that it will now only use such devices if a warrant is first issued – but Stingray systems have been delivered to dozens of local police agencies, who have purchased them using Department of Homeland Security grants (San Francisco, Baltimore, dozens of others etc.). These devices can also capture the cell phone data transmissions (conversations, texts, GPS, etc.). The local police agencies in question refuse to discuss their use of the devices, using the same boilerplate response based on “several state law statutes, sections of the US Code, and Executive Order 13637, that President Barack Obama signed last year.” (this is probably what the FBI told them to say, and local police agencies still use Stingrays without warrants)
Similarly, if the FBI gets its hands on what one could call an Apple iOS simulator, it could break into any phone and take all the data off it. Local police agencies, repressive foreign governments, and organized criminal gangs would soon have the same capability.
What is also telling is where the FBI investigation is being directed – towards an attack on the privacy of American citizens – and what it is ignoring – that is, investigating Farook’s wife’s ties to terrorist groups in Saudi Arabia and Pakistan – for example, Farook married his wife in Saudi Arabia, with a group of unknown people in attendance. But what does the FBI have to say about that?
“I want to be crystal clear here — we do not see any evidence so far of a plot outside the continental U.S.” – David Bowdich, the F.B.I. assistant director in charge of the Los Angeles field office (NYTimes Dec 07)
Yes, that’s the right response for an FBI official with long-term career plans – and this is the fundamental problem – as just one example, Saudi Arabian and UAE military forces are currently co-operating with Al Qaeda in the Arabian Peninsula (AQAP) to launch attacks on Houthi rebels fighting in Yemen, and that’s the same AQAP behind the bombing of the USS Cole.
However, for diplomatic and arms-dealing and oil-money reasons, investigations of Saudi-linked terrorism by FBI agents are not allowed – for example, FBI Agent John O’Neil’s attempt to investigate the USS Cole bombing in Yemen was scuttled by the State Department and his FBI career was terminated soon after:
http://www.pbs.org/wgbh/pages/frontline/shows/knew/could/
Another point is that trained spies (and trained terrorists) follow the ‘hide in plain sight’ rule, as did the 9/11 hijackers – who made all their plans in face-to-face meetings, and who posed as wealthy young Saudis learning to fly jetliners – they probably wouldn’t even encrypt their phones at all, and certainly didn’t use them to discuss their plans.
All told, it’s pretty obvious that the FBI is just hoping to use this international terrorism case as a Trojan horse to increase their abilities to conduct domestic mass surveillance of American citizens.
Micah,
Thank you for the article. I also read and enjoyed your article on using Diceware to create super strong passphrases some time back. Started using Diceware and a password manager since then. :)
Just curious, do you personally employ this 11-digit passcode on your phone now, or use a different method like Diceware?
Thanks!
When I was doing the math for this article, I switched to a brand new entirely random 11-digit passcode. It turns out that if you type it into your phone like 20 times a day, it takes less than a day to memorize (at least for me).
Thank you for the quick response! So would using Touch ID along with this defeat the whole purpose? Reading other articles, it’s always mentioned that since the introduction of Touch ID, it’s been possible for users to have much stronger passcodes than normal, since they’ll rarely need to enter them manually. But at the same time, you can apparently be forced to give up your fingerprint but not your passcode. Do you use or suggest Touch ID use? Perhaps a mix of the two at certain times?
Thanks!
You shouldn’t use Touch ID to unlock your phone. I think it’s totally fine for approving App Store purchases or for Apple Pay, though.
A story always needs a hook, but it is not sufficient to talk about encryption only with respect to the iPhone. What about those of us who have smartphones that are not iPhones? Needless to say, the problem is more general. Would it be satisfactory if iPhones were FBI- and NSA-proof, but everyone who does not have an iPhone were subject to unwarranted snooping, carrying their phones at their own risk if they do not consent to the government’s invasions of their privacy? That can’t be right. My Samsung Galaxy bill is $50 per month; the average iPhone bill is double that or more.
I don’t know what this stuff about testing on-device or off-device is about. Clearly, the testing would need to be done against the device itself because the secret key can’t just be copied to another device. What could potentially speed it up is if the key derived from the PIN and hardware could be generated off-device – that’s the process that takes 80 milliseconds per passcode attempt – but it would still need to be tested on the device itself, as far as I know.
Start calling things by its name. why?
Torture=enhancement interrogations techniques?
encryption=know it all to store it all
the land of the free= worst than all the kosher stalinist, kosher bolchevist, kosher hitlerianism, kosher satanjewhoo combined.
A thumblock for about 10 seconds will have you putting your password in as fast as you can. What a bunch of fools you people are. Ask any poor black kid how nice the cops can be……..
My cheapo BLU phone and SW allow 3 tries and shut you out for a time period. If they really want into your phone, you and your phone will wind up in Guantanamo for a little waterboarding persuasion, or worse, at one of the “black” sites still in use. The velvet glove is only camouflage for the sword. The “Brave New World” control can be replaced by 1984 any time they want.
Hi Micah,
Thanks for your hard, incredibly important work. I have typed the command in the Terminal app on my MacBook Pro numerous times now and have not generated a random passcode of any number of characters. What am I missing?
Thanks again.
Angus.
Thank you! Do you get an error message? After you press enter it should just display 11 random digits.
Micah Lee, thank you again! And I’ll say it again: write that book!
Micah,
The 11-digit number doesn’t encrypt/decrypt the iPhone. It only unlocks the built-in mechanism in the iPhone that encrypts/decrypts the device. It’s like a key to a lock that has its own built-in mechanism.
Apple can very easily and mischievously put in a master-key to work that mechanism, and no one else would be any wiser.
Secondly, it should be possible to isolate the operating system and intercept it before it gets to asking the password, and then redirect the system to the decryption mechanism regardless of what is entered.
I think Apple is doing the right thing to fool all the terrorists and make them feel secure using Apple products so we can go drone them down to their Allah the great.
-H
As I say in the article, “The iPhone can only be unlocked when the owner’s passcode is combined with the UID to derive an encryption key.”
But you’re right, if Apple wants to hide a backdoor in how iOS disk encryption works it’s entirely within their power. Of course that would mean that they lie about how the disk encryption works in their whitepaper, they’d probably have to keep it a closely guarded secret even from Apple engineers who work on the code, and they’d run the risk of getting discovered and completely losing the trust of their users.
If there is any truth to this speculation, it’s possible for you to find evidence by inspecting an iOS firmware image, which you can download using iTunes to flash to an iPhone. But since there isn’t any evidence, and there’s plenty of evidence to suggest that iOS is engineered as advertised, it doesn’t make sense to believe there’s already a disk encryption backdoor.
Thanks. So in a sense we can find out from its firmware image whether or not the OS has a backdoor? That’s some comfort.
Wouldn’t the time needed to crack it go up if you used letters and numbers? Instead of 10 choices per slot it would be 36, or even more if you include punctuation. So a 6-slot code would have a minimum of 36^6, or 2,176,782,336, combinations.
Go buy an iPhone and discovery why not.
Starting with iOS 7: http://www.ianswerguy.com/alphanumeric-passcode/
You can indeed use AlphaNumeric passwords.
I have a much better idea: get rid of your unnecessary and environmentally harmful iPhone that turns you into an android.
Doesn’t matter how long or strong your passcode is if the platform is compromised to begin with.
We have seen enough now to know that iOS and OS X are not secure; it is unlikely they ever will be.
To boot there are very strong indications that Apple is co-operating with the government behind the scenes.
What this amounts to is theatre to allow Apple to save face before they admit their products are compromised, this was a financial “best interest” decision to do this.
You would have to be naive in the extreme to think Apple cares about your privacy when they make so much money violating it.
You seem way too intelligent to be an American!
Except if they simply replace the hardware to be able to guess faster.
You need to explain the software stack on the iPhone.
There will be some other level of hardware/software/microkernel below iOS that could be retaining the equivalent of keypress information, which would make all this a PR exercise by Apple and the government against the gullible couch-potato public and prestitute media.
I’m with Bernie on this one. When you pry an iPhone from a terrorist’s cold dead hands, you have what Sherlock Holmes would call “a clue.” The authorities can use whatever legal tools are at their disposal to exploit that clue.
Someone is conflating legitimate law enforcement with the illegitimate mass surveillance and other abuses, i.e. national security letters, no-fly lists, kill-lists, parallel construction, etc.
I keep advocating for a C++ library with Strong Encryption. Similar to the boost libraries or added to the Standard Template Library. If you want security, you have to build it from the ground up.
The implicit question here is “Who do you trust, Apple or the FBI?”
The correct answer is: “None of the above.”
The implicit question is does Apple (or anyone else) have a right to produce a device with security features which can’t be disabled? That is, must all devices have a backdoor?
It is possible the courts will rule that passing laws is the responsibility of the Legislative Branch, and that in absence of a law mandating that all products have a backdoor, they can’t compel Apple to create one. However, this is an archaic point of view from the days when all the courts could do was compel third parties to turn over evidence directly related to a crime. Nowadays, law enforcement can compel the active cooperation of third parties to solve crimes. Those withholding their full cooperation become accomplices in the crime – in other words, terrorists.
So the courts will determine if Apple has the capability to build a back door. If the answer is yes, then they will be required to do so. This saves a lot of unnecessary debate in Congress. In fact, Congress itself has ceased to be relevant; perhaps the Capitol could be turned into a prison for bad Apples.
First of all, benitoe, large corporations, especially those with the size and sheen of prestige that Apple has, can do essentially what-ever they want to do. You know that.
*If a large corporation person (b/c the SCOTUS has said Corp.s are people too.) woke up one day and declared you owe them lots of money, for Lord knows what, what are you gonna do? Nothing, that’s what.
Secondly, the Capitol is already plumb full of bad Apples (and, theoretically, it only takes one to spoil the whole bunch), but I wouldn’t necessarily call it a “prison,” iykwim.
The FBI sees Apple making out with the NSA and the CIA and feels neglected. So they’re trying to horn in on the action.
I think the FBI is being a little gauche by going public. Apple operates in China, so if they let the FBI enter by the backdoor, they’ll have to do the same for China. So yes, Apple can do whatever it likes, but it doesn’t want a reputation for being too easy. That could affect its market value.
And maybe you’re right about the Capitol – keep it for the top class criminals.
Apple has already cooperated to an incredible degree with FBI’s investigation, and handed over all information they have available. The question now is: Can the government prevent a company from selling secure products so that law enforcement can spy on users when they want to? If FBI wins, then they have precedent to force any company to backdoor any product.
I disagree. Your question has already been answered affirmatively, by the Snowden incident. The answer is yes. The government(s) of the world can compel pretty much anybody to do pretty much anything. That’s sort of their job. They do, however, seem to have some trouble breaking high-level encryption. Putting your faith in the legal system or the good will of corporations is misguided. Strong encryption is based on mathematics. That is real power in the hands of real people. Better to put your faith in the maths.
The reason I keep bringing it up, is that right now is an opportune time to develop a C++ library for this stuff. The number of people who can actually write a correct, high level, encryption library is small. The number of people who can write advanced C++ template libraries is also small. Capitalism is about bringing talented people together to solve technical problems. Talented people and a bit of money can solve this problem.
Using the Boost random library as a guide to develop this encryption library will make development go quickly. You can enlist cryptocurrency programmers to help develop crypto-server technology. The money is there. The people are there. The solution is optimal.
BTW, Great column, I enjoy reading it.
There are already high-quality C++ encryption libraries, like the Crypto++ library, which any C++ programmer can include in their project. There are similar libraries for all programming languages.
There are no example code snips showing how the classes are used and some of the classes have no descriptions at all.
This is not a high quality C++ library–>”This reference manual is a work in progress. Some classes are lack detailed descriptions.”
It has potential though.
Remember, no matter how secure your passcode, the FBI can torture you until you reveal it. So equally important to having a strong passcode is to practice to increase your tolerance to pain. G. Gordon Liddy, who worked at the FBI but is more well known for his role in Watergate, used to demonstrate this at parties by holding his hand over a candle until it burned. Luckily, you just have to hold out long enough to give 10 incorrect passcodes, and then the phone will wipe the data. I’m not sure if Apple will change the default setting for those who have lower pain thresholds, but if not, I’m sure it will be included in their next iOS.
I’ve not heard this suggestion before, but the next step for device manufacturers is to give us the option of multiple passwords. For example, we could create one password to unlock and another to secure-erase everything, or multiple fingerprint options with the same idea. The person or organization forcing you to unlock your device would never know what happened. This is probably very easy to implement.
So Micah, if you know the right people, please pass along my idea.
….. with quantum computers around the corner, having a self-destruct PIN is critical. The idea is we have 3 numbers, 1 to unlock and 2 others to self-secure-wipe, so if your unlock # is 123456, then your self-destruct numbers will sandwich this number, 123455 and 123457; therefore no amount of computing power can crack open your device without it self-erasing.
The FBI should ask the NSA to query Bluffdale. I am sure their keystroke loggers have stored the pass phrase there. This drama is just a ruse to turn public opinion against encryption.
Not only that, it is also theatre to allow Apple to appear to care for their customers’ privacy and rights, when IMO they are complicit in the entire thing.
“It’s also important to note that we’re assuming the FBI, or some other government agency, has not found a flaw in Apple’s security architecture that would allow them to test passcodes on their own computers or at a rate faster than 80 milliseconds per passcode.”
By assuming that (sincerely), you’re giving away the fact Snowden’s files were mute on this specific topic. In other words, From-Moscow-With-Love doesn’t have the relevant files, and neither does he have the other files which were part of the same directory.
But you’re also assuming Apple itself hasn’t built any backdoor into its 5C model, whereas most if not all new electronic devices have been equipped with one since the beginning of e-time before being dumped on the market. In other words, Apple couldn’t access its own product against the user’s will if it tried… Supreme level of class and dignity for a company whose fiscal habits might have led us to doubt such a hypothesis…
Bottom line: everyone seems to be accepting the premise, namely that there’s no backdoor configured as it is, AND that the FBI hasn’t yet managed to crack the phone’s encryption on its own.
I’ve read quite a few TI op-eds in which the MSM’s gullibility (conscious or not) when it comes to national security matters and information handed to them by government agencies was (rightfully) pilloried as a noxious parrot-syndrome.
Since neither this article nor Jenna McLaughlin’s provides any (technical) explanation as to why you’re assuming what you’re assuming, I’m inclined to wonder whether the same objection couldn’t be made to TI’s coverage of this story for the past days…
Two technical questions :
1/ Let’s say Farook walked past an IMSI-catcher days or weeks before the San Bernardino massacre, carrying his operational iPhone. Logically, the passcode he entered earlier that day got stored somewhere on the phone. Why wouldn’t the device catch that piece of information as well ?
2/ Answering another comment, you wrote: “[a]nyone who has a copy of the backdoored version of iOS, as long as it’s digitally signed by Apple’s key, could install it on an iPhone in their physical possession”. Suppose a regular user (not a techie) is partly using the cloud to store some data.
a/ What would prevent any hacker from injecting malware into a third-party’s architecture, which the user would then automatically download onto his phone without even noticing it, thereby altering passcode features ?
b/ Energy is carrying meaning. Indeed, bits get carried through electricity. Provided enough energy is generated, how could any passcode prevent a radar similar to the CTX4000 featured in a 2013 conference by Jacob Appelbaum (What do they say again ? “Hail Master Troll !”) from exfiltrating electric data from any (i)phone whatsoever ?
As I haven’t read all comments below, I hope mine is not in any way redundant…
It’s easy to speculate about existing backdoors in iPhones, but without any evidence it’s only speculation. If there is an existing backdoor, evidence of it would exist in the iOS firmware or in iPhone hardware. Until someone finds such evidence, though, I’m going to believe the more likely scenario: that iPhone security works the way Apple describes it in their technical whitepaper.
The passcode you enter on an iPhone does not get stored on disk at all — that would be a terrible security choice, and there’s no reason for it. And that’s really not how IMSI catchers work.
The iOS passcode functionality can only be changed with an iOS update, not through any data synced to the cloud. Sort of like how the functionality of the Windows start menu doesn’t change when someone puts a new file in your Dropbox account.
I think you’re trying to describe side-channel attacks. Maybe it’s possible there’s a side-channel attack that can be used to extract the UID from an iPhone, but no one has published any research about this yet.
Thank you for your answer. Some things keep puzzling me, still.
I hear what you’re saying about speculation : the sky is blue. One can pretend it’s actually orange, but until one produces irrefutable evidence it is, there’s no reason to give the assertion any credit. In other words, scientific value can only be established through (repeated) observation.
Besides your argument being used by the MSM in the broader context I described, however, wouldn’t that come down to equating a political actor (profit-motivated Apple) with nature? Unless he sat down with the company execs throughout the whole iOS development process, why would a journalist grant that political actor (Curtis Mayfield’s definition) the benefit of the doubt he systematically denies trad institutional players? Is absence of evidence evidence of absence in this case?…
Following the same logic, if we can agree most if not all electronic devices have been equipped with backdoors since IBM saw the light of day, whether consciously or not (https://truesecdev.wordpress.com/2015/04/09/hidden-backdoor-api-to-root-privileges-in-apple-os-x/), wouldn’t Apple’s 5C model not being equipped with one constitute something of a “natural” disruption worth examining ?
The new iPhone was released in September 2013. Earlier models were already equipped with the wipe-function after ten erroneous passcodes. Snowden came out of the closet in June of that year. So, basically, we can rule out radical security changes having been implemented in Apple’s then-latest model because of the NSA-gate, since the time lapse to do so was too short…
____________
1/ I understand the subtlety, yet: https://zerodium.com/ios9.html
and : https://www.youtube.com/watch?v=qhYUgno9IYk
a/ How could a surveillance state often described as the Stasi on steroids not have iOS’s full schematics? Can’t an iOS update be mimicked?
b/ What are they waiting for ? :-)
nice
Lost in the discussion of the Apple backdoor the FBI wants is the technology transfer that will occur as one or more Apple employees leave to go work for the NSA. Even if Apple keeps this backdoored version of iOS in a vault, it can’t do the same with its employees.
Great article Micah:
Thank you for helping to protect our security and privacy. Men like you, Snowden, Assange, and Greenwald help to keep us informed of the overreach of our authoritarian government.
Great article Micah….
Question: is this 11-digit passcode just numeric, or is it character-based? (You can do either on the iPhone; by default it’s numeric.)
If you use Touch ID, just power the phone off when the bad guys are moving in; with iOS you have to enter the passcode once on startup before Touch ID is enabled.
All of these estimates are for numeric-only passcodes. You could have a shorter alphanumeric passcode and achieve the same security, but you have to make it actually random.
Pass phrases are the best way.
Pick a sufficiently long sentence from a favourite book or play, or a stanza from a poem, or even a song chorus, and you’ve stopped ’em dead in the water even if they use supercomputers. Just don’t be stupid and use something super obvious and short like “Open says me” and you’re all good.
Easier to remember too.
If alphanumeric passcodes are possible, then one can greatly increase security just by using hexadecimal (base 16) numbers. Hex digits are the numbers 0-9 plus the letters A-F. A 10-digit hex passcode has just under 1.1 trillion possible values, a bit more than a 12-digit decimal passcode offers. 12 hex digits (the size of the MAC address space) allows for ~281.475 trillion choices! To estimate the time required, multiply the 13-digit time noted in the article by about 28 (281.475 trillion is roughly 28 times the 10 trillion combinations of a 13-digit decimal passcode).
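For anyone who wants to check the keyspace arithmetic above, here is a quick sketch. It assumes the article’s figure of 12.5 passcode guesses per second (80 ms per guess); the variable names are mine:

```python
# Rough check of the hex-passcode keyspace arithmetic, assuming the
# article's rate of 12.5 guesses per second (80 ms per guess).
GUESSES_PER_SEC = 12.5
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

hex10 = 16 ** 10   # 10 hex digits
hex12 = 16 ** 12   # 12 hex digits (the size of the MAC address space)
dec13 = 10 ** 13   # 13 decimal digits

print(f"10 hex digits: {hex10:,} combinations")   # just under 1.1 trillion
print(f"12 hex digits: {hex12:,} combinations")   # ~281.475 trillion
print(f"12-hex keyspace vs. 13-decimal keyspace: {hex12 / dec13:.1f}x")

# On average an attacker searches half the keyspace before hitting the code.
avg_years = hex12 / 2 / GUESSES_PER_SEC / SECONDS_PER_YEAR
print(f"average brute-force time for 12 hex digits: {avg_years:,.0f} years")
```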
Micah, question with respect to the following:
Wouldn’t it be Apple that installed this?
Secondly, would repeating this hypothetical process require Apple maintaining possession of the actual phone? Or could it do so remotely and therefore subject all device owners to the process, regardless of who has the phone?
Thanks
Anyone who has a copy of the backdoored version of iOS, as long as it’s digitally signed by Apple’s key, could install it on an iPhone in their physical possession. It could be Apple, it could be the FBI, it could be the Chinese secret police, or it could be a Russian crime syndicate.
Thanks. I feel the part about requiring physical possession of the phone is being overlooked or downplayed by folks like yourself.
I’d imagine that the odds of hackers (1) gaining possession of my phone, (2) having access and the technical expertise to utilize the alternative iOS software on my phone, (3) and then brute-forcing their way past encryption, are a negligible security risk. I also think the odds of the government unlawfully seizing my phone and obtaining a warrant from Apple to extract data seem far-fetched.
Unless the “backdoor” can be exploited en masse without possession of a user’s actual phone, I just don’t see why this Magistrate Judge’s order is such an affront to security.
You might not have to worry about it, but I presume that you’re not living under the threat of a repressive regime, you don’t get searched at checkpoints, and you probably don’t have important secrets to keep. As I wrote in the article: “If it sounds outlandish to worry about government agents trying to crack into your phone, consider that when you travel internationally, agents at the airport or other border crossings can seize, search, and temporarily retain your digital devices — even without any grounds for suspicion.”
Notwithstanding my previous clarifications of the details of the court’s order, Micah is absolutely right: creating a 13-digit passcode will make brute force attacks computationally infeasible.
And following his other suggestions will also greatly improve your security.
I’m confused. The article states that after four incorrect attempts to enter a pin, there is a delay that begins at one minute and grows longer with each wrong guess. So how would the government be able to test 12.5 pass codes per second?
The backdoored version of iOS that the FBI wants Apple to create removes those time delays. The 12.5 guesses/sec is the fastest the FBI could brute force your passcode assuming they succeed in getting their malware developed.
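The arithmetic behind that rate is easy to check. A minimal sketch (the 12.5 guesses/sec figure comes from the article; the function name is mine):

```python
# Average time to brute force an n-digit numeric passcode, assuming a
# backdoored iOS lets an attacker guess at 12.5 passcodes per second
# (the 80 ms per-guess floor from the hardware-bound key derivation).
GUESSES_PER_SEC = 12.5
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def average_crack_time_years(digits):
    """On average, an attacker searches half the keyspace."""
    return (10 ** digits / 2) / GUESSES_PER_SEC / SECONDS_PER_YEAR

for n in (6, 8, 11):
    print(f"{n} digits: ~{average_crack_time_years(n):.3g} years on average")
```

For 11 digits this works out to about 127 years, matching the article’s figure.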
Insanity… meanwhile, don’t forget to keep sharing all your personal information about yourself, friends, and family on Facebook, Twitter, Myspace, etc., photos too, on a daily or hourly basis. As long as your phone is secure, right?? What secrets do you have on there, anyway?
I wonder whether they would have more luck using the suspect’s finger to unlock the phone? Surely they still have his body somewhere…?
Yes, but when making up false stories about false-flag shootings, pointing out the obvious just won’t do. The police have forced living suspects to unlock their phones by forcing their hand, literally.
The iPhone in question is an iPhone 5c, which doesn’t have a fingerprint sensor. Also, if you don’t use Touch ID for 48 hours, you need to use the passcode to unlock it.
It’s a 5C. No TouchID.
It was an iPhone 5c. No Touch ID.
It’s a 5C. Meaning it does not have the fingerprint reader.
For the sake of accuracy, please note that the court’s order provides for the possibility that disabling the “ten attempts and it’s wiped” feature on the “subject device” be performed at an Apple facility and that, in such a case, Apple then provide the Fibbies with remote access to the phone for the purpose of attempting to brute force the passcode.
“In the matter of the search of an Apple iPhone seized . . .”
Tim Cook maintains, probably correctly, that even this procedure would create a serious risk that the code would escape into the wild, but, conducted in this manner, the process would not require handing the code to the Feds.
Did you note the above, Micah?
Please forgive one missing and one random quotation mark. My old eyes, the bifocals and the lack of an editing function here are wearing me down. ;^(
“the FBI can’t use a supercomputer or a cluster of iPhones to speed up the guessing process. That’s because iPhone models, at least as far back as May 2012, have come with a Unique ID (UID) embedded in the device hardware. Each iPhone has a different UID fused to the phone, and, by design, no one can read it and copy it to another computer.”
I’m not an Apple user, but can’t the FBI just spoof the device’s unique ID (UID) and group ID (GID)?
I have seen many posts about how to find the UID, and maybe this for the GID: https://www.theiphonewiki.com/wiki/GID_Key#Potential_attacks
https://www.theiphonewiki.com/wiki/Firmware_Keys
I just feel the FBI can do more on their own.
Also, one thing that the news networks keep missing is that this iPhone is on iOS 9.1 (released October 21, 2015) or maybe older, but nothing newer than Oct 2015. So that alone should help the FBI get into the phone.
Google 13B143 vulnerability
just food for thought.
There are no guides for learning what the UID is. It’s designed to make it so you can’t extract it from the phone, specifically to require brute force attacks to happen on the device itself, and to be slow. From the iOS security whitepaper:
It may not be easy to extract from the phone, but that’s not the same as being impossible. It has to be encoded by some physical arrangement of the device, and that means a physical-attack tool can discover it.
Encrypted and clear data: let’s call the clear-data side, where the system software resides, the “light” side, and the encrypted device, volume, or area the “dark” ™ side.
A serious encryption system cannot depend on the “light” side for its strength. That side can be optimized, developed, replaced, or simply backdoored. The only things we can count on are the encryption algorithm, the cipher and cipher mode, and the key derivation function for the password.
Apple probably uses PBKDF2, but 80 ms looks like a hardcoded default for the iteration time. Linux LUKS/dm-crypt has an --iter-time option that lets you prolong this to, for example, 4-5 seconds. It is a bit annoying to wait while unlocking, but it means that without a very expensive computing cluster, and unless you have fewer than 14 or so alphanumeric symbols in your password or passphrase, the FBI or a similar agency cannot do much in a reasonable period of time.
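To illustrate the iteration-time point, here is a generic sketch using Python’s hashlib. Apple’s actual KDF parameters aren’t public; the salt, hash choice, and iteration counts below are illustrative only. The idea is the same as LUKS’s --iter-time: raise the iteration count until one derivation takes your target time.

```python
# Sketch: a tunable key-derivation function slows down every guess.
# The parameters here are illustrative, not Apple's actual ones.
import hashlib
import os
import time

salt = os.urandom(16)  # per-device random salt (illustrative)

def derive_key(passcode: str, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA256; more iterations = more work per guess
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

for iterations in (1_000, 100_000):
    start = time.perf_counter()
    derive_key("12345678901", iterations)
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{iterations:>7} iterations: {elapsed_ms:.1f} ms per guess")
```

Whatever the iteration count, the derived key is stable for the same passcode and different for different passcodes, which is all the unlock check needs.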
It might keep the FBI out but probably not more motivated organizations. These things are usually broken by criminals. When the FBI catches the criminal they get their modus operandi or grimoire, which is where they pick up these “hacks.”
For example, you could clip the power pins on the chip “in situ” and apply a processor emulator (glomper) to read out the GUID and memory. Once you have dumped the memory into an emulator you can run a brute force attack.
There is probably a tech crib somewhere with a bunch of confiscated iPhones to experiment on and a bunch of doofuses sitting around drinking coffee working on this stuff. You do this sort of thing in non-critical situations before you need it, so that when you need it the procedure is well practiced. This is about medium-level tech. High-level techs will pull out a SQUID and sniff the data directly.
Data recovery services are a profitable industry in more ways than one. Pulling data off trashed banking computers is an old but sneaky hack.
Apple may well keep a copy of the phone’s GUID. If they have it, the government has it, whether Apple likes it or not.
But does installing software require the passcode?
I think it just requires the software to be signed by Apple.
First time i’ve seen ‘she’ used as a gender neutral pronoun in an online article. Do I just read the wrong blogs?
Do not use on-line passcode generators!!! You cannot be sure they don’t save the generated passcodes, or that they use a real source of entropy (rather than fake randomness). You could use something like Diceware, http://world.std.com/~reinhold/diceware.html mapped to a numeric-only PIN. Just don’t roll a single 6-sided die only 10 times…
You can if you understand the code. The code

python -c 'from random import SystemRandom as r; print(r().randint(0,10**11-1))'

is a simple python script. First, it imports the SystemRandom object from python’s random module:

from random import SystemRandom as r

Notice the random module docs say: “Warning: The pseudo-random generators of this module should not be used for security purposes. Use os.urandom() or SystemRandom if you require a cryptographically secure pseudo-random number generator.” And the SystemRandom object docs say: “Class that uses the os.urandom() function for generating random numbers from sources provided by the operating system.” So this line of code loads up only the cryptographically secure SystemRandom object and calls it r.
Then the next line:

print(r().randint(0,10**11-1))

can be broken up into two parts: r().randint(0,10**11-1), which generates the random number (in a cryptographically secure way), and a print() function that just displays that number to the terminal. The r() part of it initializes a SystemRandom object, and then calls the randint(a,b) function on it (see the docs for this function). Basically, it returns a random number between a and b. In this case, a is 0 and b is 10**11-1, which is 99999999999.
So in all, the one-liner imports the SystemRandom object, makes an instance of it, uses it to generate a random integer between 0 and 10**11-1, and prints it to the screen. It uses a real source of entropy, and it doesn’t save the number anywhere — if it did you’d be able to see the code that made it do that.
Note that SystemRandom does not pull numbers from a “real” source of entropy, but from a cryptographically secure pseudo-random number generator (CSPRNG) that collects entropy from multiple sources, from disk I/O operations to keyboard timings. Thanks for including the one-liner in your article, it’s one of the best ways to create a passcode. A d10 die is another great option.
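One nit on the one-liner: randint drops leading zeros, so about one time in ten it prints a number shorter than 11 digits. Since an iPhone passcode is a string of digits, you want the zeros too. A padded variant with the same keyspace and the same entropy source:

```python
# Same keyspace as the article's one-liner (0 through 10**11 - 1), but
# zero-padded so all 11 digits are shown, including any leading zeros.
from random import SystemRandom  # backed by os.urandom

pin = f"{SystemRandom().randint(0, 10**11 - 1):011d}"
print(pin)
```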
You can use Touch ID, just only register your middle finger. Then if you’re instructed to use your finger to unlock your phone, you can use up the five allowed attempts with your index finger.
Nice article. However, the estimated time for brute force approach is based on the assumption that the device UID can’t be read. It is reasonable to expect that having possession of the hardware would allow a skilled actor to do so. Certainly in less than thousands of years. :)
Can you provide a reference as to why you believe it is not possible?
I actually believe it probably is possible, just very expensive and I haven’t heard of a case where anyone has ever actually done it yet. If Apple engineers did their job really well it’s plausible that extracting the UID is too expensive to ever be worth it (but I don’t really know). Hence the disclaimer: “It’s also important to note that we’re assuming the FBI, or some other government agency, has not found a flaw in Apple’s security architecture that would allow them to test passcodes on their own computers or at a rate faster than 80 milliseconds per passcode.”
What actor could be more skilled than the NSA? If the UID is readable, I assume they can read it. I also have to assume they already know any zero-day vulnerabilities; for this to be safe, there must be none they can exploit.
Micah et al.
For the general user with only a smartphone who might not have a Mac or want to use Windows’ PowerShell (Get-Random cmdlet) to generate random number passwords, how secure would you think using an on-line password generator such as the one from Norton (see link below) would be while using the following add-ons and connection encryption:
1). Norton’s page is HTTPS encrypted (TLS_RSA_WITH_AES_256_CBC_SHA,256 BIT KEYS,TLS1.2)
2). HTTPS Everywhere with Block All HTTP Requests enabled
3). NoScript with only Norton’s JavaScript allowed (ensighten.com and google-analytics.com both disallowed/blocked)
Using your example of 11 random numbers (the Norton Password Generator will allow password lengths of 4 to 32), a user generated a list of 50 numerical passwords (maximum allowed).
If a user selects just 1 of the 50 passwords — perhaps randomly — memorizes, or only briefly writes it down on paper and does not copy/paste it to a text or word processor, how secure a process would you rate this?
Would the security diminish markedly if a user ever so briefly copy/pasted the generated number from the Norton website to a word document on the user’s computer before entering it into an iPhone passcode keypad?
The reason for the last question is that some users of password managers sometimes copy and paste their usernames and passwords from the managers to the website login dialog boxes when websites will not allow users’ credentials to auto-fill. I have assumed that (but often wondered if) a copy/paste or directly typing credentials in is as secure as the auto-fill function.
Of course, after copy/pasting, users need to ensure that they clear their computer clipboard to avoid accidentally pasting an un-encrypted password/credential elsewhere.
https://identitysafe.norton.com/password-generator#
Linking to this security disaster is irresponsible! For the love of data security do not use this link!
If you’re that paranoid about your security, go to a game shop and buy a 10 sided dice for ten cents, and roll it a few times. If you’re extra paranoid, destroy the dice afterwards.
Also, online passwords don’t protect you against the government, since the FBI is just going to subpoena the data anyway; they have no need to guess your password. In the rare case where you’re using something like protonmail.ch, where the e-mail is encrypted to a key, you’re better off with autofill. You’re more likely to use a secure password (mine are all randomly generated strings) and you’re secure against keylogger attacks.
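For anyone who likes the d10 idea above but doesn’t have a ten-sided die handy, the per-digit method is easy to mimic in software with Python’s secrets module, which draws from the OS entropy source (a sketch, not the article’s method):

```python
# Software stand-in for eleven rolls of a ten-sided die: each digit is
# drawn independently and uniformly from 0-9 using the OS entropy source.
import secrets

pin = "".join(str(secrets.randbelow(10)) for _ in range(11))
print(pin)
```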
re: “If you’re extra paranoid, destroy the dice afterwards”
I laughed at that
maybe too hard, eh
I don’t know anything about hacking passwords. If the authorities were trying the brute force attack to gain a password, would they be able to discern the length of the password at the beginning, or would they have to start at three or four character passwords and work their way up to eleven characters?
The entity trying to find your password has no special way of finding out how long the password is if the password is hashed (which nearly all major services, like Apple, do), apart from the obvious: looking at minimum and maximum password lengths (e.g. “your password must be at least 6 characters long”), and counting the dots shown on screen if we’re considering the iPhone numeric keypad passcode.
So, short answer: there’s no hacky way to find the length of a password, only really obvious means.
error: “The SUBJECT DEVICE is owned by Farook’s employer, the San Bernardino County Department of Public health” say the court documents.
Can this be done on Android? I’d love to see an article on this for us Droid fans…
You can enable encryption and use a strong passphrase. Unfortunately it seems to be the same passphrase for encrypting your storage and unlocking your screen. I am not sure if this is still the case with more recent releases of Android, but it clearly sucks to have to type a strong passphrase to unlock your screen, something you’ll do often, in public, or make the tradeoff of using a weak passphrase out of convenience.
I wonder if use of the fingerprint option is easier to hack if they have fingerprints on file?
Yes, if the device is stolen while powered on then the fingerprint unlock can be bypassed by using your ‘cooperative’ fingers. Or, a jelly thumb could be made after lifting a print from a drinking glass, or perhaps the device itself. However, the first unlock after power on will still require a pin entry. If I remember correctly, the fingerprint unlock times out after a period of time and a pin is required to unlock the device. Devices are generally more protected when powered off, and before key material is in memory. Turn off your devices and pull the battery (sorry iPhone users) before crossing borders.
What Cook did was refuse to do something that would be impossible if there were no backdoor in Apple’s iOS already. If they can update iOS without the owner’s permission, that is a backdoor in itself.
So why is the FBI asking for it? It is a coordinated PR stunt, aimed also at the Chinese government and customers to gain their trust while US sales are collapsing. It is disgusting that people will believe in its good intentions of defending privacy.
DFU mode.
I’m no skilled Linux user, but I copied the script, as Micah displayed it above, pasted it into a command line … and presto! Generated 6 different strings just for kicks. Got two that, for me, are easily memorized.
Thanks, Micah! (And thanks for the reassurance via Twitter, too.)
far shorter version to get random pin on these systems:
ruby -e 'p (rand*1e+11).to_i'
I’m way more of a python person than a ruby person. But does this implementation generate cryptographically secure random numbers? A shorter, but not as secure, python implementation would be:
python -c 'import random as r; print(r.randint(0,10**11-1))'

— but don’t use that, use the one in the article instead, as it gets numbers from /dev/urandom. https://docs.python.org/2/library/random.html#random.SystemRandom

Hey Micah,
Any chance of an android article of the same caliber coming up? I know I am not the only one who would appreciate it. Thanks!
Micah, all 3 versions get randomness from the same source: your local machine’s entropy service. Without special extra steps, such as installing haveged [1] or a consumer hardware stick [2], that source is relatively weak; there is not that much entropy on a user machine. However, since you’re collecting only about 37 bits of entropy, that source is “good enough” for the task at hand. (BTW, “urandom” means “unblocking random,” meaning it will *not* block even if the local entropy pool is empty, so for secure operations /dev/random is preferred, which actually blocks until the machine has generated enough entropy.)
One simple way to add extra entropy is to take 11 digits by your method, take another set of digits from [3], and add them digit by digit on paper without carryover (i.e. 1+2 = 3, 5+6 = 1).
Overall, as long as the FBI has no access to your local machine’s state when the numbers are generated (true for virtually all your readers), any local method to generate 11 digits is “good enough”. And of course there is a trivial, instant, and absolutely unbeatable time-proven method too: [4]
[1] https://wiki.archlinux.org/index.php/Haveged
[2] http://www.amazon.com/TrueRNG-Hardware-Random-Number-Generator/dp/B00T0XKAQM
[3] https://www.random.org/integers/
[4] http://www.amazon.com/Chessex-Dice-Sets-Opaque-White/dp/B001S6TV14
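The “add digit by digit without carryover” step described above is just per-digit addition mod 10; if the two digit strings are independent and either one is uniformly random, the combined result is too. A sketch (the function name and sample inputs are mine):

```python
# Per-digit addition mod 10 ("without carryover"): combine two PINs of
# equal length. If either input is uniformly random and independent of
# the other, the combined PIN is uniformly random as well.
def combine(pin_a: str, pin_b: str) -> str:
    assert len(pin_a) == len(pin_b)
    return "".join(str((int(a) + int(b)) % 10) for a, b in zip(pin_a, pin_b))

print(combine("15263748190", "26370001999"))  # → 31533749089
```

Note the first two columns reproduce the comment’s own example: 1+2 = 3 and 5+6 = 1.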
Let’s say for argument Apple complied. How do you force an iOS upgrade onto the locked and encrypted phone? I get the feeling these people asking for this are not very tech savvy.
You think 11 digits is hard to brute force? Try 256 bit encryption keys. They’ll never get into it.
They can force an upgrade when the OS is signed with Apple’s master key. That’s why it has to come from Apple.
Better hope that the NSA didn’t already crack AES.
Excellent journalism…keep it up and thanks!
The Secure Enclave exists on all iPhone models 5S and newer. What the requested 5C backdoor (which is only a conceptual backdoor) does is create a precedent whereby the government requires access to newer phones too. The only way to give that access on Secure Enclave phones is to literally bake in a secret master key in the encryption algorithm. One key that would break through the encryption on any iPhone in the world.
You have to wonder if Sauron didn’t use the same tactics as the FBI.
“C’mon, Celebrimbor. If you make these rings we’ll all be safer. Make one for all the kings. They’ll have the power to rule and protect! No, I promise I’m not going to make a master ring. Secret’s safe with me.”
inb4 op is a feddo and the python interpreter is compromised..
All the FBI needs to do in order to get the data off the internal “Hynix H2JTDG2MBR 128 Gb (16 GB) NAND flash” is to take apart the passcode-protected iPhone the terrorist used. Remove the flash memory from the logic board, then take another iPhone 5c with no passcode and swap in the flash memory by re-soldering it to the new iPhone’s logic board. Put it back together and access the data with no need for a passcode.
Except I’m pretty sure all components have a UID that would prevent this. Now, I could be wrong, so feel free to correct me.
The whole point of encrypting the data is so attacks like this cannot work. If the FBI could just put an in circuit programmer and read the flash chips directly they would not need Apple’s help.
Hi Micah. Thanks for this. So your argument for this, as opposed to Diceware alphanumeric passcodes, is principally convenience? In other words, there’s no reason for people already using Diceware to switch over to this approach?
Precisely. If you’re happy with your diceware passphrase it’s totally safe to stick with it.
Great article as usual. May I humbly suggest that it would benefit from a brief description of how a single die may be used to generate a truly random passcode.
A single die has bias. Use more dice, and randomize the order the dice are read – see diceware.
The intercept should know better than to refer to the “killers” as such. They’re allegedly killers. They’re allegedly guilty. There was never a trial. There was never enough evidence to hold a trial. And that evidence was tainted by “journalists” breaking into their apartment before “formal” investigative measures could be taken by “authorities”.
Like I said, the Intercept should know better than to exacerbate the propaganda of the State. Poor form.
I thought of this too. They were conveniently killed (like Osama bin Laden and Jack Ruby!) before any investigation.
Have you been able to memorize your SSN? Your phone #? Your DL #?
An SSN has 9 digits; I remember that one. Phone, yep, 10 digits. DL #, yes, 8 digits. 11 digits is not that much more, and they are easier to remember broken into groups of 2-3, like your SSN.
I am sure the fbi would have all that info on file
Re-read the article. It warns against using anything personal (such as SSN, birthdate, phone number, etc.), as they are predictable.
Or you could re-read the comment and see he is reminding us that memorizing longish numbers is commonplace, even in our mentally enfeebled age.
Have there been instances where Apple has delivered data to police/government officials that they recovered from an iCloud backup? Those backups, per Apple, are encrypted, but perhaps not using a user’s passcode: https://support.apple.com/en-us/HT202303.
It’s been reported that the account for the iPhone used by the person in the San Bernardino case had iCloud backups, but the most recent one was dated many months before the attacks. The FBI was able to gain access to that data but it had limited value, hence their intense focus on getting data from the phone.
Micah Lee has referred to the spooks in the feminine gender. Apart from Cressida Dick I wonder how many females are there in this business.
The real threat is from North Koreans and Chinese. Maybe some Turkeys and Ukrainians as well, but they are not as smart as Mongoloids. Those guys will introduce bugs that will prompt users to enter passcodes. FBI can also do the same, but as usual they are too dumb to follow Muslim folks closely while they are still alive and kicking. Waterboarding successful suicide-bombers is a very bad idea.
What the FBI could do is install bugs that will prompt the user to enter her passcode.
Another way is to provide free public wifi access and install lots of cameras in the area. The subject is probably going to use this facility someday. You can then use biometric facial recognition to identify the person and watch her enter the passcode in the taped camera films. Obviously, all of us are going to fund the project.
If you need a random number from a Windows computer, run PowerShell and type "Get-Random -Maximum 100000000000" (the maximum is exclusive, so this yields 0 through 99999999999, an 11-digit range).
Are your local backups safe if the cops seize your server? Could they copy it into their cloud and brute force it there?
You make local backups to your laptop, so if cops can access data on your laptop they could then brute force the backup password with less hassle, yes. But they can’t just send a data request to a company to access this, like they can with iCloud backups.
So, what exactly is on my phone that I’m worried about the government having? #notaterrorist #helpyourself #nothingtoseehere
You have to first be convinced that the FBI and gov’t wants your phone, presumably for shits and giggles and to monitor your revolutionary comments on TI. Fight the power!!
Or maybe TI is just trying to appeal to the lucrative “aspiring criminal” media demographic?
https://youtu.be/XEVlyP4_11M?t=25m17s
There are people in this world who help keep the secrets, thoughts and security of themselves and others in their email, photos, etc.
If you’ve nothing to hide, please post your email address and password here. Or leave your doors unlocked and post your address.
Wanting privacy is not the same as hiding something.
Does repeating this process require Apple’s taking possession of the actual phone or can it be done remotely?
The point about long passcodes improving security is an excellent one, but there are a couple more things that need to be borne in mind:
Do not fixate on a particular length. One of the reasons why so many passcodes are so insecure is that their length is known a priori. So think about using a passcode longer than 11 digits.
Do not use an easily remembered phrase or simple substitutions (like 1 for i, or 3 for e, to name two common instances). If you can program or even have good spreadsheet skills, design a program or spreadsheet to generate random codes.
Do not do stupid things like writing the code down and placing the piece of paper in your wallet or purse, or use the same passcode for multiple devices or web sites unless you do not care if they are stolen. There are free applications such as Password Vault (http://www.s10soft.com/passwordvault.htm) that you can install that encrypts and stores all your passwords, so that all you need is one (very long) password to access them.
And by the way, even if you have your home wireless network secured, unless you password-protect admin access, anybody can circumvent the security provisions by simply plugging an Ethernet cable into the router and logging on as the admin. So for goodness’ sake, change the admin password!
Not to say that this is bad advise, but if an attacker can gain physical access to your router, your wireless security is the least of your concerns.
Well, for example if your setup allows someone to log on as a guest, they can in principle access the router without entering your house, or even being on your property. It is as easy as hijacking your IP address, which Google has done on a widespread basis in the US, just driving around collecting them while taking pictures for their streetview service.
http://www.cnn.com/videos/us/2015/12/02/san-bernardino-shooting-possible-suspects-black-suv-sot-feyerick.cnn
Stop repeating lies. The interviewed witnesses say the killers are not the Muslim couple that have been accused. I have watched 3 interviews of the witnesses. They say it was three masked tall athetic white men in army fatigues with ak47s. Stop the lies. I expect better from Intercept.
A couple of suggestions:
– lobby Apple to change the ‘require password with touchid’ from 48 hours to a user chosen number.
– lobby Apple to change the # of attempts before wiping the phone.
– if you *do* use Touch ID, use a pinky or a finger on your non-dominant hand, so you can deny one is set or screw with anyone forcing you physically.
– if you *do* use Touch ID and are put into a risky position (imminent arrest, border crossing), turn off your phone, which will trigger the ‘passcode required’ feature on restart.
– instead of numeric only, use a longish passphrase with spaces, characters, and caMelCasE when possible.
– use an app like 1password – with a different but equally secure pass-phrase – to store sensitive notes, contacts, files, bookmarks. Just in case someone does get in.
Great article Micah. Keep up the good work.
These would be great features for Apple to add to iOS. And actually, I think it makes sense to stick with a numeric passcode rather than alphanumeric. I did the math and 11 digits is safe enough, and quicker and more convenient to type in compared to using a full keyboard on an iPhone. It’s also easier to generate a secure one. If you do want to use an alphanumeric passphrase, you’d want to use something like Diceware.
Micah, I’m surprised you only mention Diceware in the comments and not in the actual article. I think you should at least add a sentence or two about Diceware, it’s worth letting people know about it.
11 digits may result in 127 years of crack time, but a 3 or 4 word Diceware passphrase results in 596 or 4637441 years respectively. It’s not incredibly hard to type, as it could also be only 11 characters long. I suspect there’s also some future-proof benefit to Diceware too: what if someday they discover how to crack passcodes on real supercomputer hardware, off of the iPhone itself? Number passcodes would be totally screwed but Diceware passphrases would buy a lot more time.
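Those Diceware figures check out, assuming the same 12.5 guesses/sec rate and the standard 7,776-word Diceware list (the function name is mine):

```python
# Average brute-force time for an n-word Diceware passphrase, assuming
# the attacker guesses at 12.5 attempts/sec and knows the 7,776-word list.
GUESSES_PER_SEC = 12.5
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def diceware_years(words):
    # on average, half the 7776**words keyspace is searched
    return (7776 ** words / 2) / GUESSES_PER_SEC / SECONDS_PER_YEAR

print(f"3 words: ~{diceware_years(3):,.0f} years")
print(f"4 words: ~{diceware_years(4):,.0f} years")
```

This lands at roughly 596 years for 3 words and about 4.6 million years for 4, matching the comment’s numbers up to rounding.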
Passphrases are still numerical. They are base 26 as opposed to base 10, so they are stronger per character but still susceptible to numerical solution.
It’s actually possible to change the number of attempts before a device wipe using the Apple Configurator tool. It would be nice if some of the other parameters could be adjusted from there, however.
Does your estimate assume the brute force attack knows how many digits your code is? If they don’t know how long your passcode is, they are likely to try all the shorter ones first, correct?
Likely, yes. Each digit you add to your passphrase increases the number of combinations by 10 times. So trying all 6-, 7-, 8-, 9-, and 10-digit passcodes before working on 11-digit ones will add 28 years to the 127 year average for an 11-digit passcode.
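A quick check of that 28-year figure, again assuming the 12.5 guesses/sec rate and exhausting the full 6- through 10-digit keyspace first:

```python
# Worst-case time to exhaust every 6- through 10-digit numeric passcode
# before starting on 11-digit ones, at 12.5 guesses per second.
GUESSES_PER_SEC = 12.5
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

shorter = sum(10 ** n for n in range(6, 11))  # all 6..10 digit codes
years = shorter / GUESSES_PER_SEC / SECONDS_PER_YEAR
print(f"~{years:.0f} years to exhaust 6-10 digit passcodes")
```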
They’ll be pretty embarrassed when, after spending 28+253 = 281 years to try all the 6-, 7-, 8-, 9-, 10-, and 11-digit passcodes, they get to my passcode of twelve 1’s. It’s not foolproof, I know, but it’s still a viable strategy for those of us who can’t remember more than a single digit.
Mr. Mussolini, you are required to appear at [redacted at our convenience] an interview at our [redacted alphabet soup agency.] We have some questions about recent information that has been Intercepted.
Not likely: certainly.
Not certainly as I doubt brute force works sequentially. I’ll wager they check a bunch of 6s then 7s then 8s etc. and make a scatter pattern with emphasis on certain numerical strings, like song lyrics, or 1s or 2s … (sorry Benito)
That’s OK. I couldn’t remember whether it was twelve 1’s or eleven 1’s and now it’s wiped out all my data.
*chuckle*