TIME AND AGAIN, people are told there is one obvious way to mitigate privacy threats of all sorts, from mass government surveillance to pervasive online tracking to cybercriminals: encryption. As President Obama put it earlier this year, speaking in between his administration’s attacks on encryption, “There’s no scenario in which we don’t want really strong encryption.” Even after helping expose all the ways the government can get its hands on your data, NSA whistleblower Edward Snowden still maintained, “Encryption works. Properly implemented strong crypto systems are one of the few things that you can rely on.”
But how can ordinary people get started using encryption? Encryption comes in many forms and is used at many different stages in the handling of digital information (you’re using it right now, perhaps without even realizing it, because your connection to this website is encrypted). When you’re trying to protect your privacy, it’s totally unclear how, exactly, to start using encryption. One obvious place to start, where the privacy benefits are high and the technical learning curve is low, is something called full disk encryption. Full disk encryption not only provides the type of strong encryption Snowden and Obama reference, but it’s built in to all major operating systems, it’s the only way to protect your data in case your laptop gets lost or stolen, and it takes minimal effort to get started and use.
If you want to encrypt your hard disk and have it truly help protect your data, you shouldn’t just flip it on; you should know the basics of what disk encryption protects, what it doesn’t protect, and how to avoid common mistakes that could let an attacker easily bypass your encryption.
If you’re in a hurry, go ahead and skip to the bottom, where I explain, step by step, how to encrypt your disk for Windows, Mac OS X, and Linux. Then, when you have time, come back and read the important caveats preceding those instructions.
If someone gets physical access to your computer and you aren’t using disk encryption, they can very easily steal all of your files.
It doesn’t matter if you have a good password because the attacker can simply boot to a new operating system off of a USB stick, bypassing your password, to look at your files. Or they can remove your hard disk and put it in a different computer to gain access. All they need is a screwdriver, a second computer, and a $10 USB enclosure.
Computers have become an extension of our lives, and private information continually piles up on our hard disks. Your computer probably contains work documents, photos and videos, password databases, web browser histories, and other scattered bits of information that don't belong to anyone but you. Everyone should be running full disk encryption on their laptops.
Encrypting your disk will protect you and your data in case your laptop falls into the wrong hands, whether it’s because you accidentally left it somewhere, your home or office was burglarized, or it was seized by government agents at home or abroad.
It's worth noting that you have far fewer privacy protections when crossing borders. Even if you're a U.S. citizen entering the United States, courts have treated border searches as an exception to much of the Fourth Amendment's protection, and border agents assert the authority to copy all of the files off of your computer or phone if they choose to. This is also true in Canada, and in other countries around the world. If you plan on traveling with electronic devices, disk encryption is the only way you have a chance at protecting your data if border agents insist on searching you. In some situations it might be in your best interest to cooperate and unlock your device, but in others it might not. Without disk encryption, the choice is made for you: The border agents get all your data.
There’s a common misconception that encrypting your hard disk makes your computer secure, but this isn’t entirely true. In fact, disk encryption is only useful against attackers that have physical access to your computer. It doesn’t make your computer any harder to attack over a network.
All of the common ways people get hacked still apply. Attackers can still trick you into installing malware. You can still visit malicious websites that exploit bugs in Flash, or in your web browser, or in your operating system’s font or image rendering engines, or countless other ways. When you visit benevolent websites, network attackers can still secretly make them malicious by modifying them in transit. Attackers can still exploit services running on your computer, such as network file sharing, iTunes playlist sharing, or your BitTorrent client, to name a few.
And of course, disk encryption doesn’t do anything to stop internet surveillance. Spy agencies like the NSA, which taps into the fiber-optic cables that make up the backbone of the internet, will still be able to spy on nearly everything you do online. An entirely different category of encryption is needed to fix that systemic problem.
The different ways you can get hacked or surveilled are too numerous to list in full. In future posts I’ll explain how to reduce the size of your probably vast attack surface. But for now it’s important to know that disk encryption only protects against a single flavor of attack: physical access.
The goal of disk encryption is to make it so that if someone who isn’t you has access to your computer they won’t be able to access any of your files, but instead will only see scrambled, useless ciphertext.
Most disk encryption works like this. When you first power your computer on, before your operating system can even boot up, you must unlock your disk by supplying the correct encryption key. The files that make up your operating system are on your encrypted disk, after all, so there’s no way for your computer to work with them until the disk is unlocked.
In most cases, typing your passphrase doesn’t unlock the whole disk, it unlocks an encryption key, which in turn unlocks everything on the disk. This indirection allows you to change your passphrase without having to re-encrypt your disk with a new key, and also makes it possible to have multiple passphrases that can unlock the disk, for example, if you add another user account to your laptop.
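On Linux, the LUKS disk encryption format used by most distributions makes this indirection visible from the command line. Here's a rough sketch, where /dev/sda3 is just a placeholder for your encrypted partition:

    # Show the key slots; each slot holds a copy of the master key wrapped by a different passphrase.
    sudo cryptsetup luksDump /dev/sda3

    # Add a second passphrase that unlocks the same master key (for another user, say).
    sudo cryptsetup luksAddKey /dev/sda3

    # Change an existing passphrase without re-encrypting any of the data on the disk.
    sudo cryptsetup luksChangeKey /dev/sda3

BitLocker and FileVault do essentially the same kind of key wrapping behind the scenes; they just don't expose it this directly.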
This means that your disk encryption passphrase is potentially one of the weakest security links. If your passphrase is “letmein,” a competent attacker will get past your disk encryption immediately. But if you use a properly generated high-entropy passphrase like “runge wall brave punch tick zesty pier,” it’s likely that no attacker, not even the NSA or Chinese intelligence, will ever be able to guess it.
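If you want to generate a passphrase like that, the classic low-tech method is Diceware: rolling dice to pick random words from a long wordlist. On a Linux machine you can approximate it from the command line; this is only a sketch, and /usr/share/dict/words is a stand-in for a proper Diceware wordlist:

    # Pick seven words at random using the kernel's random number generator.
    shuf --random-source=/dev/urandom -n 7 /usr/share/dict/words | tr '\n' ' '; echo

Seven words drawn from the standard 7,776-word Diceware list works out to roughly 90 bits of entropy, which is far more than anyone can currently guess by brute force.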
You have to be extremely careful with strong disk encryption that can only be unlocked with a passphrase you’ve memorized. If you forget the passphrase, you get locked out of your own computer, losing your data forever. No data recovery service can help you, and if you give your machine to the FBI, it won’t be able to access your files either. Because that’s kind of the point of disk encryption.
Once your computer is on and you’ve entered your passphrase, your disk encryption is completely transparent to you and to the applications on your computer. Files open and close as they normally would, and programs work just as they would on an unencrypted machine. You won’t notice any performance impact.
This means, however, that when your computer is powered on and unlocked, whoever is sitting at it has access to all your files and data, unencumbered by encryption. So if you want your disk encryption to work to its full potential, you need to lock your screen whenever you step away from your powered-on computer, and, for those times when you forget, you need to set it to lock automatically after, say, 10 minutes of idling.
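Every operating system has a setting for this. As one example (this assumes a Linux desktop that uses GNOME's settings; Windows and Mac OS X have equivalent options in their control panels), you can configure it from the command line:

    # Blank the screen after 10 minutes of inactivity...
    gsettings set org.gnome.desktop.session idle-delay 600

    # ...and lock it as soon as the screen blanks.
    gsettings set org.gnome.desktop.screensaver lock-enabled true
    gsettings set org.gnome.desktop.screensaver lock-delay 0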
It’s also important that you don’t have any other users on your system who have weak passwords or no passwords, and that you disable the guest account. If someone grabs your laptop, you don’t want them to be able to log in at all.
There are a few attacks against disk encryption that are tricky to defend against. Here are some precautions you can take.
Power off your computer completely (don’t just suspend it) when you think it’s at risk of falling into someone else’s hands, like right before going through customs when entering a new country. This defends against memory-based attacks.
Computers have temporary storage called RAM (otherwise known as memory), which you can think of as scratch paper for all of your software. When your computer is powered on, your software is constantly writing to and deleting from parts of your RAM. If you use disk encryption, as soon as you successfully unlock your encrypted disk the encryption key is stored in RAM until you power your computer off. It needs to be — otherwise there would be no way to encrypt and decrypt files on the fly as you use your computer.
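If you're curious, you can see this for yourself on a Linux machine with an unlocked LUKS disk; this is just an illustration, and it requires root:

    # Print the device-mapper tables without masking key material; the long hex
    # string on the "crypt" line is the disk encryption key sitting in RAM.
    sudo dmsetup table --showkeys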
But unfortunately, laptops have ports that have direct memory access, or DMA, including FireWire, ExpressCard, Thunderbolt, PCI, PCI Express, and others. If an attacker has access to your computer and your disk is unlocked (this is true even if your laptop is suspended), the attacker can simply plug a malicious device into your computer to be able to manipulate your RAM. This could include directly reading your encryption keys or injecting commands into your operating system, such as closing the screen lock program. There is open source software called Inception that does just this using a FireWire cable and a second laptop, and plenty of commercial forensics hardware does the same. It's worth noting that new versions of Mac OS X use a cool virtualization technology called VT-d to thwart this type of DMA attack.
Demonstration of a cold boot attack. Photo courtesy of J. Alex Halderman et al., "Lest We Forget: Cold Boot Attacks on Encryption Keys."
But there are other ways for an attacker to learn what's in your RAM. When you power your computer off, everything in RAM fades into nothingness. But this doesn't happen immediately; it takes a few minutes, and an attacker can make it take even longer by physically freezing the RAM. An attacker with physical access to your powered-on computer can use a screwdriver to open the case of your computer and then use an upside-down can of compressed air to freeze your RAM (as in the image above). Then the attacker can quickly cut the power to your computer, unplug your RAM, plug the RAM into a different computer, and dump all of the data from RAM to a disk. By sifting through that data, they can find a copy of your encryption key, which can then be used to decrypt all of the files on your hard disk. This is called the cold boot attack, and the researchers who first demonstrated it have published a video of it in action.
The key takeaway is that while your encrypted disk is unlocked, disk encryption doesn’t fully protect your data. Because of this, you may consider closing all your work and completely shutting down your computer at the end of the day rather than just suspending it.
It’s also important to make sure your laptop is always physically secure so that only people you trust ever have access to it. You should consider carrying your laptop with you wherever you go, as inconvenient as that may be, if your data is extremely important to you. When traveling, bring it with you in a carry-on bag instead of checking it in your luggage, and carry it with you rather than leaving it in a hotel room. Keep it with a trusted friend or locked in a safe when you can’t babysit it yourself.
This is all to defend against a different type of disk encryption attack known, in somewhat archaic language, as the “evil maid” attack. People often leave their laptops in their hotel room while traveling, and all it takes is one hotel housekeeper/elite hacker to foil your disk encryption.
Even when you use full disk encryption, you normally don’t encrypt 100 percent of your disk. There’s a tiny part of it that remains in plaintext. The program that runs as soon as you power on your computer, which asks you to type in your passphrase and unlocks your encrypted disk, isn’t encrypted itself. An attacker with physical access to your computer could modify that program on the tiny part of your disk that isn’t encrypted to secretly do something malicious, like wait for you to type your passphrase and then install malware in your operating system as soon as you successfully unlock the disk.
Microsoft BitLocker does some cool tricks to make software-based evil maid attacks considerably harder by storing your encryption key in a special tamper-resistant chip in your computer called a Trusted Platform Module, or TPM. It’s designed to release your encryption key only after confirming that your bootloader hasn’t been modified to be malicious, thwarting evil maid attacks. Of course, there are other attacks against TPMs. Last month The Intercept published a document about the CIA’s research into stealing keys from TPMs, with the explicit aim of attacking BitLocker. They have successfully done it, both by monitoring electricity usage of a computer while the TPM is being used and by “measuring electromagnetic signals emanating from the TPM while it remains on the motherboard.”
You can set up your Linux laptop to always boot off of a USB stick that you carry around with you, which also mitigates against evil maid attacks (in this case, 100 percent of your disk actually is encrypted, and you carry the tiny unencrypted part around with you). But attackers with temporary access to your laptop can do more than modify your boot code. They could install a hardware keylogger, for example, that you would have no way of knowing is in your computer.
The important thing about evil maid attacks is that they work by tampering with a computer without the owner’s knowledge, but they still rely on the legitimate user to unlock the encrypted disk. If someone steals your laptop they can’t do an evil maid attack against you. Rather than stealing it, the attacker needs to secretly tamper with it and return it to you without raising your suspicions.
You can try using bleeding-edge tamper-evidence technology, such as glitter nail polish, to detect if someone has tampered with your computer. This is quite difficult to do in practice. If you have reason to believe that someone might have maliciously tampered with your computer, don’t type your passphrase into it.
Defending against these attacks might sound intimidating, but the good news is that most people don’t need to worry about it. It all depends on your threat model, which basically is an assessment of your situation to determine how paranoid you really need to be. Only the most high-risk users need to worry about memory-dumping or evil maid attacks. The rest of you can simply turn on disk encryption and forget about it.
TrueCrypt is popular disk encryption software used by millions of people. In May 2014, the security community went into shock when the software’s anonymous developers shut down the project, replacing the homepage with a warning that “using TrueCrypt is not secure as it may contain unfixed security issues.”
TrueCrypt recently underwent a thorough security audit showing that it doesn’t have any backdoors or major security issues. Despite this, I don’t recommend that people use TrueCrypt simply because it isn’t maintained anymore. As soon as a security bug is discovered in TrueCrypt (all software contains bugs), it will never get fixed. You’re safer using actively developed encryption software.
BitLocker, which is Microsoft’s disk encryption technology, is only included in the Ultimate and Enterprise editions of Windows Vista and Windows 7, and the Enterprise and Pro editions of Windows 8 and 8.1, but not the Home editions, which is what often comes pre-installed on Windows laptops. To see if BitLocker is supported on your version of Windows, open up Windows Explorer, right-click on C drive, and see if you have a “Turn on BitLocker” option (if you see a “Manage BitLocker” option, then congratulations, your disk is already encrypted, though you may want to finish reading this section anyway).
If BitLocker isn’t supported in your version of Windows, you can choose to upgrade to a version of Windows that is supported by buying a license (open Control Panel, System and Security, System, and click “Get more features with a new edition of Windows”). You can also choose to use different full disk encryption software, such as the open source program DiskCryptor.
BitLocker is designed to be used with a Trusted Platform Module, the tamper-resistant chip built in to new PCs that can store your disk encryption key. Because BitLocker keys are stored in the TPM, by default it doesn’t require users to enter a passphrase when booting up. If your computer doesn’t have a TPM (BitLocker will tell you as soon as you try enabling it), it’s possible to use BitLocker without a TPM and to use a passphrase or USB stick instead.
If you only rely on your TPM to protect your encryption key, your disk will get automatically unlocked just by powering on the computer. This means an attacker who steals your computer while it’s fully powered off can simply power it on in order to do a DMA or cold boot attack to extract the key. If you want your disk encryption to be much more secure, in addition to using your TPM you should also set a PIN to unlock your disk or require inserting a USB stick on boot. This is more complicated, but worth it for the extra security.
Whenever you’re ready, try enabling BitLocker on your hard disk by right-clicking on C drive and choosing the “Turn on BitLocker” option. First you’ll be prompted to make a backup of your recovery key, which can be used to unlock your disk in case you ever get locked out.
I recommend that you don’t save a copy of your recovery key to your Microsoft account. If you do, Microsoft — and by extension anyone Microsoft is compelled to share data with, such as law enforcement or intelligence agencies, or anyone who hacks into Microsoft’s servers and can steal its data — will have the ability to unlock your encrypted disk. Instead, you should save your recovery key to a file on another drive or print it. The recovery key can unlock your disk, so it’s important that it doesn’t fall into the wrong hands.
Follow the rest of the simple instructions and reboot your computer. When it boots up again, your disk will begin encrypting. You can continue to work on your computer while it’s encrypting in the background.
Once your disk is done encrypting, the next step is to set a PIN. This requires tweaking some internal Windows settings, but it shouldn't be too hard if you follow the instructions to the letter.
Click Start and type “gpedit.msc” and press enter to open the Local Group Policy Editor. In the pane to the left, navigate to Local Computer Policy > Computer Configuration > Administrative Templates > Windows Components > BitLocker Drive Encryption > Operating System Drives.
In the pane to the right, double-click on “Require additional authentication at startup.” Change it from “Not Configured” to “Enabled,” and click OK. You can close the Local Group Policy Editor.
Now open Windows Explorer, right-click on drive C, and click “Manage BitLocker”.
In the BitLocker Drive Encryption page, click "Change how drive is unlocked at startup." Now you can choose either to require a PIN at startup or to require that you insert a USB flash drive. Both work well, but I suggest using a PIN because it's something you memorize: if you get detained while crossing a border, for example, you can choose not to type your PIN to unlock your drive, but you can't stop border agents from confiscating a USB flash drive and using it to boot your computer.
If you choose to require a PIN, it must be between 4 and 20 digits long. The longer you make it, the more secure it is, but make sure you choose one that you can memorize. It's best if you pick this PIN entirely at random rather than basing it on something in your life, so avoid easily guessable PINs like birthdates of loved ones or phone numbers. Whatever you choose, make sure you don't forget it, because otherwise you'll be locked out of your computer. After entering your PIN twice, click Set PIN.
Now reboot your computer. Before Windows starts booting this time, you should be prompted to type your PIN.
Finally, open User Accounts to see all of the users on your computer, confirm that they all have passwords set, and change them to be stronger if necessary. Disable the guest account if it’s enabled.
FileVault, Apple’s disk encryption technology for Macs, is simple to enable. Open System Preferences, click on the Security & Privacy icon, and switch to the FileVault tab. If you see a button that says “Turn Off FileVault…” then congratulations, your disk is already encrypted. Otherwise, click the lock icon in the bottom left so you can make changes, and click “Turn On FileVault…”
Next you will be asked if you want to store a copy of your disk encryption recovery key in your iCloud account.
I recommend that you don’t allow your iCloud account to unlock your disk. If you do, Apple — and by extension anyone Apple is compelled to share data with, such as law enforcement or intelligence agencies, or anyone who hacks into Apple’s servers and can steal its data — will have the ability to unlock your encrypted disk. If you do store your recovery key in your iCloud account, Apple encrypts it using your answers to a series of secret questions as an encryption key itself, offering little real security.
Instead, choose “Create a recovery key and do not use my iCloud account” and click Continue. The next window will show you your recovery key, which is 24 random letters and numbers. You can write this down if you wish. The recovery key can unlock your disk, so it’s important that it doesn’t fall into the wrong hands.
Once you click Continue you will be prompted to reboot your computer. After rebooting, FileVault will begin encrypting your hard disk. You can continue to work on your computer while it’s encrypting in the background.
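If you're comfortable in the Terminal, recent versions of OS X also include a command-line tool called fdesetup that can check on or enable FileVault. A quick sketch (it needs an administrator account):

    # Report whether FileVault is currently on, off, or still encrypting.
    sudo fdesetup status

    # Turn FileVault on; it prompts for which users may unlock the disk and
    # prints a personal recovery key that you should protect like a password.
    sudo fdesetup enable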
With FileVault, Mac OS X user passwords double as passphrases to unlock your encrypted disk. If you want your passphrase to survive guessing attempts by even the most well-funded spy agencies in the world, use a high-entropy passphrase (like the seven-word diceware-style example earlier in this article) as your Mac login password.
Go back to System Preferences and this time click on the Users & Groups icon. From there you should disable the guest account, remove any users that you don’t use, and update any weak passwords to be strong passphrases.
Unlike in Windows and Mac OS X, Linux distributions generally only offer to encrypt your disk when you first install the operating system; retrofitting encryption onto an existing installation is possible but considerably more involved. If you already have Linux installed without disk encryption, the simplest route is to back up your data and reinstall. While there's a huge variety of Linux distributions, I'm going to use Ubuntu as an example, but setting up disk encryption in all major distributions is similar.
Start by booting to your Ubuntu DVD or USB stick and follow the simple instructions to install Ubuntu. When you get to the “Installation type” page, check the box “Encrypt the new Ubuntu installation for security,” and then click Install Now.
On the next page, "Choose a security key," you must type your encryption passphrase. You'll have to type this each time you power on your computer to unlock your encrypted disk. Again, if you want your passphrase to survive guessing attempts by even the most well-funded spy agencies, use a high-entropy, diceware-style passphrase like the example earlier in this article.
Then click Install Now, and follow the rest of the instructions until you get to the "Who are you?" page. Make sure to choose a strong password — if someone steals your laptop while it's suspended, this password is all that stands between the attacker and your data. And make sure that "Require my password to log in" is checked, and that "Log in automatically" is not checked. There is no reason to check "Encrypt my home folder" here, because you're already encrypting your entire disk.
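After the installer finishes and you've rebooted, you can double-check that the encryption really took. This is only a sketch; /dev/sda5 is a placeholder for whatever partition the installer actually created:

    # The partition holding your system should show the filesystem type "crypto_LUKS".
    lsblk -f

    # Dump the LUKS header to see the cipher in use and which passphrase slots are occupied.
    sudo cryptsetup luksDump /dev/sda5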
And that’s it.
Correction: April 27, 2015
This post originally gave an incorrect date for when the TrueCrypt project was shut down.
Correction: April 29, 2015
This post originally said that USB ports have direct memory access (DMA), but this isn’t true. FireWire, ExpressCard, Thunderbolt, PCI, and PCI Express all have DMA.
Correction: May 1, 2015
This post originally said that BitLocker was included in Windows Vista and Windows 7 Pro editions, but it is only included in Ultimate and Enterprise editions for those versions of Windows.
I use a Chromebook, with obviously Chrome OS. Any advice for it?
Veracrypt I say.
Also there are solutions against cold boot attacks nowadays.
nice one. i finally got around to reading it and it’s about as thorough as i hoped (every time i thought to add something you addressed it a few lines later).
one thing i will add if no one has already: that passware “solution” is $995. every time you see a program that promises safety and costs a shit ton of money, do a fast search on torrent sites. passware for example gives back 14 results that are currently seeded. so for $995 you get a “safety solution” that can be cracked by (i’m assuming) some teenager in russia or china or ???. if the underlying code isn’t secure how can its “protection” be such?
as you said, it comes down to human error or lack thereof. or, as i’ve heard in various IT positions, “PICNIC”. problem in chair, not in computer.
As a technical ignoramus I have three technical questions about TPM chips as currently shipped:
1) Apparently they contain one or more encryption keys permanently burned in. The chips are designed to lock out inspection of these keys, but of course anyone in the chain of custody up to the printing of the chips would have access and potentially copies. Unless you think Gemalto was a fluke, this means that the Chinese, Israeli, and US spy agencies have copies. Quite possibly, organized crime get their hands on copies too. What would the consequences be for the end user?
2) Apparently they contain a random number generator that's supposed to produce pseudorandom numbers that are random enough for civilian use. A few years ago the NSA twisted some arms behind the scenes at Research in Motion to make their random number generators slightly less random, so it wouldn't be surprising if the NSA has been twisting arms elsewhere as well. Since the TPM is designed to take an enormous amount of resources to inspect, and there's no way of verifying whether numbers are truly random, are there any software developers out there that use that part of the TPM nevertheless?
3) Can the encryption module on the TPM be gamed to leak sensitive information? A lot of people with infinitely more computer security knowledge than I have, have been saying that leaking sensitive information about the end user is part of the whole point of pushing Trusted Computing.
When a stranger gives you a gift, and you aren’t allowed to inspect the contents of that gift, the probability of that gift being a Trojan Horse increases asymptotically towards 1 as time passes.
The fact that the TPM is secured against the end user is great if your only worry is that the kids two houses down from you will steal your computer and drain your bank account using identity fraud. But that shouldn’t be your only worry. Centralized control of everyone’s private data is a huge danger because governments and corporations are fallible and prone to making mistakes.
About the Trusted Platform Module: When I read up about the “Trusted Computing” concept this evening, I got the feeling we’re standing on the edge of a terrifying precipice. Here are a few links:
https://en.wikipedia.org/wiki/Trusted_Computing
https://en.wikipedia.org/wiki/Trusted_Platform_Module
https://www.gnu.org/philosophy/can-you-trust.html
http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html
The printing press made large-scale democracy viable; Centralized control of all electronic documents will undo that, and create a real-life “memory hole”. Remember Kindle remotely erasing copies of George Orwell’s 1984?
If I may…: Buy up as many varied, cheap, used machines in person (laptops are portable) from before the *bridge era (not advocating Intel over AMD, just generally speaking… avoid anything UEFI/etc. for sure). Let's say the mid-2000s, which should be just enough power, but probably cap that at about 2007… No FireWire, Thunderbolt, PCMCIA. USB is fine (and upgradeable to 3.0 if need be). Make sure stuff like graphics etc. are not onboard. Take out and replace the RAM. Take out any BT and wifi boards. Grab and flash all BIOSes. And hold on to them. Thrift stores used to have used older desktops for under $50, no idea the going rate now.
Thank me in a few years when you will need these the most.
No. I’m definitely not going to do that.
First of all, I don't really need that kind of security since I'm not playing tag with major international spy/intelligence organizations, and if I were, I would use the "Harry Never Holds" security model, aka Hot Potato. The thing here is that they are trying to create a "Digital Fortress," which isn't going to work. There is no such thing as perfect security. The real question is whether it is secure enough for what you need and for AS LONG AS you need. There is a timeliness factor in intelligence work.
Open Source Security is the way to go. It spreads the cost around. Another thing is that encryption is by nature a defensive technology. This really ought to come under the defense budgets of the international community. Maybe we need a BRICS Open Source Security Initiative. The Russians could propose it to China, India, and Brazil.
“I don’t really need that kind of security since I’m not playing tag with major international spy/intelligence organizations”
Oh my. All of us – seven billion and counting – are playing tag with state agencies and corporations, whether we realize it or not. There isn’t enough space here to explain all the reasons why the “if you have nothing to hide, you have nothing to fear” argument doesn’t hold water.
"The real question is whether it is secure enough for what you need and for AS LONG AS you need."
In my case, the most sensitive items are my personal contacts, my work-related documents, my financial records, and my health records – all things I need access to fairly frequently. It’s probably a good idea to encrypt any device that might get stolen, but it’s important to remember that encryption only gives very limited protection against anyone who has significant resources.
There are a couple of other things that most people have on their electronic devices but do not realize are highly sensitive if those data end up in the wrong hands, but I won’t discuss that here. Whole-disc encryption gives better security than encryption of specific files anyway, since encrypting single files or folders merely marks them as being interesting to an intruder.
Your life. Hardware and firmware (and oh, don’t get me started on baseband) is how things will be and are getting to be owned.
Have YOU found open-source baseband? firmware? Can you build your own processors? mobos to put them on? Can you write your own firmware or baseband that’ll work on a network? Code your own OSes that’ll boot on hardware that now (like some of the latest hardware) has a 64bit processor but a 32 bit bootloader?
Who knows. Maybe one day you will have secrets. If not, then you are either very unfortunate, very boring, or not human.
The fbi is the most dangerous, murderous and sinister group of thugs in usa. My experiences:
http://austin.indymedia.org/article/2015/04/29/fbi-psychological-operations-psyops-fbi-against-me
Ad “But unfortunately, laptops have ports that have direct memory access, or DMA, including FireWire, USB, and others.”:
As far as I know there is no DMA (direct memory access) for USB, e.g. see:
http://en.wikipedia.org/wiki/DMA_attack
http://security.stackexchange.com/questions/72231/usb-otg-dma-attack-vector-any-information-about-that
So you don't have to destroy your USB ports, and you are not prone to a DMA attack on a common Windows notebook (if it doesn't have other vulnerable ports).
You’re right, someone else pointed this out as well and I confirmed and posted a correction.
BitLocker is not included in the Professional edition of Windows 7, only Ultimate and Enterprise.
But Microsoft source code is not really closed source. It is made available to qualified customers, enterprises and GOVERNMENTS (read Five-Eyes)!!
https://www.microsoft.com/en-us/sharedsource/
What this means is that while you and I cannot audit the code, the NSA and GCHQ can review the code for vulnerabilities!
Simplified timeline:
1. TrueCrypt developed and released by people unknown.
2. TrueCrypt widely used and trusted around the world.
3. TrueCrypt talked about in popular press and promoted as a security tool.
4. Development continues. Developers toil for free.
5. Articles are posted claiming that law enforcement could not break TrueCrypt and that this frustrated them.
6. TrueCrypt widely used and trusted around the world.
7. Consulting company decides to raise money to audit TrueCrypt because it's not known if it's safe. (Allegedly a 3rd party proposes crowdsourcing the audit and has a consulting company that is willing to help.)
8. Consulting company raises lots of money.
9. Popular press warns of possible security problems with TrueCrypt pending investigation
10. Presumed devs threw in the towel, said the code was compromised, and were never heard from again.
11. Consulting company says everything is good. (A couple of bugs.)
12. TrueCrypt no longer being developed.
13. TrueCrypt no longer trusted. (?) Less widely used.
14. No trusted replacement identified.
15. Popular press encourages use of Windows BitLocker (which is only available on some versions of 7 and 8). Microsoft BitLocker is closed source, from a US company alleged by some to work with the NSA occasionally. Or OS X FileVault, also closed source, from an American company.
16. Law enforcement feels happier. (?)
I am surprised that The Intercept would hold the opinion that unless a system is definitely proven to be compromised, the public at large should not question the system. Especially using CIA-seeded terms [1] like "conspiracy theory" to shame readers who question US companies known to work with the NSA on occasion.
[1] http://www.zerohedge.com/news/2015-02-23/1967-he-cia-created-phrase-conspiracy-theorists-and-ways-attack-anyone-who-challenge
Oh but you left so much out, especially at around #10.
Your point is well made about BitLocker. Why would anyone assume that they did not receive a National Security Letter? On the other hand, if we assume that someone involved with TrueCrypt received such a letter, we would still have the source code to cross reference.
This sort of morphed into a TrueCrypt thread. The real problem with TrueCrypt or whatever replaces it is that there is no support for real Open Source Security software. What we need is a political solution. If corporations won’t step up, maybe a few countries might.
Several smaller countries, along with China, Russia, Venezuela, and Vietnam, could fund a decent software project out of their defense spending. Give the Open Source Security movement an annual development budget of $30 to $40 million (between $1 and $5 million from each country) and the entire world could have cyber security. Each country could then "roll their own" security solution.
From a developer perspective TrueCrypt is a nice starting place because it is cross-platform. Writing software is tricky enough; making it work cross-platform and internationalizing it takes a lot of developer time and effort.
Oh I have nothing against you, but that is so dangerously, dangerously, DANGEROUSLY naive.
The info on Linux is wrong; you can encrypt post-installation, even with GUI tools.
How? Do you have a link to a blog post explaining this or anything?
And here’s Bruce Schneier last month on Bitlocker:
“But who knows? We do know that the FBI pressured Microsoft to add a backdoor to BitLocker in 2005. I believe that was unsuccessful. More than that, we don’t know.”
https://www.schneier.com/blog/archives/2015/03/can_the_nsa_bre_1.html
for background on the FBI story:
http://mashable.com/2013/09/11/fbi-microsoft-bitlocker-backdoor/
Of course, this sort of thing is really more the NSA’s, not the FBI’s, turf:
http://www.theguardian.com/world/2013/sep/05/nsa-gchq-encryption-codes-security
For what it’s worth, Bruce Schneier on the recent security audit of Truecrypt:
“Nothing that would make me not use the program”
https://www.schneier.com/blog/archives/2015/04/truecrypt_secur.html
It’s worth noting that Chromebooks automatically have disk encryption enabled but are also set to automatically back up all your data in Google’s cloud. You’d need to use only offline apps and turn off all cloud backups to keep Google’s hands off your data.
There's no good reason to run Chrome OS on a Chromebook. Use Linux. There are a bunch of full-featured lightweight distros sporting ease of use, and a number of well-regarded blogs provide instructions on how to do so.
Peanut, you have gainsaid mchuge’s assertion, (although in truth you meant to say “even better”) – now it would be helpful if you outlined the automatic disk encryption you imply is built in to (admittedly fairly secure) Linux, and outlined its strengths. This is news to some of us.
I didn't mean "even better", just like I wouldn't with Android. He seems smart enough to be able to google what I mentioned. Most people could also just try out a live boot OS. My other comments have gone into Linux. But if you or he or anyone else wish for a step by step, I can dig up relevant links directly or partition off some time to write an exhaustive step by step.
Yes, easy enough to boot a chromebook into Linux. If you stick with the chrome OS, as long as google has your gmail password, they have your disk encryption key. You can change it offline and use google apps offline if you need to. The next time you go online, you’d need to jump through a couple hoops to get gmail / google to recognize you.
Ecosystems that consider you the product, not the customer, and build those services into the OS, as well as any ecosystem using any sort of cloud (looking at you now, Adobe and Office), are inherently insecure. Get away from Google apps baked in. If you must use Google there are other ways.
btw this is one reason I said to be wary of ubuntu.
"BitLocker, which is Microsoft's disk encryption technology, is only included in the Ultimate, Enterprise, and Pro versions of Windows Vista, 7, 8, and 8.1"
WRONG. Bitlocker is not included on Windows 7 Pro. Enterprise only.
I think it's far better to use Linux with LUKS for general use and everyday activities, and to use a Tails live USB drive for when you have to get serious.
Thanks for the info.
I look forward to the day when we’re able to counter attack. Something along the lines of possessing the mouse, phone, any nearby electrical or even inanimate object and then smack the shit out of whoever is responsible. (bonus points for attaching the first 20 seconds or so of Beethoven’s Ninth to that reply)
Magic 8 Ball suggests prison as the outcome. Not fuzzy. Better to not think of asking again later. Protest, physically, peaceably, and en masse now instead.
Please don’t say “you don’t have a right to privacy while crossing the border”. You always have that right, as it is an unalienable human right. You don’t stop being a human being at a border, even though our dehumanized governments would certainly find that a convenient proposition. You don’t stop being a human being when they don’t treat you like one. Your right exists even in circumstances where it is not being recognized by your government. Philosophically though not practically speaking, that’s a whole different matter altogether.
Maybe this is a cultural misunderstanding? I’m from Europe.
you don’t have a right to privacy in the eyes of the government, not as a human being.
This article is dangerous disinformation and needs a rewrite sorry. I would expect better from The Intercept.
1) If you’re using Windows or Mac, you’re wasting your time with their default closed source encryption if you’re trying to defend against state level agencies e.g. the Five Eyes spy apparatus. The code is unauditable and you don’t know what it’s really doing.
2) The article dissuades use of an audited encryption program where the audits only uncovered fairly minor issues in the grand scheme of things. Sure new issues won’t get fixed. However if other projects (VeraCrypt, Ciphershed etc) take the audited TrueCrypt 7.1a code, improve that in a sensible manner to fix the issues and have the code re-audited then this is perfectly fine. It’s definitely orders of magnitude better than using proprietary, closed source offerings.
3) Cascade encryption. TrueCrypt is the only program offering this. A few months ago there was a slide in Der Spiegel about how the NSA has internal-only cryptanalytic techniques against block ciphers, mentioning AES specifically. Anyone using a single cipher to protect their data from the NSA is a naïve fool. Also anyone relying on NIST cryptography standards. The whole point of NIST cryptography competitions is to pit academia's best cryptanalysts against the NSA's. Then at the NSA they compare the results of the algorithm's public and internal cryptanalysis. Then the NSA chooses the winning cipher to be an algorithm which appears strong to the public, but that the NSA has secret advanced attacks against. The public then unwittingly adopts the weak standard and uses it for everything. Enigma was still thought to be secure 40 years after WW2 ended, just like AES is thought to be still secure today. Cascaded stream ciphers however protect against breaks in either algorithm. The extra layer of encryption removes the chance of known or chosen plaintext attacks because the next layer is effectively a random bitstream, thus making cryptanalysis practically impossible.
Micah, you should have gotten three people to do these – one each of the best people in the privacy and crypto field who use these OSes and know their internals and crypto implementation. There aren't many out here but most would be happy to write this sort of article. Then maybe realise OSX and Win can never be secure… but at least you have people who know the OS for real. IIRC you are a mac user, right? But you didn't even mention the alternatives to ANY of the built in FDE for even OSX.
Many are the ways that you can do just about anything in Linux — including encryption. You most /certainly/ can encrypt an already installed system.
Even if you do, you're blundering if you don't initialize it by filling the drive with random data first. Live boot. Departition completely via fdisk or cfdisk. Cat /dev/urandom into the raw device until it exhausts (can take up to a day on larger disks). Use live boot to install.
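In concrete terms, the random-fill step might look like this; /dev/sdX is a placeholder, and getting the device name wrong will destroy the wrong disk:

    # Identify the target disk first; triple-check the name before continuing.
    lsblk

    # Overwrite the entire raw device with random data (this can take many hours).
    sudo dd if=/dev/urandom of=/dev/sdX bs=1M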
No mention of having a firmware password. It's critical. This will prevent a live OS from booting on your machine. If you take measures to render your drives useless when removed, i.e. FDE (or thermite, haha, only sort of joking), a firmware password will make things incredibly difficult for an attacker.
Thanks for recommending backdoored operating systems and software.
1. There’s no evidence that BitLocker or FileVault are backdoored, just speculation and conspiracy theories. If evidence of a backdoor existed, I certainly would mention that in this article.
2. I’m not recommending any operating systems. I think people should use free software operating systems because the more people use them the better they’ll get, and I believe that free software is incredibly important to promote. But most people use Windows and OS X, and it would be silly to tell them they have to switch to Linux if they want disk encryption when disk encryption is built-in to those OSes.
As @thegrugq puts it: “We can secure the things people actually do, or we can tell them to do things differently. Only one of these has any chance of working”
hey, why didn't you recommend Veracrypt?
I would never use microsoft’s bitlocker for various reasons.
before Snowden, saying that “skype and facebook are in bed with NSA” was a “conspiracy theory” too.
anyway, thx for the article! except for recommending BitLocker, lots of useful insight ;-)
Micah, that's disingenuous. Few examples of well-implemented crypto exist and no, repeat NO closed-source crypto should EVER be trusted. While even open source crypto (eg dmcrypt, luks, tc, etc) can have flaws or bugs, the longer things exist the more eyes see it, and the less likely it has gaping flaws, until you have amateurs working on it (one reason I would never trust the tc replacements, cryptocat, etc) or people deliberately inserting tricky code that would pass the scrutiny of most people (including crypto people) who actually have 'AClue' ™.
Now take away any of those and you have situations where they don’t even need to be clever. In a paranoid country verging on a police state, via companies well known to have either cooperated or been coopted.
No, the illusion of security is NOT better just because it is accessible.
Oh and you forgot to talk about /boot in linux too (fwiw).
It is good people are having attention drawn to encryption… and frankly there is indeed a case to be made for even likely compromised encryption (depends on the threat actor but that sets a horrific precedent… the only possible temporary upside is that it just makes the use of encryption itself stand out less if everyone does it (tho it makes it that much more likely to be attacked, insinuated into, and backdoored too, so yeah… not good either)).
But then… you advocated Ubuntu not even something slimmer like Mint, and Ubuntu wants to be a Real Boy, so problems that way lie also. Korora and a thumbdrive for boot… and never leave either out of your sight… and encrypt your data separately, on a different partition (maybe external media), with keys, and a different passphrase. It is a lot easier than it sounds. But you know that Micah.
“The less likely it [the open source code] has gaping flaws”, absolutely true – but the more likely that someone with a large budget or too much time can and has used the published code to find at least one of those flaws – since they do have the code, after all. As a general law, the larger the open source code (and less stable), the more certain that errors rest where no, or almost no, eyes have been. Therefore, as the years pass, it becomes more certain that state actors can defeat open source projects at will, with a sufficiently massive budget. We could call that Nom’s Law. Closed source is more likely to have flaws, but not as vulnerable to deep pockets or those with too much time on their hands. Idiosyncrasy has its own rewards. What often defeats closed source isn’t that it couldn’t be superior, but that others can’t see you cut corners, so you’re more likely to cut corners and simply cheat the customer of the competence they think they’re paying for.
I would argue that while you are correct vis sloppiness and the other perils of closed source…
… (and I hinted at this; as suspected I didn't get a reply from Micah Lee and did not want to get exhaustive unless a dialogue were established (though also, what would be the point? one has to start somewhere and at some point things get too crunchy and people tune out – I may have already succeeded at that – sorry, long digression)…
you seem to be implying the Big They would require fuzzing and fancy attacks on a blackbox system when the
oops, continuing… when the more likely scenario is the Big They would be able to get copies of the 'closed source' crypto through either the front door, the back door, or a side channel (a hacked employee, a former employee, whatever… point is you can generally assume, at least in the US's case, they've got all of the major providers' source code). Even, and this beggars belief, if they didn't, they have thousands of employees and millions++ in funding (++(++)) and inarguably the vast majority of higher maths grads in the country who either started there or stayed there (and by there I mean No Such).
I am very picky about my open source. Anybody should be. But honestly few people have the technical ability or even the time to audit everything. This is why you find a trusted product in this arena, stick with it, and heavily dig into code diffs. Rolling release cycles are a horrible thing for crypto projects, especially ones involving a large number of "volunteers" with varying levels of qualification… just wanting to help doesn't make someone capable of helping, people can have their boxes hacked and commits pushed surreptitiously with a mangled regular expression injected, and some may be involved to DELIBERATELY insert a 'very tricky bug' that even most pros wouldn't catch. The same problems exist on the closed side too. So really it's a choice between A and B, or only B, in the set of potential problems.
btw state actors already have feet in the door of most if not all worthwhile open source projects… but you… you I suspect either know this too, or at a minimum suspect it.
Anyone trusting BitLocker is a spoon short of a full canteen. It's not open source and Microsoft is based in the US.
The beauty of TrueCrypt is that the encrypted container file can be hidden from view. Only the user needs to know it exists and where it is stored on the hard drive. With a TrueCrypt file hidden, a laptop owner can cross the border without raising undue suspicion because the computer itself is not encrypted. Border security could clone the whole drive, but a sector-by-sector scan will reveal a section of the drive with encrypted nonsense. There is no way of knowing the true content unless you know the file name and the password with which to mount the hidden file as a virtual drive. Even if TrueCrypt is not supported and there is the potential threat of yet-to-be-developed exploits or hacks, if you do not know the file exists, you can't hack it.
Better yet, store your encrypted data in a TrueCrypt file on a micro SD card. A 32GB or 64GB device the size of my baby fingernail is cheap and easy to hide or mail. There is no reason why you could not email yourself the encrypted file and send it across the border electronically, or simply store it in the cloud and retrieve it when needed.
Technically, while somewhat true, that's not completely true. Which is why hidden containers are often best kept small and additionally used with stego (steganography). I have no relation to this site, but this link (or EnCase, but bah, commercialism, etc) gives some hints as to how it might be performed: http://www.brimorlabsblog.com/2014/01/identifying-truecrypt-volumes-for-fun.html … That said stego has problems too, especially if there's too much noise in a legit file. Smattering them through a TB of audio files, though, would infuriate people, especially if you systematically do it with all of them but only a few have real data. :)
This is valuable information, peanuts, but doesn’t contradict the above advice re truecrypt on HDs that are themselves fully encrypted. UNLESS such encryption leaves directories unencrypted, and if true, this would be worthy news to report. (But all this also falls under the article’s caveat that this encryption is aimed at proximity attacks only NOT remote attacks.)
Steg is often considered “one precaution too many” since it alters the photos, etc in ways that are too-easily detected (for randomization in low digits for example) with inexpensive checks, defeating the purpose and betraying the existence of encrypted information. It is sufficiently poorly regarded that it may encourage an attacker to spend more time and effort attacking you, on the theory that you may not know what you’re doing, only think you know what you’re doing.
Steg shouldn't be done in photos anyway, and yes stego has some flaws. My point too was to mention this is a problem after your entire drive has borne scrutiny by a large federal agency. The TSA doesn't get that far (yet) so it is reasonable to use audio files for this use case. Not every use case. My point was that using hidden containers is, to some actors, an even larger signal of 'insidious intent' if noticed. They aren't a magic bullet. If you're interesting, so to speak, you need to expect a high level of scrutiny. Using hidden containers can be useful but an advanced analysis will be more likely to catch you out in a lie if you do it wrong ™.
Re last paragraph, thumb drive: if you wish an extra layer of security, put only a large random one-time pad on the thumb drive, and only if that gets through unexamined, use that one-time pad as the final decrypt (simple XOR) for a file sent by email or cloud whose final encryption was a (previous) XOR of that same one-time pad. You will be able to tell customs that the file on your thumb drive is just random numbers and has no password and can't be decrypted, because that's so. If it doesn't get through unexamined, send another or rely on the cloud sans the extra layer of encryption.
gpg conventional encryption and dual sets of strong, randomly generated, long OTPs are among the things I suggest to people who have the opportunity to meet in person before corresponding by email.
Either you trust your platform vendor — whoever they are — to make a good, reliable, safe platform that is resilient to attack, or you don’t. Hair-splitting on things like “Well, I don’t trust their disk encryption, but I do trust their browser not to have an exploitable use-after-free bug and a sandbox escape” is silly, pointless, and not really something that a non-super-expert can really determine.
The other thing is, what featureset does each solution support? There’s disk encryption (confidentiality), both partial and full (e.g. encryption of data but not metadata at the filesystem layer, vs. encryption at the block device layer) — but that’s not the same as data integrity (to which Micah alludes when mentioning the evil maid attack and in BitLocker’s support for secure boot).
We really need full confidentiality and secure boot. (As well as a host of other platform security features: https://noncombatant.org/2015/01/02/platform-security-features/)
Finally, don’t be so sure that open source is more auditable than closed source. Consider e.g. Zvi Gutterman’s work:
https://eprint.iacr.org/2006/086.pdf
“””Although the generator is part of an open source project, its source code (about 2500 lines of code) is poorly documented, and patched with hundreds of code patches. We used dynamic and static reverse engineering to learn the operation of this generator. …”””
https://eprint.iacr.org/2007/419.pdf
“””We examined the binary code of a distribution of Windows 2000, which is still the second most popular operating system after Windows XP. (This investigation was done without any help from Microsoft.) We reconstructed, for the first time, the algorithm used by the pseudorandom number generator (namely, the function CryptGenRandom). We analyzed the security of the algorithm and found a non-trivial attack: …”””
Reverse engineering can even get you human-readable documentation:
http://blog.zynamics.com/2011/01/21/recovering-uml-diagrams-from-binaries-using-rtti-inheritance-as-poset/
Conspiracy theories are fun, but the facts on the ground — even to the limited extent we know them — are wayyyy more complicated than “open good, closed bad, the end.”
Chris, small point, not really disagreement: “Either you trust your platform vendor — whoever they are — to make a good, reliable, safe platform that is resilient to attack, or you don’t. Hair-splitting on things like “Well, I don’t trust their disk encryption, but I do trust their browser not to have an exploitable use-after-free bug and a sandbox escape” is silly, pointless, and not really something that a non-super-expert can really determine.” That’s exactly what I said while advising gmail-to-gmail correspondence as a best-effort, before Snowden (actually, before hearing of a different gmail flaw it would be better not to discuss here, somewhat before Snowden.) Stupid me. But your point remains true for nearly all of us.
Thanks for your info and links. Good, concise stuff.
Why doesn't someone sell a laptop with an *ERASE* button? You know, something like a nuclear launch button that you can't hit by mistake, but when actuated it will totally erase your HDD forever, in seconds. A degaussing coil built into the HDD would serve this purpose nicely.
There is obviously a market for this. Also a tamper proof machine that would self destruct (erase) if it detected attempts to gain illegal access.
And like others I don’t think anything offered by Redmond can be considered ‘safe’. For sure though, this will protect you from punks and county cops.
Maybe when drives were like 20MB and it was the early ’90s. But even if you nuke the MBR, things can still be recovered… and have you ever done a DoD-level wipe on even a 20GB drive? A drive larger than a terabyte or so can take days. Encryption helps a bit, but most people keep emergency copies of the header, since data on a truly, properly encrypted drive should become unreadable if the header is corrupted. Most people would cough those copies up. And even if not, cold boot attacks have nothing to do with your hard drive.
“Why doesn’t someone sell a laptop with an *ERASE* button?”
Because it would be suicidal for anyone to have such a toy on their person.
For years already, US border protection officers have been trained to be on the lookout for exactly that trick.
And in general, law enforcement is under orders to view the destruction of evidence as compelling evidence of criminal activity.
There are self-destruct hard drives you can buy that do this but they are not cheap.
some of us may have modded our bootup sequences to have more than one possible passphrase trigger…. not me of course, just saying.
Thanks for the great article. I’ve been using LUKS for a while now, has restored some sanity to my life.
I wish you had discussed more how to create a bootable USB which can decrypt and boot the LUKS drive (evil maid prevention); that sounds like the next step for my rig.
Greetings Citizen,
This assumes you know how to access the BIOS and switch your computer to boot from USB.
When installing any distro, there is an option to partition manually; choose that option and place your /boot partition on a removable USB stick. After the installation, plug in the USB that contains /boot, enter the passphrase, and on you go (see the sketch below).
Just always remember where that USB is.
Friendly Greetings,
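For the curious, a rough sketch of what moving /boot onto a stick looks like on an already-installed Debian/Ubuntu-style system with GRUB, assuming /boot is already its own small unencrypted partition (as a standard LUKS install creates) and the stick shows up as /dev/sdb; double-check your device names with lsblk before copying anything:
  sudo mkfs.ext4 /dev/sdb1                 # format the stick's first partition (destroys its contents)
  sudo mount /dev/sdb1 /mnt
  sudo cp -a /boot/. /mnt/                 # copy the kernel, initramfs and GRUB files onto the stick
  sudo blkid /dev/sdb1                     # note the stick's UUID
  # edit /etc/fstab so /boot points at the stick, e.g.:
  #   UUID=<stick-uuid>  /boot  ext4  defaults  0  2
  sudo umount /mnt
  sudo umount /boot && sudo mount /boot    # remount /boot from the stick per the new fstab
  sudo grub-install /dev/sdb               # put the bootloader on the stick itself
  sudo update-grub
  # optionally wipe the old internal /boot partition so nothing bootable remains on the laptop
After that the machine only boots with the stick plugged in; keep a second copy of the stick somewhere safe, because losing it means rebuilding /boot from a live system.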
Worthwhile. However, note that you aren’t *necessarily* protecting yourself from a government this way, esp if you bought your computer from amazon, because you may have a very *special* bios.
s/from amazon/online or via mail order, or in any way not bought randomly in person from a store with a pseudonymous prepaid gift card or cash/
though apparently pressure cooker purchase footage is kept, like, forever. but point is if you are interesting, get it in person.
I saw 3 things (one of which was this article) that made me think today:
1) John Oliver was making fun of Dr Oz’s misinterpretation of the 1st amendment on Last Week Tonight. John Oliver pointed out that the 1st amendment means that the government cannot censor your speech.
2) Glenn Greenwald put out this tweet:
From the administration that has prosecuted more sources for journalism than all prior administrations combined
Glenn Greenwald added,
Susan Rice @AmbassadorRice
This Administration believes deeply in the importance of free and independent global media. Will continue to support. #FreethePress
3) I read this article talking about the steps you have to take to secure your laptop. And even with a degree in computer science and a certification in computer security, I find these steps nearly impossible to do correctly.
But here is the point that I want to make. Let’s say you are working on a book that criticizes the government, and the book is unpublished and resides on your laptop, and people you interact with start repeating back (demonstrative surveillance) the contents of your unpublished work-in-progress book? Can you draw any conclusion other than the 1st amendment being completely dead? Why would The Intercept turn a blind eye to this? Why would the US courts turn a blind eye to this? Susan Rice is so completely full of shit it isn’t even funny. Yes, the government can escape culpability by falling back on the defense of plausible deniability, but that isn’t good enough when you are destroying the 1st amendment of the constitution.
And when I say these steps are nearly impossible to do correctly, consider a 4th story that was in the news today: The White House email was hacked into by the Russians and they read all of the email including President Obama’s email. If that system can’t be secured, how is a private citizen supposed to effectively guard the contents of their laptop with 8000 different ways to hack into it possibly including back doors that may have been added by companies at the behest of the government?
No, the White House SAID ‘Russian Hackers’ did something that may or may not have happened by people who may or may not have anything to do with it, because they realllllly are pushing Russophobia/xenophobia right now. Just saying. Why are you believing anyone’s attributions anyway? For that matter they can do it themselves, using ‘owned machines’ via proxies (proxy corps using proxy servers, often unbeknownst to the machines’ owners) and create all the attribution they want.
That is not scifi. That is reality now. Be cynical when it comes to attribution.
If you actually have a degree in CS and a security cert but can’t follow these steps, you ought to sue the university and whatever certifying organization for a full refund of any tuition and/or fees. Seriously. Nothing that was outlined in this article is even particularly advanced. Things like BitLocker/FileVault and whole disk encryption are run-of-the-mill stuff for any reasonably trained or experienced user who has bothered to look and learn.
It’s not that I can’t turn on Bitlocker. It’s that Bitlocker won’t protect you.
You state: “Nothing that was outlined in this article is even particularly advanced”
But you are missing my point:
The article talks about keeping the laptop in your possession and with you everywhere you go. I have also seen advice from Snowden / Greenwald about never connecting the computer to a network. These are not simple steps to follow – which is why I say that it is nearly impossible.
The articles in the media seem to suggest that if manufacturers put in end-to-end encryption it will stop mass surveillance. If this type of encryption were standard on everyone’s computers and communications, that would probably help protect people against mass surveillance. But it won’t help if you have already been targeted. At that point you would have to take more severe steps that are not easy to follow, like having the physical hardware in your possession 24/7, as the article points out. This is nearly impossible for a private citizen with a life to do. The White House couldn’t even protect itself?!?
Furthermore, taking steps to evade surveillance actually seem to invite more surveillance. There were stories about how the government was searching for people trying to use TOR. So unless everyone adopts it, you may actually be making matters worse by flagging yourself as someone who needs surveillance. And the steps that you have taken to evade surveillance probably didn’t work.
While nowhere near as good, you’d do yourself at least a slight favour by removing the on-board wifi/BT chip and getting USB replacements that are only connected when in use and are MAC-spoofable (see the sketch below). But really you are confusing things that should never be online with things that can sometimes be online – and you shouldn’t intermingle storage drives between them either. You’d also do yourself a favour by live-booting if you absolutely cannot airgap a machine completely (with said on-board chip removed and no network drivers installed). But really, having multiple devices in a world of sub-$200 linuxable devices is a better option.
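A minimal sketch of the MAC-spoofing part on Linux, assuming the USB adapter shows up as wlan0 (check with ip link; the address shown is just an example):
  sudo ip link set wlan0 down
  sudo ip link set wlan0 address 02:12:34:56:78:9a     # any locally administered address (02:...) will do
  sudo ip link set wlan0 up
  # or, if the macchanger package is installed, let it pick a random address:
  sudo macchanger -r wlan0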
Gosh, it’s funny that it’s so hard to buy used laptops from pre-wifi days now! Wonder why that is :)
Tiny stick computers are a fast option, not all bad. They’d have to hit your monitor with a proximity attack.
Vs Person – your aim, here, shouldn’t be to protect yourself against your government, for all the reasons you give. Your primary aim should be to protect yourself against remote attacks from random actors and foreign governments who are using large nets, bottom trawling – that will be increasingly common as time goes on (thanks again, NSA and Five Eyes). Even a clumsy airgap raises the cost to play enough to keep you free of nearly all of those risks. It isn’t immediately obvious that there’s another computer (though your patterns of copying will show that if someone goes to a lot of trouble), and soon every citizen will have at least one airgapped computer, if only so they don’t have to pay a blackmailer to get all their baby photos back (mere external storage isn’t a safeguard even from that, and your cloud password isn’t safe as things stand, either).
Oh yeah.
Don’t get me started on the new chip families, nomentanus… It’ll devolve into a ragefest. :/
But let’s thank the proper party for the Russians being able to read Obama’s unclassified emails, banks losing billions, intellectual property being siphoned off to China, and the rest of us finding protection impractical even with a computer science degree. The NSA and Five Eyes have uniformly blocked secure computing patents and production for many, many decades, believing that only they would ever be able to pull this crap. They thought a happy medium could be maintained in which they inhabited a broad plateau of competence no other actors could climb to. So here we are: the nature of computers, and humans, means that this is a “slippery peak” that even the NSA can’t scramble to stay on top of. Either the (undiscussable) innovations that can create real security are finally allowed by the US, genuinely, and everyone is secure (from remote attacks, not proximity attacks), including some bad guys some of the time, or very soon no one is.
We can do without our freedoms (maybe the slowing of innovation that would result isn’t all bad?) but we can’t do without a banking system.
Start with wiping out the AV industry by protecting computers properly instead of inflating stubs to be “unique malware” and doing away with “rules” which perpetuates the business model? Yikes!
Russia, China, blah blah blah. It’s not a scene, my Latin friend, it’s the arms race that destroys all.
BitLocker is not supported in Windows 7 Pro as the article states – REF: https://technet.microsoft.com/en-us/library/ee449438(v=ws.10).aspx#BKMK_HSRequirements
It IS available in Windows 8 Pro, however – REF: https://technet.microsoft.com/en-us/windows/jj737997.aspx
.
Additional information regarding ‘TrueCrypt’ (Page last edited on Feb 18, 2015)
https://www.grc.com/misc/truecrypt/truecrypt.htm
URL scanned via VirusTotal & URLVoid
.
TrueCrypt was my go-to for encryption for years. My personal thought is that it was so secure the gov could not break it, and the developers were shut up by big brother because there was no back door. I still use it! Also, in the past, when the OS became corrupt and I needed access to the data, I was able to decrypt the HD in a few days’ time.
I do not trust BitLocker or other big name encryption software.
The article advises disabling any guest accounts when using full-disk encryption, but enabling a guest account seems useful: if I lose my laptop, someone may come online using it, which offers me a way to find it again. The Apple menu for guest accounts says that when FileVault is enabled, guest users only have access to Safari. Does this mean I can have the benefits of both full-disk encryption and the guest account after all?
Whether you have a guest account or not, in the scenario you describe the good Samaritan would not even be able to reach the login screen that would give them access to the guest account.
The drive encryption passphrase is completely separate from the computer account password. If the user cannot decrypt your drive, they essentially cannot boot the computer, meaning a guest account is not reachable by anyone who does not know the drive encryption key.
This is unless you leave your computer powered on and the good Samaritan finds it before the battery dies. Then in theory a guest account may work in this way — but you’re better off just putting your email address on a sticker on the machine, so that you are not vulnerable to key theft and the unlikely good Samaritan can still return your device.
Thanks for this.
Actually, I think the guest account CAN be used without a password on my MacBook, even though I have FileVault enabled, as it appears on the login screen that is displayed immediately after I turn on my laptop. This is what Apple says about this: “When FileVault is turned on, guest users can only log in and use Safari. Guests can’t access your FileVault-encrypted disk or create files. Instead, they log in and use Safari from your computer’s built-in recovery disk.” (From https://support.apple.com/kb/PH11321)
So does this mean I can combine the guest account with meaningful disk encryption after all?
By the way, it’s not specifically for Good Samaritans that I’ve enabled the guest account. The point is that anyone who comes online using my computer would be traceable through a service like this: https://www.backblaze.com/lost_computer.html. I think Apple offers something similar, if you use iCloud.
Thanks for your response. This is actually the second time I’m responding to it, it seems my first response mysteriously disappeared in the vetting abyss.
It actually IS possible to use FileVault and have a guest account. In that case, the guest account only offers access to Safari. This is what the Apple support site says: “When FileVault is turned on, guest users can only log in and use Safari. Guests can’t access your FileVault-encrypted disk or create files. Instead, they log in and use Safari from your computer’s built-in recovery disk.”
Does this mean it’s possible to have both meaningful disk encryption and a guest account after all?
By the way, I haven’t set up the guest account specifically for Good Samaritans. The idea is that anyone, whether a Good Samaritan, a thief, or otherwise, who gets hold of my computer might use it, thus exposing the location of my laptop to software I can use to track my lost computer.
To anyone wondering what the answer to my question is: it turns out that it isn’t necessary to enable the Apple guest account to be able to track a lost computer. To do this, simply enable ‘Find My Mac’ in iCloud, as explained on http://www.peachpit.com/articles/article.aspx?p=2300563&seqNum=2. When combined with FileVault, the computer login screen will automatically be extended to feature a guest login option, which offers access to Safari only. When the computer connects to the internet through the guest account, Apple automatically sends you a notification of its location.
if you have a mac, you’re screwed. esp if you are not heavily modifying its services, and certainly if you use the machine the way you seem to be using it.
That’s VERY helpful Peanuts, thank you! I now understand just how bad a computer user I’ve been, and solemnly pledge to learn how to ‘mod’ and ‘dev’, using the command line only, as we all should. I’m quitting my job right now!
Asking what to do would’ve been fine. You didn’t need to be a sarcastic asshole.
You don’t need to do everything on the command line. Get a good app that lets you modify this sort of thing (Cocktail?). Get rid of your Thunderbolt port (use epoxy). Let guests use the computer only via a liveboot instead of the guest account. Having recovery (and that guest access) on the drive itself is de rigueur for Apple but risky. Consider switching to Fedora with a highly user-friendly interface (GNOME for example… Cinnamon perhaps). Should I go on? Or would you rather just assume I was trying to be a prick?
oh and get lil snoop and block the hell out of everything you don’t need connecting out or in. five minutes watching network traffic on a mac and you might notice osx is shamelessly promiscuous about the connections it opens. lil snoop btw is a firewall. i would be GLAD to tell you how to configure it properly. just ask.
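A quick way to see that for yourself, using tools that ship with OS X; a minimal sketch, where the interface name en0 is an assumption (adjust to yours):
  # list every process holding an open network connection
  sudo lsof -i -P -n | grep -i established
  # or watch live traffic on the wifi interface
  sudo tcpdump -i en0 -n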
oh and unless you are running the ultralatest yosemite you are vuln to remote exploitation. apple doesn’t see fit to patch even mavericks users. too much work. fedora and its ilk patch way way faster… and more importantly, if the vendor won’t patch or support, the community might.
Oh and there are NO open source fde crypto options for osx. you have two likely backdoored (and weak) options, generally speaking: filevault and mcafee’s pgpdisk. roulette on which is worse since: closed source.
btw if someone gets hold of your hardware and can’t access the os, chances are they will just reinstall. you may get that idiot who uses it to identify themselves, you may not. in the meantime you are also yourself walking around with what is essentially a personal tracking device by having FindMyMac and its ilk enabled. Maybe that doesn’t bother you though. I fully accept that possibility. I personally find it creepy, but I find foursquare checkins, cellphone gps tracking and supercookies creepy too.
Since I am the furthest thing from elitist and I don’t wish to insult you by not linking you to info on that magic rootpipe thing: http://www.forbes.com/sites/thomasbrewster/2015/04/09/apple-leaves-rootpipe-backdoors-in-3-per-cent-of-all-pcs-on-the-planet/
further reading and complexities:
http://www.reddit.com/r/netsec/comments/32g9jc/how_to_fix_rootpipe_in_mavericks_and_call_apples/
point being if your preferences make you non-secure (or worse) then you can choose ignorance (not an insult, I mean this in the dictionary sense), choose not to care, choose to mitigate as best you can, or choose to do far less work but take the hit of a bit of adjustment time and modify your preferences. There is no magic bullet. You can’t have security and an os that acts as its own nanny state and commercial enterprise based on your data.
I’d almost thank you, as this may actually be of use to someone, but you’ll find it helps not to call people ‘asshole’ when you want them to take your advice, especially after their sarcasm may have had something to do with the condescending tone you take.
Now as for your actual advice, I’m aware a lot of bad things can be said about the security and privacy policies of the Apples of this world. There’s obviously a trade-off between those issues and ease-of-use and it seems so hugely stacked in favour of the latter, that I’ve opted to stay in that corporate framework. But I do keep feeling uneasy about it, and try to at least mitigate the downsides, like deleting cookies, steering well clear of foursquare, using an anti-tracking browser plug-in, etc. At some point, I may find the motivation to go further, but for now it seems like the benefits of that don’t weigh up to the considerable effort required. I’ve looked into Fedora before though, so who knows, one day …
Only I wasn’t being condescending. I understand that it might be easy to see my response through the filter of “OS Wars”, but I generally find those sorts of people irrelevant, since usually their feelings about the OS come into play instead of the practicalities, and most of them have no knowledge of any OS’s underlying concepts and weaknesses. Pointing that out isn’t an insult, and many people have experience in other things I do not. I would certainly never, for example, get into an argument about golf or fashion (though I might about the physics of golf, or the sociological/psychological/anthropological/economic aspects of fashion). As much as I want to understand the world about me, there are some things that I would really need a reason to get interested in learning about, and not all of those reasons have to do with a paucity of time.
I apologise if it seems like I was insulting you but your tone was rather distressing, especially as I have spent hours on this post giving freely of my (professional) time, knowledge, and experience not because of any desire for thanks (and I use this username nowhere else; I don’t care about any of that) but pretty much because I find this article to be dangerously lax for something that claims to “mean it”.
I was probably too terse. The problem is you are putting yourself in a situation where you want to run OSX and use Apple hardware (and both have their elegances at times), but at the same time that means to be anything like secure you need to learn so so so much, and even then you still can be outsmarted (as could any expert; we can only aspire to be our best, and nobody is perfect at anything, but when it comes to security, and especially encryption, you only have to do one thing wrong and you lose). Most of us deal with this by figuring out who and what we trust, because we cannot know everything, audit everything, fix everything, know the unknowable unknown, or even keep up with the ungodly pace of our own ever-increasingly-complex and specialised forms of expertise.
Try Korora. It might be up your alley. You can even liveboot it from your mac hardware, and the fedora line (of which Korora is a part) is always quick with patches and (esp important with apple products) drivers. See if you can tolerate it for a few days. If you can, then I will gladly provide info on how to install it and lock things down a bit. You might prefer Mint, though if you are a hardware junkie who upgrades their machine often, you will likely run into driver problems.
Not to sound condescending (too soon to poke fun at myself?) but I really would suggest you do away with any DMA ports whatever you decide. From a physical-access standpoint you are far better off. As much as encryption is extraordinarily important, it really winds up being of not much use if you are caught out at the wrong time or in the wrong situation, or attacked by anyone (random or targeted) in the software sphere with enough skill and intention to care about your data, or if they get to your hardware itself and get to do things that are not in your best interest (as determined by you). DMA is a little bit like autorun was with Windows. Thunderbolt and FireWire can basically provide ways in that bypass the lock screen (and more) and enable people to run all sorts of nasty things.
And if you do stick with OSX, as much as I totally distrust Yosemite, your threat model probably calls for upgrading to the latest version of it if you haven’t already (though I stress I do not trust it). That bug I linked to is still somewhat exploitable even then, afaik. The model went wonky. Lately I have been thinking of Yosemite as the Vista of OSX. And the rapid release cycles that began with Yosemite just as other changes did in how upgrades are pushed worries me greatly, not only because of the lock-in.
“world about me” in the “everything surrounding everyone” sense, hopefully obviously. Semantic quirk, self-conscious about now given context, bad timing.
Let’s put our little spat to one side, I must’ve jumped to conclusions and overreacted.
For now, I’ll muddle along using OS X (the absolute latest version, I always make sure of that), but as I’ve said, perhaps, one day … Then, I’ll certainly look back at what you’ve written here, and I’m sure there are others who’ll find this useful as well. I’d like to properly thank you after all, even if that’s not what you’re after, your efforts really are much appreciated.
I never really even considered it to be a spat. I don’t really think in terms of spats… just a misunderstanding, which usually just makes me double down on trying to explain where I am coming from, and double-check my interpretation of where the other person is. If it is a matter of politics or something (arguably) more a matter of opinion, I may engage as a debate or I may drop it. If it is technical or involves a subject I would consider myself knowledgeable about, I will keep pressing until I at least get my points across. I think you are making mistakes, but that is your choice to make, as long as you make them going in knowing the risks and accepting them and their possible consequences. Anything more from anybody is too close to authoritarian.
I hope you will change your mind but it is more important to your own security to know your risks and be mindful of them. Maybe it’ll prevent you from using sleep instead of powering down, or not clicking a questionable link, or whatever might come down the line.
Good luck wherever life takes you :).
PS: I would be remiss if I did not mention you can multiboot also. :p
(and while I am not sure I deserve thanks, you are quite welcome)
Just put a sticker with your phone number on the bottom of your laptop. And glue your card under laptop battery. That’s it.
As always, in every article regarding full disk encryption, little or no detail is given regarding how Linux full disk encryption (in Ubuntu’s case, LUKS) really works and why you need it from installation (the fact that it is far superior to BitLocker and FileVault). Since it’s aimed at the broader audience, I don’t see much problem. It’s just interesting to see the same things happening almost every time you talk about FDE to the masses.
The bit about Linux is interesting. Linux natively uses partitions (avoiding for now advanced disk schemes like volumes and RAID). You should be able to install Linux to the root partition normally, then encrypt a user data partition which you could mount later. Or you can create a large encrypted file and then mount that like a partition (sketched below). Or you can run a virtual machine mounted from an encrypted root partition, then mount an encrypted data partition.
Or you could encrypt a 16 GB thumb drive. This keeps the data separate from your computer. The computer remains “clean” and all the secure data remains in one place. I like the thumb drives that look like keys. They fit neatly on your key chain and go where you go. If you lose it, it’s encrypted. For about $9, Kingston Digital DataTraveler SE9 16GB USB 2.0.
Linux is very flexible.
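A minimal sketch of the “encrypted file mounted like a partition” idea with LUKS; recent cryptsetup versions handle the loop device for you (older ones need losetup first), and the file and mapper names here are just examples:
  dd if=/dev/zero of=vault.img bs=1M count=1024        # a 1 GB container file
  sudo cryptsetup luksFormat vault.img                 # choose a strong passphrase
  sudo cryptsetup luksOpen vault.img vault             # appears as /dev/mapper/vault
  sudo mkfs.ext4 /dev/mapper/vault
  sudo mount /dev/mapper/vault /mnt
  # ... work with files under /mnt ...
  sudo umount /mnt
  sudo cryptsetup luksClose vault
The thumb-drive variant is the same commands pointed at the stick’s device (e.g. /dev/sdc1) instead of vault.img.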
I would advise against installing an unencrypted root partition with an encrypted user data partition. You should aim to have everything encrypted.
The reason is that, often without your knowledge, the OS will maintain metadata or caches outside of the expected user data locations. For instance, if you are downloading sensitive data from an email attachment using Firefox, Firefox may write the downloaded file to /tmp.
When you power down the device, your user data files may be encrypted, but the OS has leaked potentially devastating information to unprotected drive space.
You may think /tmp or other temporary directories are “deleted” by the OS, but there is a big difference between a “delete” and a “secure delete”. Typically, deleted data is still readable on the drive with forensic tools unless that region on the drive has been overwritten with other data. See the linux command “shred” for more information. (shred is also available on Mac I believe, can someone confirm?)
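For reference, a minimal example of the difference (GNU coreutils shred; the file name is a placeholder):
  # ordinary delete: removes the directory entry, but the data blocks stay recoverable on disk
  rm secret.pdf
  # secure delete instead: overwrite the file in place, finish with a pass of zeros, then unlink it
  shred -u -z secret.pdf
Caveat: on SSDs and on journaling or copy-on-write filesystems, overwriting in place isn’t guaranteed to hit every copy of the data, which is one more argument for encrypting everything in the first place.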
No, I don’t think encrypting the whole root partition is good advice. The HOME partition should be encrypted, and that can be done during (Ubuntu) or after (Debian) install. Linux is not so stupid as to leak “potentially devastating information” to unprotected drive space. For example, Firefox uses profiles, and a profile is stored in the home partition, with all the temp and cache-related files. The swap partition, where sensitive data could be leaked, is also encrypted by default when you choose to encrypt your home folder. Encrypting the root partition is useless.
It can be argued that a carefully installed system will not leak data to the root partition.
But it can also be argued that encrypting the root partition is harmless and therefore there is no net benefit to leaving it unencrypted.
Furthermore, LUKS provides you with some level of protection against tampering with the encrypted data (not perfect), but if you leave root unencrypted, you have no such protection whatsoever. A malicious actor with physical access to your machine could then modify OS files to compromise the LUKS key. Let’s say they modify your OS to perform a memory dump at boot time and hide that memory dump in unallocated blocks within the unencrypted partition. When the attacker returns to the device while you are away, they will then be able to decrypt the entire LUKS drive. This attack would not be noticeable to the user; maybe bootup will be slower than usual.
Why take chances like this if there is no real benefit to be gained from leaving the root partition open? Might as well encrypt.
I live boot your machine. I manually mount /dev/sda1 (or hda1, etc.). I replace your kernel with my kernel. I shut down your computer. I win.
You can actually, easily, encrypt /tmp separately. You can also hot-encrypt your swap partition on each boot (sketched below), which is something Micah didn’t even suggest. Sigh. And yes, you shouldn’t keep /boot local. /boot has your kernel etc. Any malicious actor can bludgeon you easily that way without even hitting the BIOS.
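A sketch of the throwaway-key swap setup on a Debian/Ubuntu-style system, assuming the swap partition is /dev/sda3 (adjust to your layout; /tmp can be handled the same way, or simply mounted as tmpfs):
  # /etc/crypttab: re-key swap with a fresh random key on every boot
  cryptswap  /dev/sda3  /dev/urandom  swap,cipher=aes-xts-plain64,size=256
  # /etc/fstab: use the mapped device instead of the raw partition
  /dev/mapper/cryptswap  none  swap  sw  0  0
Since the key is discarded at shutdown, whatever was swapped out is unrecoverable afterwards.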
srm is the native equivalent on osx. it’s a bit quirky imho but to its credit it can follow directory trees (unlike shred as distributed). or you can use an external tool to wipe files.
Thanks for shedding light on that issue. Given how small laptop disks are, an alternative to carrying your laptop around (rather than leaving it in your hotel room) is simply to remove the hard disk and slide it into your pocket. It won’t keep you safe from hardware key-logger installations behind your back, but the biggest threat to your data by far is the run-of-the-mill robbery of your laptop by hotel maids (it’s already happened to me) and, unfortunately, many hotel room safes are too small to hold a laptop.
Note typo: “In May of 2015, the security community went into shock…” (you want 2014)
I am disappointed that the EFF’s Diskcryptor — only solution you mention for the average laptop owner who doesn’t want to pay extra — is given only one line of prose, and the spinoffs mentioned by Anon receive none, while solutions from the major companies who must surely maintain backdoor access per national security letters and half-a-dozen other reasons seem to be in the forefront.
BitLocker? You can’t be serious! I’ll stick with Truecrypt – or one of the maintained spinoffs.
You do realize that BitLocker is a closed source product from the same company that brought you http://en.wikipedia.org/wiki/NSAKEY, right?
also, what happened to the elephant diffuser in windows 8? And why?
It’s good to point out the issues with Windows’ integrated encryption solution, BitLocker. For most things (losing your laptop or having it stolen by someone) it should work just fine.
However, much of the article talks about your laptop getting copied at the border, and if you’re worried about the government getting access to things on your laptop (or just don’t like the possibility), you should look elsewhere, as it should be assumed that a copy made at the border can be decrypted afterwards. BitLocker is Microsoft’s integrated solution for Windows, and we know that Microsoft has been (and presumably still is) one of the NSA’s best partners. Here’s a little of what we do know Microsoft has done; it’s safe to assume BitLocker is compromised as well:
http://www.theguardian.com/world/2013/jul/11/microsoft-nsa-collaboration-user-data
As others have pointed out, use something else on Windows if you care about the government being able to access your data.
100 percent agree with you! From one geek to another
Wow. You really recommend a proprietary, closed source product from Microsoft in order to keep something “safe”? Are you high?
First, I don’t believe that BitLocker has a backdoor. I’ve never seen credible evidence for this, even looking for it in the Snowden archive (if you have information I don’t have, please share it, but rants against Microsoft aren’t the same as evidence of a backdoor). Just because BitLocker is proprietary doesn’t make it malicious, it just means that it would be easier for Microsoft to hide a backdoor if they wished to. This is equally true for Apple’s FileVault.
The Snowden archive has shown us though that the CIA has been secretly working to attack TPMs in order to steal BitLocker keys, which I think is pretty strong evidence that the US government doesn’t already have a backdoor. Why would the CIA invest in breaking BitLocker if they already had the keys?
I believe that the US government attacks Windows computers the same way everyone else does, using software exploits. This is probably easy for them on many Windows computers because Microsoft shares information about vulnerabilities with the government before fixing them.
The problem with TC or VeraCrypt (an actively maintained fork of TC) is that they require you to disable secure boot and UEFI booting, they don’t work with how new versions of Windows partition their hard drives, they don’t offer the protection against evil maid attacks that BitLocker does with the TPM, and in general they don’t really work well with the future of Windows for full disk encryption. (But they still work fine for encrypted USB sticks or containers.)
I think that open source crypto is definitely better than proprietary crypto, but for the best user experience, and the best quality integration with the OS, it’s better to use the crypto that comes with the OS. In the case of Windows and Mac OS X, that means using proprietary crypto. If you’re extremely concerned that proprietary software has backdoors, then why are you running Windows or OS X to begin with?
“Why would the CIA invest in breaking BitLocker if they already had the keys?”
The CIA might be doing this because they don’t know about or have access to the keys held by the NSA in partnership with Microsoft/BitLocker. This was already occurring with regard to other things the NSA was doing that the CIA didn’t know about (presumably too secret, or just not sharing enough). I wouldn’t trust BitLocker because of the Microsoft angle, and also because it comes from an American-based, security-related software company… JMHO; why take the chance if you care about the govt being able to take away your privacy?
http://arstechnica.com/tech-policy/2015/04/25/cia-couldnt-fully-use-nsa-spy-program-as-most-analysts-didnt-know-about-it/
Micah,
I agree with you that BitLocker might not have any explicit backdoors embedded in it. That’s not to say that Microsoft would not put design flaws in it to facilitate attacks. For instance, they might make side-channel attacks easier to carry out by intentionally making design choices, or even crypto primitive implementations, that facilitate these attacks. They could even make it only minimally vulnerable, so that only government actors could exploit it. I believe this should be mentioned and be an integral part of the threat model. I see many people recommending BitLocker but not mentioning these risks. Also, the fact that you need to engage in registry hackery to enable a PIN should dissuade most people from implementing one. These things should be weighed when suggesting it.
“First, I don’t believe that BitLocker has a backdoor.”
“..which I think ”
“I believe that the US government attacks Windows computers..”
“…I think that that open source crypto”
The last time I heard such strong belief was in church… where they believe in an invisible man called god ;-)
What you “believe” or “think” is really your personal problem, my friend! Some people believe their food is handpicked and that cows live happily on green meadows… but we all should KNOW by now that believing is another term for NOT KNOWING! And that should be the main point in security!
So,.. pls go back to church and stay out of science matters… You really don’t help the cause with your unqualified comments, you’re just confusing people.
For all others, do NOT use BitLocker (for all the reasons mentioned by the security community), do NOT use Windows or Apple products AT ALL! And get rid of your mobile, as that is the main source for the government(s) to get all of your data and privacy info. TrueCrypt IS proven to be SAFE! And any other software lacks this proof so far, as far as I know.
But if you like to stay in your bubble and ignore all warnings… well, don’t say you didn’t know when the men in black are coming to pick you up for “questioning”, or because you post all your shit on twitter and facebook… Democracy is just a myth! And if you don’t care, no one can help you!
BTW: The Snowden files are mostly 10 years old… do you believe they made no progress since then ??? Then maybe you should google for “Bluffdale”.
:D
The real problem isn’t necessarily backdoors (most likely put there by planted programmers), although that happens; it’s full source code access by a government, allowing easier exploits, something the great hard drive firmware scandal just shone a bright light on. And of course, leakage of that access.
nomentanus, I am pleased to have linked our surveillance contact chains. ;)
I would say as a rule of thumb, if your threat model includes police or government actors, then you need to get serious and drop Microsoft altogether.
If your threat model does not include police or governments, just typical thieves, Bitlocker is better than nothing.
But if you are doing anything at all serious, please find a Linux distro.
openbsd, netbsd, even smaller attack footprints.
btw it still niggles me that we are talking about encryption but no one has mentioned that linux has grsec/SELinux/PaX, which go a long way when you are dealing with adversaries post-boot-up. If you’re willing to give linux a go you may as well do it right (see the sketch below).
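On Fedora and friends, the “do it right” check is quick; a minimal sketch (grsec/PaX are separate kernel patches and out of scope here):
  getenforce                 # should print "Enforcing"
  sestatus                   # fuller report: mode, loaded policy, etc.
  # if it isn't enforcing, edit /etc/selinux/config and set:
  #   SELINUX=enforcing
  # then reboot (or run "sudo setenforce 1" for the current session)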
:^) !!!
Good article but I am surprised to see no mention of any of the truecrypt forks. I use veracrypt and trust it over bitlocker for the obvious reasons. Open source, actively maintained, no involvement of a known NSA partner like microsoft. Would like to hear Micah’s thoughts on this.
I don’t trust veracrypt. While it may or may not be ‘ok’ now, given their dev model, I doubt it’ll be okay later.
Can you be more specific about your objections to their dev model?
I would prefer not to get too specific. I keep rewriting this reply.
This is a bizarre resume for a crypto person btw: https://fr.linkedin.com/in/idrassi .
I don’t know the guy, but a better question is usually why you should trust, not why you shouldn’t. I could pick on his company being trusted to do secret work for Gemalto, but I won’t, because that’d be ridiculous.
But I am disturbed at someone long-run implementing crypto who doesn’t know discrete from discreet. Considering the nature of both words and their relation to crypto.
Aside from a few tools he has a lot of noise and no real info out there.
Beyond that, I do not believe he has the background to do this well, and the TC developers did NOT give permission to fork it. That is the safest and least controversial answer I can offer. At a minimum I would expect accidental misimplementation. It just isn’t worth the risk (certainly not yet) of trusting it.
Don’t do Cryptoshed either — it adds the awfulness of a plethora of contributors.
Stick with the original last full version of TC for containers.
Sorry this answer isn’t great. I may be persuaded to get more technical and do some auditing but I am exhausted and his dev cycle isn’t what I would call stable or well-tested.
Thanks. Your main reason appears to be lack of faith in the developer’s abilities. Unfortunately I have no idea about this person, so can’t say anything about that.
I trust veracrypt for now because TC was audited and I am assuming that the changes since then have been looked over by others apart from Mounir. Going forward, my takeaway is that for trust the best solution is to have regular independent audits (presumably by the same group that audited TC). Ideally any such audit would include not just veracrypt but also any other commonly used TC fork, or even other open-source disk encryption software.
Unless or until Veracrypt offers a FDE replacement, I wouldn’t bother auditing something that changes so often. The more you change a well-written piece of crypto code that works, the more the sneak can become snuck, either accidentally or deliberately. Why a rolling dev cycle? What “features” do you need to be continuously updated with a crypto program? Aside from repairing bugs, and I would argue there were very few worth bothering with by 7.1a, you should NOT be mucking with working encryption code. What good reason could you possibly have?
(so no, it isn’t (just) “a lack of faith in the developer’s abilities”)