Recently, I wrote a guide explaining how to encrypt your laptop’s hard drive and why you should do so. For the benefit of Windows users, I gave instructions for turning on BitLocker, Microsoft’s disk encryption technology.

This advice generated an immediate backlash in the comments section underneath the post, where readers correctly pointed out that BitLocker has been criticized by security experts for a number of real and potential shortcomings. For example, BitLocker’s source code is not available for inspection, which makes it particularly vulnerable to “backdoors,” security holes intentionally placed to provide access to the government or others. In addition, BitLocker’s host operating system, Microsoft Windows, provides an algorithm for generating random numbers, including encryption keys, that is known to have been backdoored by government spies, and which the company’s own engineers flagged as potentially compromised nearly eight years ago. BitLocker also lost a key component for hardening its encryption, known as the “Elephant diffuser,” in the latest major version of Windows. And Microsoft has reportedly worked hand-in-glove with the government to provide early access to bugs in Windows and to customer data in its Skype and Outlook.com products.

Even knowing about these issues, I still believed BitLocker was the best of several bad options for Windows users; I’ll explain my reasoning on this later.

But in the meantime, something interesting has happened: Microsoft, after considerable prodding, provided me with answers to some longstanding questions about BitLocker’s security. The company told me which random number generator BitLocker uses to generate encryption keys, alleviating concerns about a government backdoor in that subsystem; it explained why it removed the Elephant diffuser, citing worries over performance and compatibility that will appease some, but certainly not all, concerned parties; and it said that the government-compromised algorithm it bundles with Windows to generate encryption keys is, by default, not used at all.

Significant questions remain about BitLocker, to be sure, and because the source code for it is not available, those questions will likely remain unanswered. As prominent cryptographer Bruce Schneier has written, “In the cryptography world, we consider open source necessary for good security; we have for decades.” Despite all of this, BitLocker still might be the best option for Windows users who want to encrypt their disks.

Today I’m going to dive deep into the concerns about BitLocker and into Microsoft’s new responses. I’m also going to explain why more open alternatives like TrueCrypt don’t resolve these concerns, and take a brief look at proprietary products like BestCrypt, which Schneier recommends.

This is going to be a fairly technical post. But it’s important to explore the current state of BitLocker because Windows remains the most popular operating system for personal computers and because interest in BitLocker has only grown in the wake of documents from NSA whistleblower Edward Snowden showing widespread U.S. government surveillance. At the same time, fears about BitLocker have also been stoked by the Snowden cache, which exposed a carefully orchestrated and apparently successful attempt by the National Security Agency to compromise international encryption-related standards, including one that’s part of Windows to this day.

Why people worry about BitLocker

If you can trust Microsoft, BitLocker has always been awesome. For example, Microsoft is well ahead of competitors like Apple in making BitLocker verify that an attacker hasn’t modified the software used to boot the computer. Without such protection, hackers can rewrite the boot-up code, impersonate the operating system, and trick people into unlocking the disk so malware can be installed, a technique known as an “evil maid” attack. Mac OS X and Linux’s disk encryption systems are entirely vulnerable to this attack, but Windows, when running BitLocker, is not.

Of course, a great many people, particularly in information security circles, do not trust Microsoft; these people worry that BitLocker’s advanced technology is meant to distract people from the company’s cozy relationship with the government, and that any data “secured” using BitLocker could be handed over to spy agencies or law enforcement.

Here are three more specific concerns those people have about BitLocker — concerns I have shared. With each, I’ve included Microsoft’s response. It should be noted that the company was not initially forthcoming with this information; a spokesperson responded to a set of questions based on these worries by saying the company had no comment. To Microsoft’s credit, the company later reversed this position.

They fear BitLocker’s encryption keys are compromised by default. They’re not.

Encryption relies on random numbers. For example, when you enable BitLocker for the first time, you need to create an encryption key, which is just a random number within a specific range. You can think of a 128-bit key, the kind used by BitLocker by default, as a random number between 0 and 2^128 (it would take 39 digits to write out that full number). The security of a 128-bit encryption key comes from the fact that there are just too many possible numbers in that range for an attacker to ever try them all.
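To make the size of that range concrete, here is a short Python sketch. This is purely an illustration — BitLocker’s actual key-generation code is not public — using Python’s standard `secrets` module as a stand-in:

```python
import secrets

# A 128-bit keyspace is just the range [0, 2**128).
keyspace = 2 ** 128
print(len(str(keyspace)))  # 39 digits, as noted above

# Drawing one key uniformly from that range with a CSPRNG:
key = secrets.randbelow(keyspace)
```

Every run produces a different, unpredictable value of `key`; the security argument is simply that 39-digit numbers are too numerous to enumerate.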

When BitLocker generates a key, it asks your computer for a random number within that range. But where does this number come from? This is an enormous problem in the fields of cryptography and computer science because computers are, by their very nature, deterministic, not random: programs act the exact same way every time you run them because they’re executing the exact same set of instructions. But getting real random numbers is critically important. If an attacker can predict which random number your computer chooses, then that attacker can break the encryption that relied on that number. So when you ask your computer for a random number, it uses a cryptographically secure pseudorandom number generator (CSPRNG, or just PRNG) to generate one for you.
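The difference between a deterministic PRNG and a CSPRNG seeded from real entropy can be shown in a few lines of Python (again, an illustration only — the Windows internals are not public):

```python
import random
import secrets

# A seeded, deterministic PRNG: anyone who learns the seed can
# reproduce every "random" number it will ever emit.
a = random.Random(1234)
b = random.Random(1234)
assert a.getrandbits(128) == b.getrandbits(128)  # identical streams

# A CSPRNG drawing on operating-system entropy: there is no single
# seed for an attacker to learn or guess.
k1 = secrets.token_bytes(16)  # 128 bits of key material
k2 = secrets.token_bytes(16)
assert k1 != k2  # a collision is astronomically unlikely
```

An attacker who knows (or can predict) the seed of a deterministic generator effectively knows every key it produces, which is why key generation must come from a cryptographically secure source.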

One such PRNG is not actually cryptographically secure and is almost certainly compromised by the NSA: Dual_EC_DRBG, or Dual Elliptic Curve Deterministic Random Bit Generator, an algorithm blessed by the National Institute of Standards and Technology in 2006 — and it’s built into Windows. If an encryption key for a system like BitLocker is generated by a compromised PRNG, the owner of the backdoor could figure out the key through sheer repetitive guessing, a so-called “brute force” attack, in a much shorter amount of time: minutes, hours or days rather than the billions of years required to figure out a key generated by a secure PRNG.
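The scale of the difference is easy to check with back-of-the-envelope arithmetic; the guess rate below is an arbitrary, deliberately generous assumption:

```python
# Assume an attacker who can somehow test a trillion keys per second.
guesses_per_second = 10 ** 12
seconds = 2 ** 128 / guesses_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.2e}")  # on the order of 10**19 years
```

A backdoored PRNG collapses that search: if the backdoor lets the attacker narrow the candidates to, say, 2^40 possible keys, the same hypothetical hardware finishes in about a second.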

In 2007, Niels Ferguson, a Microsoft cryptographer who worked on BitLocker, along with Dan Shumow, another Microsoft engineer, gave a presentation pointing out that Dual_EC_DRBG might have a backdoor. In 2013, the New York Times, ProPublica, and The Guardian, drawing on documents provided by Snowden, reported that the algorithm did indeed contain an NSA backdoor. In the documents, the NSA wrote about the “challenge and finesse” involved in pushing a system it had engineered onto standards groups and bragged that it became “the sole editor” of the standard that eventually emerged.

Microsoft told me that while the backdoored algorithm is included with Windows, it is not used by BitLocker, nor is it used by other parts of the Windows operating system by default.  According to Microsoft, the default PRNG for Windows is an algorithm known as CTR_DRBG, not Dual_EC_DRBG, and when BitLocker generates a new key it uses the Windows default.

“It has never been the default, and it requires an administrator action to turn it on,” a Microsoft spokesperson told me.

So BitLocker keys appear to be generated in an entirely secure way.

They fear the Elephant diffuser was removed to make BitLocker weak. Microsoft says it’s because Elephant is slow. BitLocker remains weakened.

While Microsoft has now reassured users that the random numbers used to secure BitLocker are secure, it is still worrisome that the company removed an important security component from BitLocker’s architecture.

When BitLocker was first rolled out in late 2006 and early 2007 as a feature of Windows Vista, it used a well-known cipher, or encoding engine, called AES-CBC, along with something called the Elephant diffuser. Ferguson published a paper explaining that without the diffuser, AES-CBC is “not suitable” because “it should be relatively easy to mount an attack.”

The Elephant diffuser plays an important role in protecting an encrypted disk against modification by an attacker. Allow me to explain: An encrypted disk is full of scrambled bits (zeroes and ones), but once the disk is unlocked, those bits get unscrambled to make up meaningful files, including the programs that constitute Windows. Without the Elephant diffuser, an attacker with physical access to the encrypted disk and with knowledge of exactly where on the disk target files are located could modify specific scrambled bits, which will in turn modify the targeted files in an exact way when the disk is later unlocked. For example, they could modify one of the programs that runs while Windows is booting up to be malicious, so that the next time the user unlocks the disk and boots Windows, malware automatically gets installed. The Elephant diffuser prevents this attack from working. With the diffuser, an attacker can still modify scrambled bits, but doing so will prevent them from having fine-grained control over exactly what changes they make when the disk is unlocked. Rather than being able to make specific programs malicious, they are more likely to scramble large chunks of programs and simply cause the computer to crash instead of getting hacked.
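The malleability described above is a structural property of CBC mode, and it can be demonstrated with a toy cipher in a few dozen lines of Python. Everything here is illustrative: the per-byte substitution below is wildly insecure and merely stands in for a real block cipher (with real AES, the entire preceding block would be scrambled rather than a single byte), but the precise bit-flipping effect on the targeted block is exactly the same:

```python
import random

BLOCK = 16  # 128-bit blocks, like AES

def make_sbox(key):
    # Toy, insecure per-byte substitution standing in for a block
    # cipher; only the CBC chaining structure matters for this demo.
    perm = list(range(256))
    random.Random(key).shuffle(perm)
    inv = [0] * 256
    for i, p in enumerate(perm):
        inv[p] = i
    return perm, inv

def cbc_encrypt(plaintext, key, iv):
    sbox, _ = make_sbox(key)
    out, prev = [], iv
    for i in range(0, len(plaintext), BLOCK):
        block = bytes(p ^ c for p, c in zip(plaintext[i:i + BLOCK], prev))
        prev = bytes(sbox[b] for b in block)
        out.append(prev)
    return b"".join(out)

def cbc_decrypt(ciphertext, key, iv):
    _, inv = make_sbox(key)
    out, prev = [], iv
    for i in range(0, len(ciphertext), BLOCK):
        c = ciphertext[i:i + BLOCK]
        block = bytes(inv[b] for b in c)
        out.append(bytes(b ^ p for b, p in zip(block, prev)))
        prev = c
    return b"".join(out)

iv = bytes(BLOCK)
pt = b"run TRUSTED.EXE " + b"some other data."
ct = bytearray(cbc_encrypt(pt, key=42, iv=iv))

# The attacker flips one ciphertext bit in block 0, which flips the
# matching plaintext bit in block 1 -- without knowing the key.
ct[4] ^= 0x10
tampered = cbc_decrypt(bytes(ct), key=42, iv=iv)

print(tampered[BLOCK:])  # block 1: exactly one bit changed, predictably
print(tampered[:BLOCK])  # block 0: corrupted as a side effect
```

Because decryption computes each plaintext block as “decrypt this ciphertext block, then XOR with the previous ciphertext block,” flipping a bit in ciphertext block n flips the same bit in plaintext block n+1. A diffuser like Elephant mixes the whole disk sector together so that this kind of surgical edit turns into unusable garbage instead.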

But in October 2014, cryptographer Justin Troutman noticed that the version of BitLocker in Windows 8 silently removed the Elephant diffuser even though it still uses AES-CBC. Microsoft’s technical overview of BitLocker lists the Elephant diffuser as “removed or deprecated,” with no further explanation.

“When I discovered it was removed, I was a bit perplexed, simply because there was no public announcement that I could find, despite the effort that was put into building it,” Troutman says.

Schneier was also concerned. “It makes no sense,” he told me. “The Elephant diffuser was a good idea.”

Microsoft says the diffuser was too slow and kept BitLocker from being activated by certain users, including government contractors and agencies that must comply with Federal Information Processing Standards, or FIPS. “[The Elephant diffuser is] not FIPS compliant, so certain companies and government clients can’t use it,” a spokesperson says. “It’s not supported by hardware acceleration, thereby impacting performance on low-powered devices.” The company did not provide answers when I asked if Microsoft has plans in the future to add another diffuser to replace the one they removed.

While removing the Elephant diffuser might help make BitLocker faster and more compatible with use within government, it does make BitLocker more vulnerable to attack — according to Microsoft’s own engineers. Again, it was Ferguson, then and currently a Microsoft cryptographer, who in 2007 wrote with another Microsoft engineer that with BitLocker’s cipher, AES-CBC, and without a diffuser, “it should be relatively easy to mount an attack … [AES-CBC in BitLocker] is not suitable, due to the lack of diffusion in the CBC decryption operation.”

Removing the Elephant diffuser doesn’t entirely break BitLocker. If someone steals your laptop, they still won’t be able to unlock your disk and access your files. But they might be able to modify your encrypted disk and give it back to you in order to hack you the next time you boot up.

To be fair, LUKS, the disk encryption technology used in Linux, used to be vulnerable to this same type of attack by default. This changed in early 2013 when LUKS switched from using AES in CBC mode (the same mode BitLocker uses today) to AES in XTS mode, which prevents this attack.

They worry Microsoft will betray its users again. Microsoft says it will comply with lawful requests.

While it’s helpful that Microsoft is addressing specific concerns about BitLocker, it’s possible to look at the company’s track record and decide you cannot trust the company in general. In particular, it’s not clear how much users who want to keep their information out of the hands of the government can trust Microsoft, which has a history of working with U.S. law enforcement and spy agencies.

Drawing on Snowden documents, in June 2013 the New York Times disclosed the existence of a secret program called Project Chess, run by “fewer than a dozen” Skype employees after eBay bought their company, but before Microsoft acquired it. Project Chess was designed to “explore the legal and technical issues in making Skype calls readily available to intelligence agencies and law enforcement officials,” the newspaper wrote. The Times pointed out that after Microsoft purchased Skype, Skype denied changing its architecture to make it easier for law enforcement to spy on its users — without disclosing that Skype’s architecture was already designed to do this.

Likewise, in July 2013 The Guardian reported that Microsoft “has collaborated closely with U.S. intelligence services to allow users’ communications to be intercepted, including helping the National Security Agency to circumvent the company’s own encryption.” In this case, Microsoft helped the NSA access web chats and email from the Outlook.com portal.

Microsoft responded to these allegations at the time in a blog post explaining that it doesn’t give the government unfettered access to user data, and that it only complies with valid and specific legal requests. Asked about instances in which Microsoft built methods to bypass its security and about backdoors generally, a company spokesperson told me that Microsoft doesn’t consider complying with legitimate legal requests backdoors.

In addition to sometimes sharing user data from Skype and Outlook.com, Microsoft also reportedly shares information on bugs with security implications. Such bugs, before they are fixed, can be used in much the same way as backdoors. In fact, in many situations disguising a backdoor as a security bug is a great way to hide it because it provides plausible deniability. If your backdoor is ever discovered, you can claim that it wasn’t a backdoor at all but rather a bug that you didn’t know about. Bloomberg reported in 2013 that “Microsoft Corp., the world’s largest software company, provides intelligence agencies with information about bugs in its popular software before it publicly releases a fix.” These bugs, if weaponized, could be used to access any computer running vulnerable Microsoft products.

A Microsoft spokesperson said that Bloomberg’s reporting referred to Microsoft’s Government Security Program, in which the company works with national governments to “help them build and deploy more secure IT infrastructure and services that further protect their citizens and national economies.” This program includes access to source code for key Microsoft products — so governments can check it for backdoors, the spokesperson told me — as well as “vulnerability and threat intelligence” from Microsoft. Microsoft says its intention is to be transparent, not to aid spy agencies in making malicious software. But it’s easy to imagine how a national government could repurpose Microsoft’s data in ways Microsoft may not have intended. And it’s worth noting that Microsoft’s transparency is only afforded to powerful national governments rather than to regular users.

I asked Microsoft if the company would be able to comply with unlocking a BitLocker disk, given a legitimate legal request to do so. The spokesperson told me they could not answer that question.

What about TrueCrypt?

For all of the concerns around using BitLocker, there is an extremely popular and apparently cryptographically solid alternative called TrueCrypt. The program has many fans; after I wrote my last column on disk encryption, TrueCrypt advocates inundated me with comments and tweets arguing that I should have recommended it rather than BitLocker because it’s open source, and because they believe BitLocker has a backdoor.

TrueCrypt has been around for more than a decade and its high-profile users include Snowden, who was spotted teaching people how to use it at a CryptoParty in Hawaii before he was widely known as an NSA whistleblower. TrueCrypt works in Windows, Mac OS X and Linux, and it’s able to encrypt USB sticks, hard disk partitions, and to create encrypted containers, which are files that can securely store other files inside of them. You can also download and inspect the source code.

Windows users get one extra feature, arguably TrueCrypt’s most important one, called “system encryption,” which is TrueCrypt’s name for full-disk encryption, disk encryption that’s applied to the hard drive used to start your computer. Before BitLocker existed, TrueCrypt was being used to encrypt Windows XP systems, and even after BitLocker was introduced, TrueCrypt remained popular through the late 2006 and early 2007 release of Windows Vista and the 2009 release of Windows 7 because, unlike BitLocker, it could be used with the cheapest editions of Windows.

What’s more, TrueCrypt’s security has been publicly audited, and there are no signs of backdoors or major security issues.

But there’s a hitch: With the release of Windows 8, TrueCrypt became painful to use for full-disk encryption. If you’ve bought a PC since 2012, when Windows 8 came out, chances are you can’t use TrueCrypt to encrypt your hard disk without jumping through quite a few hoops. You may need to figure out how to open your boot settings and disable security features to get it working, or format your hard disk to use a different, older system for organizing the disk that’s compatible with TrueCrypt. To put it bluntly, TrueCrypt is Windows XP-era software. As modern PCs and the Windows operating system evolved, TrueCrypt stayed in the past.

Part of the problem is that TrueCrypt is locked out by the very boot-up system that helps make BitLocker so secure. When you power on a Windows 8 computer, it runs a chunk of code that interfaces between the computer’s hardware and its operating system. This code runs a security check to make sure none of the start-up software has been tampered with by a hacker; Microsoft code passes the check, since it’s been cryptographically marked as “trusted,” but software that isn’t marked this way, like TrueCrypt, fails the check, preventing the computer from starting. On most PCs it’s possible to turn off the security check, but this involves tinkering with hard-to-reach security settings, and the instructions for configuring your boot-up code are different on pretty much every computer.

If this weren’t bad enough, TrueCrypt and its derivatives only support encrypting disks that use outdated partition tables. A partition table describes the different sections a hard disk has been split into. Older systems used Master Boot Record (MBR) partition tables, but newer computers come formatted with a GUID Partition Table (GPT) — a system TrueCrypt does not work with.
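For the curious, telling the two partition-table formats apart is straightforward: a GPT disk carries the ASCII signature “EFI PART” at the start of LBA 1 (byte offset 512 on common 512-byte-sector disks), while a plain MBR disk ends its first sector with the boot signature 0x55AA. A minimal sketch against an in-memory disk image (reading a real disk device would additionally require administrator privileges):

```python
def partition_style(disk_bytes, sector_size=512):
    # Check GPT first: GPT disks also carry a "protective MBR" with a
    # valid 0x55AA signature, so the MBR test alone would be ambiguous.
    if disk_bytes[sector_size:sector_size + 8] == b"EFI PART":
        return "GPT"
    if disk_bytes[510:512] == b"\x55\xaa":
        return "MBR"
    return "unknown"

# Synthetic example: a blank "disk" with a GPT header at sector 1.
fake_disk = bytearray(1024)
fake_disk[512:520] = b"EFI PART"
print(partition_style(bytes(fake_disk)))  # GPT
```

TrueCrypt’s full-disk encryption only understands the MBR case, which is why it fails on most machines sold since Windows 8.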

There’s little prospect these problems will go away anytime soon. In May 2014, at the same time that Microsoft officially stopped supporting Windows XP, TrueCrypt’s developers publicly abandoned the project and TrueCrypt’s website was replaced with instructions for migrating to BitLocker.

The shutdown erodes more than just TrueCrypt’s compatibility with Windows; security is at stake, too. If someone finds a security bug in TrueCrypt in the future, this bug will never get fixed.

Two new projects have forked off of the TrueCrypt codebase — VeraCrypt and CipherShed. Both are under active development. But both still suffer from all the same Windows full-disk encryption issues that TrueCrypt suffers from (though for cross-platform encrypted USB sticks and file containers, they still work great).

Even if these projects eventually fix these issues and gain support for modern PCs, their development is hampered by TrueCrypt’s licensing terms, which don’t qualify as either “open source” or “free software” under standards laid down within those programming communities. Since VeraCrypt and CipherShed are forks of TrueCrypt, they’re forever locked into this unfortunate license. Because TrueCrypt and its offshoots don’t meet standard definitions of free and open source software, none of the popular distributions of the Linux operating system include them in their software packaging repositories, an omission that makes installation and updating a pain for Linux users. TrueCrypt’s non-standard license and lack of Linux packaging makes free and open source advocates hesitant to throw their weight behind it, which may slow down development of bug fixes and features for all operating systems, including Windows.

“This business is all about trust”

There’s no reason the discussion of Windows encryption should be confined to BitLocker, TrueCrypt and TrueCrypt’s offshoots.

When I began talking to Schneier about full-disk encryption for Windows, he told me about a product called BestCrypt. Like BitLocker, it isn’t open source or available free of charge. Unlike BitLocker, the company that develops it doesn’t have a public history of making its products accessible to law enforcement and spies. And unlike TrueCrypt, VeraCrypt or CipherShed, BestCrypt has great support for modern Windows computers: It supports the newest versions of Windows, Microsoft-sanctioned security checks at boot time, and modern hard-drive formats.

Considering Schneier has been outspoken for decades about the importance of open source cryptography, I asked if he recommends that other people use BestCrypt, even though it’s proprietary. “I do recommend BestCrypt,” Schneier told me, “because I have met people at the company and I have a good feeling about them. Of course I don’t know for sure; this business is all about trust. But right now, given what I know, I trust them.”

There are other full-disk encryption options for Windows as well, such as Symantec Endpoint Encryption (proprietary) and DiskCryptor (open source).

Every single option for disk encryption involves a trade-off between quality and transparency. No product is perfect. For all their transparency, open source projects have recently had some critical security issues surface, and many don’t have the resources to hire a team of security engineers like Microsoft does. Open source projects also tend to be harder to use, and if average users can’t get an encryption product to work, they’re not going to use it. On the flip side, a piece of easy-to-use encryption software can be insecure, especially if complying with law enforcement requests is built in to the design, as in the case of Skype.

Balancing trust, ease of use, transparency, apparent robustness, compatibility and resources for squashing bugs, BitLocker comes out ahead for the average user. BitLocker has the home-field advantage over the competition. Microsoft will make sure that BitLocker works great on every Windows device, and fresh installs of Windows 8.1 already turn on BitLocker by default if the computer has the right hardware. If the trend continues, in the future you won’t be able to buy a Windows device that isn’t already encrypted. If we can trust Microsoft not to include backdoors in Windows, this is great news.

Based on what I know about BitLocker, I think it’s perfectly fine for average Windows users to rely on, which is especially convenient considering it comes with many PCs. If it ever turns out that Microsoft is willing to include a backdoor in a major feature of Windows, then we have much bigger problems than the choice of disk encryption software anyway.

Whatever you choose, if trusting a proprietary operating system not to be malicious doesn’t fit your threat model, maybe it’s time to switch to Linux.

Updated to reflect clarifying follow-up comments sent by Microsoft on its definition of “backdoor.” June 4 3:00 pm ET

Correction: This post originally said LUKS encrypted disks are, by default, vulnerable to the same attack as BitLocker without the Elephant diffuser, but this isn’t true anymore. LUKS changed its defaults in early 2013 to be secure against this attack. June 6 2:20 pm ET

Photo: Thomas Trutschel/Photothek/Getty


Contact the author: Micah Lee


After reflecting on this a couple of days, Micah, and reading a recent article, I think I’ve come around to what you’re saying – very good article. The biggest problem / most likely issue for a normal Windows (laptop) user is that they’ll lose their machine or get it stolen. On a non-encrypted drive, bad people can set to work on the machine and get in fairly quickly with no heavy lifting (boot in via a USB Linux drive and local users can literally have their passwords reset or wiped, and users tied to Microsoft cloud accounts can get compromised too – then they have access to your cloud stuff as well).

An encrypted drive would stop this dead. Using BitLocker on your laptop would be good for this (probably) most likely problem a user could be faced with.

That said, we know Microsoft is a “partner” with the NSA – so the user should expect that the encryption key you generate is passed back to Microsoft for storage / reference when it’s generated (that’s why you need internet access to complete the process) – originally this was probably done to handle problems for users, but of course the NSA loves having these at the fingertips of its NSLs (although based on what we’ve seen with the NSA, they’ve probably stolen all of them if Microsoft didn’t just give them constant access). If the user doesn’t want to be open to the U.S. government, then they need to use something else (and for newer tech that TrueCrypt won’t cover, if Schneier recommends something (BestCrypt), that’s a pretty good endorsement).

And what about key recovery and social engineering and other stuff? Again, you’re looking at the picture as though most people turn off their machines — and most people not only don’t, but they generally are most likely to get their machines stolen when it’s both turned on and not even too far from their possession — not from their house, in the dead of night (and even then, people rarely log off or even disconnect their wifi these days).

(that’s not the only thing, merely one or two of them).

I don’t believe in BitLocker, so I personally moved to another app that has complete TrueCrypt support.

I was a PM of bit locker in a former life.

Here is what I can say for certain, yes – we built ‘bit locker’ for consumers not law enforcement. So, one of the considerations is key recovery in the event customers required recovery. This of course can and will be used to support law enforcement as required by law, but it was intended to support customers with data recovery. And on that point:

Now, I also happen to own a hard drive encrypted with ‘bit locker’ that had a bug (god knows what the bug is, but I will find it), and this drive remains unrecoverable now for almost a decade. I actually believe this is a bigger issue – namely that bugs in bit locker can result in *complete data loss*!!!! (I consider this a bug; under no circumstance should data destruction be acceptable unless explicitly ‘desired’ by a consumer.)

Only in the most extreme situations is such security required, because very few of us have a profile where our data being destroyed is preferable to our ‘secrets’ being exposed. While I will freely admit this includes most people in the security community, the world at large is much, much larger than the security community and does not face the same threats or require the same controls.

Therefore, if you ask me it is a complete failure.

Your mileage may vary.

Are you still playing the game? You sound like a stooge.

Dear Micah:

Don’t forget about NSAKEY.

To be honest, I think you erred by saying “They fear BitLocker’s encryption keys are compromised by default. They’re not.”. You don’t know that. They may be compromised by default. All you know is that they aren’t compromised by dint of Dual_EC_DRBG.



– If Microsoft implements backdoors in its products, you can be sure it’s because it is forced to by secret laws or executive orders. If this is the case, one can assume that any company of a certain size, or at the very least a company with a sizeable amount of activity in the USA, will be under the same kind of obligations.
– There are very few reasons to believe Linux is more secure or more backdoor-free than Windows just by the fact that it is open source. First, Microsoft gives access to its code to big states. Second, it is quite easy to modify open source code. Third, Linux is very, very big; nobody checks every line of code or file on a regular basis. Just look at the recent SSL bugs and such that were in full sight for more than a decade.

Not saying BitLocker is above reproach as I have not studied it in-depth, but assuming Microsoft always has evil intentions is as dangerous as assuming Linux is the alpha and the omega of correctness and security.

Without full access to the Bitlocker source, absolutely nobody can study it in depth. And of the people who might have full access to the Bitlocker source, how many of them are independent? And have the ability to understand the code?

There’s no excuse for closed-sourcing your crypto. No one’s gonna up and take your encryption software, which only works on their OS, and use it on another OS (*caveat, someone might write code making it possible to mount BitLocker volumes on another OS, god forbid).

I wouldn’t trust MS with full-disk encryption but I’d never even remotely consider trusting it with closed-source encryption of any kind. (And of course, it behooves us all to remember the seminal Thompson paper written decades ago, “Trusting Trust” — if the toolchain’s corrupted, all the source code analysis in the world ain’t gonna help you. I’m pretty sure they’re not gonna open-source Visual Studio.)

This article seems to me like a mouthpiece for Microsoft.

BTW, while you are quoting “this business is all about trust”, it’s not — the saying is “Trust, but verify.”

Without verification, trust will get you.

People don’t come here for ‘guesses’. People look to your “technical experts” to know what they’re talking about.

Why aren’t you having actual technical experts who understand the topics written about as guest writers? You clearly allow guest writers (like the Ukraine articles, which are clearly VERY unbiased).

How many times does it have to be suggested? Why is it so difficult? I doubt you’d have a hard time finding people with the actual required skillsets willing to sit down and churn out an article or two every few months. The only conclusion I’ve been able to come to is that you do not WANT actual expert advice on tech or crypto here. But then why put out articles like you do, when they’re based on guesswork without groundwork?

Coming here from just having read something on twitter:

“Asked about instances in which Microsoft built methods to bypass its security and about backdoors generally, a company spokesperson told me that Microsoft doesn’t consider complying with legitimate legal requests backdoors.” (From the article above)

If Microsoft did indeed answer something like this after a question about bypassing its own security, then obviously one might expect the whole encryption solution to be iffy at best; for example, if Microsoft keeps a copy of keys on your computer or elsewhere, or simply programs BitLocker to have special (backdoor) access to the encrypted content on your computer, or allows someone like the NSA to make use of compromising bugs in Windows/BitLocker, or even simply turns your computer into an espionage device for the NSA on demand, only to get to the data encrypted by BitLocker.

I know too little about the technical nature of the following, BUT I was unnerved when I heard that the beta version of the next Windows, the Windows 10 tech preview, had an official keylogger in it (this being public knowledge, actually). I’d hate to see that functionality be a core component of Windows 10, or be injected into one’s Windows installation just because Microsoft is asked to spy on you.

Since Microsoft plans to use Windows 10 not only for PCs but for other hardware (tablets, mobile phones), I would not be surprised if a system-wide keylogger is a basic hidden and maybe unofficial feature that can and will be used to spy on you, either for the governments of the world or to sell data about you to advertisers.

Afaik, Windows 10 comes in a consumer version and an enterprise version. The enterprise version is iirc said to keep enterprise data separate. How convenient, I’m thinking: even if you were to get an enterprise version, Microsoft could still spy on you without risking spying on corporate data — or, even better for them, getting to corporate data would be easier still, since it’s stored separately.

I will admit that what I write contains some FUD (fear, uncertainty, doubt), but I really, really don’t trust Microsoft. I just feel I have to use Microsoft because of how Windows is a gaming platform. :(

I’d like to see The Intercept ask Microsoft if that Windows 10 tech preview keylogger will still be present in the release of Windows 10 this summer, and I would ask how easy it would be for Microsoft to install such a keylogger on a user’s computer if it weren’t already installed with the upcoming release of Windows 10.

All recent versions of Apple’s OS X have a not-quite-secret keylogger installed by default. Whatever you type into many of the built-in applications automatically gets broadcast over the internet and archived on servers that the NSA and other spy agencies have front-door (and presumably also back-door) access to. By the way, Apple touts this security hole as a “feature” that allows users to seamlessly switch between their iPad, iPhone, and Apple computer. The good news is that users can use the command-line Terminal to change the settings and stop the keylogging, but most Apple users won’t know about this.
Even Steve Wozniak acknowledges that Apple has become rotten to the core.
The reality in 2015 is that outfits like Apple, Microsoft, Adobe, Facebook and even Google exist for one purpose only: to make money. That is true even if every single person in these organizations is smart, well-meaning and altruistic. Amoral behavior is an emergent property of any large organization in the modern world.
An additional reality in 2015 is that Apple, Microsoft, Adobe and Google are all riding entirely on vendor lock-in. That means they’re freed from having to serve the buying public, which results in a new business model: the buying public aren’t the customers, they’re the product. There’s a huge amount of money to be made in harvesting people’s private information and holding it for ransom, or selling — indirectly — that sensitive private information to third parties. Hence the aggressive push by all of the big companies to force customers into the “Cloud” (aka remote servers controlled by those companies).
Advertisers might pay in cash for targeted ads, but spy agencies have other, less direct, ways of paying for information. No big company wants to offend the government of a large economy, here, or overseas.

the last line is the real subtext for the article:
“Whatever you choose, if trusting a proprietary operating system not to be malicious doesn’t fit your threat model, maybe it’s time to switch to Linux.”

I can’t believe that in 2015 F/OSS advocates still desperately cling to the “many eyes” argument. How could all of those eyes have failed to find the Heartbleed vuln (backdoor?), introduced in 2011, until 2014? “Hey, I just finished searching all of my source for ‘NSA’ and ‘backdoor’ and it came back clean. Woo hoo — OSS is so secure!” There could be any number of very hard to find backdoors inserted into any OSS product. Having access to millions of lines of code without the ability to actually find anything is pure theater.

As for BestCrypt “not having a history of complying with government demands,” that’s because almost no one uses their product. But rest assured, dear reader, those developers (and all OSS developers) would all happily go to jail for you rather than compromise your data by complying with a government demand. Wake up, fanboys…

The other reason BestCrypt has no history of complying with government demands is that by “government” you mean the US government, and BestCrypt is Finnish. The NSA finds it more difficult to corrupt civil liberties in free countries. The best security products are based outside Five Eyes countries. That’s why many software companies are relocating out of the Five Eyes.

One reason having a third-party provider of *closed* crypto with only crypto products is theoretically better than having it baked into your operating system is that their ENTIRE business model is based on their reputation. Shockingly, this seems to be overlooked. Yes, I realise it’s easy to overlook given complicity in RSA etc, but RSA has always been questionable. As have the standards committees, heavily packed, and not judiciously, with NSA people and defense contractor employees.

Closed crypto is ALWAYS bad… but closed crypto owned by a company that’d tank if it were publicly exposed as complicit, especially an overseas company with good cryptographic engineers (note, I didn’t say cryptographers — implementation is incredibly easy to mess up)… They have way more to lose, and people’s trust models need to reflect this. Especially if they’re publicly owned, but that part is neither here nor there.

“One reason having a third party provider of *closed* crypto with only crypto products is theoretically better than having it baked into your operating system is their ENTIRE business model is based on their reputation. Shockingly, this seems to be overlooked. Yes, I realise it’s easy to overlook it given complicity in RSA etc, but RSA has always been questionable. As have the standards committees, heavily packed, and not judiciously, with NSA people and defense contractor employees.”

I have no idea what you are arguing. “3rd parties are better, but hang on a sec, those are questionable too.” So if we can’t trust the OS companies, 3rd parties, or the cryptographers creating the algorithms, then who can we trust?
And the argument that a company that only writes crypto is more trustworthy because it has a reputation to uphold? You think Microsoft doesn’t realize how many customers would abandon it if it were found to be selling its customers out?
It’s like this article — heavy on theory and speculation, but very light on detail…

I am saying that you shouldn’t trust any closed-source crypto, but in a world of lesser or greater evils, it’s much harder to take down a behemoth like MS and its ilk — which clearly get away with all this crap, as evidenced by the big reveals people have been ‘treated’ to without much revolt or fanfare — than a third-party software company which, while still “bad” for purposes of security and encryption, would likely completely collapse if it got caught out, because that’s its entire business model.

MS is a software (debatable ;)) company, but it’s also, and primarily for purposes of this argument, an OS company. People aren’t gonna walk away from their OS. They will, however, walk away from bad crypto companies, because that doesn’t require much more than the purchase of a different solution (or, godforbid, an open-source OS/solution) if they find out they’ve been betrayed. Whatever else MS is, it’s not a security company, and it never has been. Neither’s Apple. Or Google, for that matter.

All US software companies’ reputation for privacy is shot; not necessarily, as you imply, by their own actions or ethics, but by their ultimate obligation to NSA over their non-US customers. It’s happened. It’s history. The momentum is away from 5-eyes-corrupted companies to countries which take civil liberties seriously: Germany, Switzerland and others.
If MS and Apple were serious about client privacy they’d relocate to Germany or Switzerland.
Startups there will gradually eat them.

Why do you believe Switzerland and Germany are ‘safe’ from this? I assume we are both readers of this site and other sites that have been publishing all sorts of things about other countries also being complicit.

Conversely, a few (lol) “security” companies are moving out of GB/UK and TO the US to “avoid” GCHQ. Kind of out of the frying pan, into the fire there. Can’t make that stuff up. Wassenaar.

“In addition, BitLocker’s host operating system, Microsoft Windows, provides an algorithm for generating random numbers, including encryption keys, that is known to have been backdoored by government spies, and which the company’s own engineers flagged as potentially compromised nearly eight years ago”

Microsoft’s random number generator is a joke; I had to write my own random number generator because Microsoft’s is anything but random.

It takes a surprising amount of effort to create truly random (unpredictable) numbers, and it’s especially difficult to generate large amounts of random numbers in a secure manner. For a good intro to the topic, try this: call `rand(0,1)` in a loop on PHP/Windows, create a bitmap image of the result, and you’re in for an unpleasant surprise: the output shows obvious visible patterns instead of uniform noise.
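To make the “unpleasant surprise” concrete without PHP, here’s a rough Python sketch. The constants are the textbook ANSI-C-style LCG, an assumption standing in for whatever weak generator is at fault, not Windows’ actual implementation. The low bit of such a generator strictly alternates, which is exactly the kind of striping a bitmap of its output reveals, while a CSPRNG shows no such structure:

```python
import secrets

def lcg_low_bits(seed, n):
    """Lowest bit of a textbook ANSI-C-style LCG (a deliberately weak PRNG)."""
    state = seed
    bits = []
    for _ in range(n):
        state = (state * 1103515245 + 12345) % (2 ** 31)
        bits.append(state & 1)
    return bits

weak = lcg_low_bits(seed=42, n=64)
strong = [secrets.randbits(1) for _ in range(64)]  # OS cryptographic RNG

# The weak stream's low bit flips on every single step (period 2), so a
# bitmap built from it shows perfect stripes; the CSPRNG stream does not.
print(weak[:8])  # [1, 0, 1, 0, 1, 0, 1, 0]
print(all(weak[i] != weak[i + 1] for i in range(len(weak) - 1)))  # True
```

The same kind of visual test fails many library `rand()` implementations; serious code should always pull from the OS CSPRNG instead.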

Because writing your own crypto functions is such a good idea ;-)


I haven’t seen anyone talk about this: Can’t you use the quantum noise from a digital camera sensor as a source of random bits?
If you crank up the analog gain (mislabeled as “ISO”) high enough, the laws of physics would guarantee that the noise (the “grain” in digital photos) will consist mostly of random information, as fundamentally random as you can get in the real world.
If you know what you’re doing and know how to simulate/extract entropy from the datastream, you should be able to get gigabits per second of almost completely random 1s and 0s. Assuming, of course, no one has tampered with your camera and computer.
Of course, if you’re dealing with an adversary with state-level resources who may have developed a special interest in you, any modern consumer electronics are likely to betray you.
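If anyone wants to play with the idea, here’s a hedged Python sketch of the classic von Neumann extractor, one of the simplest ways to debias a noisy bit source. The “sensor” below is just a seeded, deliberately biased simulator, purely an assumption standing in for real camera noise:

```python
import random

def von_neumann_extract(bits):
    """Debias independent-but-biased bits: read them in pairs, emit the
    first bit of each (0,1) or (1,0) pair, discard (0,0) and (1,1)."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# Stand-in for raw sensor noise: 80% ones, seeded for reproducibility.
rng = random.Random(1234)
raw = [1 if rng.random() < 0.8 else 0 for _ in range(10_000)]

clean = von_neumann_extract(raw)
print(sum(raw) / len(raw))      # heavily biased input, around 0.80
print(sum(clean) / len(clean))  # debiased output, around 0.50
```

The cost is throughput: each output bit consumes several raw bits, and the extractor assumes the raw bits are independent; real sensor noise would also need hashing or whitening on top of this before being fed to anything cryptographic.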

Aside from the technical debate – what it looks like is Microsoft wants to give me a new version of Windows which wipes out previous versions/features and it’s free because Microsoft cares about me and Microsoft doesn’t care if they make any money on it? Right…..

This is madness, pure madness!! This whole thing of the internet. It was supposed to make our lives better in so many ways, but it’s come to a point where it’s being used to get into your head in every way possible, 24/7. The very thing that is supposed to help is now being used against us and against each other, and when will all this end? It’s infuriating to hear day after day of all manner of spying, hacking, and improprieties. We are supposedly seeking terrorists, but no one is asking why they came into existence in the first place, and to think the US is innocent is just another form of madness. As long as we have nations like the US imposing their will on others, things will only get worse. Mayhem and chaos are what the US does best, but some day it will pay a heavy price for it.

Toby has an elephant inside? OMG…is it PINK?

Your system is on fire. Unplug it and immerse it in the bathtub for an hour. That should douse the flames, and then you can junk your system. Repeat for all the laptops and computers, and then go find Lousy Cipher to help you out.

You’d think fun wouldn’t bum these crabby apples out so much.

Who is Lousy Cipher and why don’t I give a shite? The whole world is on fire. Ask Buddha. Karma is a bitch.

You’re mixing your Asian religions again, dabbadoo.

Sounds like MS is preventing folks from enjoying security with their product, much as it steered folks away from other web browsers. Isn’t anyone gonna sue?

When Toby dies, I’m done until encryption is the standard for all digitized data. Like I want to PAY to get played. Find a way to tell your keepers you got a better business plan, or you got no brand but evil, Net. Stop selling our souls to Prism!!

Google warned me Microsoft was ripping off my internet sessions a few days ago. How DARE they move in on admobster territory? Someone should tell Lefty.

I was in court doing my duty, and now they fly the attendees names by like a flight schedule. Sorry I can’t help but skim a long list of names for button bings. One of my things.

Peter Licavoli III? Glad I didn’t miss that one! Now showing in courtroom 808. A REAL mobster’s grandson.

Reporters and business people may have special needs because they often travel a great deal and cross borders, all of which are opportunities for theft and loss, which is what system disk encryption seems to be about. There probably IS a great deal of metadata lurking in the Windows Registry and user directories. Linux gets around this by mounting user data on a separate partition.

We seem to accept that TrueCrypt is dead, but why? Why can’t we put pressure on them to open up the license or place it in the public domain? What reason could they have for not doing so? Is there a financial reason? Could they be bought out? First they kill off TrueCrypt, then stampede everybody into “corporate” solutions like BitLocker? This smells fishy.

There is a difference between Secure and Secret. I want my system to be Secure. I want my Secrets to stay Secret. Use reasonable encryption for security. Use higher encryption for secrets. I lock the door on my apartment, but I keep important papers in a safe deposit box at the bank.

Some other things:
1. AES-256 seems to be the current standard, but it is getting very old. How about upping the key length? AES-1024? Or AES-4096?

2. How about creating an international crypto industry by creating an open-source crypto library for C and C++, maintained internationally as part of defense budgets? Then multiple consumer products could be built on top of the crypto library. I’m playing with some crypto libraries now, but they seem a little bit rinky-dink, not built to industrial grade.

3. Our government just got “backdoored” allegedly by Chinese hackers. (Oh the sweet, sweet irony)
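On point 1 above: AES is only specified for 128-, 192- and 256-bit keys, so “AES-1024” isn’t a thing, and it wouldn’t buy anything anyway. A quick back-of-envelope in Python (the guess rate is an arbitrary assumption for illustration, not any real adversary’s capability):

```python
# Assume an absurdly powerful adversary testing 10**18 keys per second.
keyspace = 2 ** 256
guesses_per_second = 10 ** 18
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = keyspace // (guesses_per_second * seconds_per_year)
print(f"roughly 10^{len(str(years_to_exhaust)) - 1} years")  # roughly 10^51 years
```

Brute force is already hopeless at 128 bits; real attacks go after the implementation, the key storage, or the user, never the key length.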

One of the first things I learned about encryption at math camp in high school is that if you know the message that was encrypted, breaking the encryption of that message becomes trivial.

The problem with full disk encryption is that the system files on a computer are always the same. Some system files might not get updated for years. The NSA already knows the message it has to decrypt: The Windows system files and folders.

Full disk encryption is a good start, but make sure you encrypt your important files in a separate encrypted container and that the virtual memory of your computer is also encrypted. Virtual memory encryption is turned off by default. If the full disk encryption on your Microsoft Windows laptop is breached, the encryption keys in your computer’s memory that you used to create an encrypted container might’ve been dumped to the disk unencrypted.

I hope this helps.

Full disk encryption is mostly useless and may give a dangerous false sense of security. This is important enough probably to write a whole article about.

The crypto attack you’re referring to is called the known-plaintext attack, and modern encryption that people rely on, such as AES in BitLocker, isn’t vulnerable to it.
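To illustrate what the worry would look like if it applied, here’s a toy Python example. The repeating-key XOR “cipher” is deliberately weak and NOT real cryptography, and the header bytes are just a stand-in for a predictable Windows system file. With XOR, one known plaintext/ciphertext pair hands the attacker the key instantly; modern ciphers like AES are specifically designed so that this recovery is computationally infeasible:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR. Deliberately weak; for illustration only."""
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

key = b"secret!!"
# A predictable plaintext, e.g. the first bytes of a well-known system file.
plaintext = b"MZ\x90\x00\x03\x00\x00\x00"
ciphertext = xor_cipher(plaintext, key)

# Known-plaintext attack: XOR the ciphertext with the known plaintext
# and the keystream (here, the entire key) falls out immediately.
recovered = bytes(c ^ p for c, p in zip(ciphertext, plaintext))
print(recovered)  # b'secret!!'
```

That instant key recovery is exactly the failure mode AES and its peers were built to rule out, which is why knowing the Windows system files doesn’t help against BitLocker’s encryption itself.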

If you want to be NSA-proof, don’t use a computer or don’t use the internet. Even the perfect full disk encryption solution will leave you vulnerable to countless other NSA-attacks extensively reported on this site (man-in-the-middle attack, 0-day trojan, router hacking, physical installation of spy hardware, …). Simply put: if the NSA targets you, they will get what they want eventually.

The average security-minded computer user — like myself — wants to protect himself against data theft and dragnet surveillance by government agencies. A good VPN service and any decent FDE package will protect you from that. But sure, if you’re afraid that some government agency will raid your home and do whatever it takes to decrypt your hard drive, you may want to look further than BitLocker, yes.

Thank you for a great article!

“Based on what I know about BitLocker, I think it’s perfectly fine for average Windows users to rely on,…”

And here’s the catch !

What is your definition of the difference between an “average Windows user” and a user who wants SECURE system encryption???

Is an average user someone who is OK with flawed/broken security? Because he is not as “paranoid” as a user who wants/needs a secure/unbroken system?

There is no difference… a system is secure, or it is not! There is no “for this kind it’s ok, for the other ones it’s not”… that’s complete BS!

And explanations from Microsoft (or any other American company like Apple, Cisco, etc.) are as good as any explanation from the corrupt government you have over there… they are all liars!

Well, this writer may say:
“If you’re an average citizen of the USofA, then it’s ok to believe them… but if you are really interested in security and freedom of expression (like Snowden, Assange, etc.) then you may have some doubts about what they tell you!” ;-)

Glenn, give this guy a kick… it seems he’d rather work for Microsoft or the government than for the truth movement ;-)

Glenn doesn’t seem to have much say in the editorial decisions (or he doesn’t want it — which isn’t exactly something I’d find fault with if it were true).

I have nothing against Micah personally, but I find his ‘advice’ dangerous, as do most people who know anything at all about crypto. His non-advice articles don’t seem too offensive, but it’s probably a good idea to get him away from giving advice about things he doesn’t know anything about. Which is a kind way of saying, he shouldn’t be writing these articles, ever, at all, unless he learns a LOT more about what he’s writing about.

To his (mild) credit, he doesn’t say he DOES know, which is one reason I am putting the fault of this on TI’s shoulders (see the comment I just posted about external writers, something I’ve suggested before (though I don’t think those comments ever, strangely, made it through the editorial process, even though I’ve made them on several articles — c’est la vie, c’est la vie)).

Irresponsible assignment and editing. Too much credulousness on Micah’s part, but really, it’s hard to fault this (other than suggesting he just refuse to write articles like this at all, which may be the alternate suggestion, if anyone cared) — it’s NOT his topic of expertise.

(I should clarify, when I say he has nothing to do with it, I have nothing to do with Glenn or TI other than as a commenter in a whole different part of the world — I’m merely repeating what has been said multiple times by him and his friends on this publication and in the commenting section. If I am overstepping by having said as much, or I am wrong, I’d appreciate being corrected.)

I believe I recently read that notoriously insecure options (like Dual_EC_DRBG, that deterministic random bit generator) must be ‘exterminated’, so that they can’t even exist as an option and contribute to a ‘catastrophic failure’ in the use or security of crypto systems/solutions.

Also, I believe I read that there was a claim, or maybe a suspicion, that Microsoft’s BitLocker keeps a copy of the key stored on the same computer, for use by law enforcement — or, with plausible deniability, for ensuring that a Windows user doesn’t lose his key and all the encrypted data with it. Anyway, this seems to be a valid concern in addition to backdoors, because this type of issue probably falls *outside* the common language around ‘backdooring’. A kind of plausible-deniability version of backdooring, I imagine.

I think The Intercept should next time ask for an official statement from Microsoft’s top management on whether the keys for unlocking BitLocker are stored on the computer or elsewhere, so that this isn’t a plausible-deniability version of backdooring BitLocker. (The wording of this question can probably be improved.)

Good point about Bitlocker.

Micah or Ryan-

As a representative of the ignorant majority who should have stopped reading at the “fairly technical” declaration in this article, I have a request.

Most of us don’t have access to sensitive information any government would want, but we may still want to protect our rights on principle, since the democratic process doesn’t seem capable of forcing the government to respect those rights… it would seem that the “switch to Linux” conclusion should be heeded.

So, is there a recommendation for an easily understandable how-to guide for doing that?
Something your average grandmother can handle?
Including the process of encryption and “fixes” for the bugs discussed in the comments (which your average grandmother would not understand)?

If the answer to the yes or no question is a no, I would suggest starting off every article on this topic with that info.
But if giving the spooks the finger by growing the haystack can be easily done, there are probably a lot of folks who would do it.

It’s quite straightforward. Ubuntu is arguably the best distribution for those new to Linux. Let’s say that you have a computer that you want to install Ubuntu on. I don’t recommend that new users attempt dual boot with Windows or OSX. If it’s an older computer with slow CPU and limited RAM, I recommend Lubuntu, a variant that uses the lightweight LXDE desktop. For modern computers, you’ll generally want to use the 64-bit images.

You can find the installer images on the Ubuntu and Lubuntu download pages. After downloading the installer image, burn it to a DVD. Then turn on the target computer, and put the DVD in the optical drive. Reboot and follow instructions. You can accept all defaults. Using LUKS for full-disk encryption is an option. Instead, you can just encrypt your home directory. But do not do both, because it may cause problems later.

That’s it.

The Ubuntu project has repeatedly made gestures toward commercialisation. I’m not knocking them, but (unless you’re on a touchscreen — a lot of people also despise the UI they adopted a few years ago) I’d suggest Mint over Ubuntu if you want a *buntu-based OS.

Honestly, almost any mainstream distro with GNOME — or especially KDE — won’t take much time for a Windows user to get accustomed to. OpenSUSE is actually pretty user-friendly too, as is (probably much more so) Mageia — I’d go so far as to say a Windows user can switch to Mageia and find themselves comfortable in under a day. And both have easy-to-use, on-install full-disk (other than /boot, but that’s another issue) OS and data encryption as part of the install process. Most people (90+%, and that includes people who work for the government; for non-cleared people, higher) really don’t have to worry about evil-maid attacks. If we’re talking lowest common denominator, this is probably it. That, and adding a BIOS-level password.

I suggested Ubuntu or Lubuntu because that’s what I know well, besides Debian. The privacy issues with Ubuntu — including online sources in Dash searches — only apply if you’re using the Unity desktop. If you otherwise like the Unity desktop, you can install the Unity tweak tool and exclude remote sources. I mostly use Debian for my VirtualBox hosts and VMs. But Debian stable is fairly conservative, and may include recent versions of stuff that you want. If I need Ubuntu in VMs, I tend to use the server release, and then manually install lxde-core with the “–no-install-recommends” flag. Then I just add whatever apps I need.

Make that “… may not include recent versions of stuff that you want”.

Fair enough, but however locked down (or not) something is on install is generally the MOST secure you can expect a nontechnical person’s OS to ever be. Expecting people to go that much further, when they’ve already gone out of their way to do something non-Windows, is doomed to failure.

(To clarify: “BIOS-level” implies the level, not merely ‘BIOS’, and includes any of its replacements.)

@Mr Idiots:
Keep only the OS on your hard disk, and maybe a few files that you post publicly anyway. Store all other files, photos, etc. in encrypted hidden folders (cryptkeeper) on a removable disk that you detach when not in use. And don’t cross any borders with the removable disks unless you are absolutely certain what to do in case they are pulled out.
Ubuntu is good, as it is the most widely supported. After the first install, log in and install GNOME and Synaptic, then on the next login change the UI by choosing one of the GNOME session options, and you are back to the familiar UI. Then get into System Settings > Security & Privacy and clear all the check-boxes in the Search and Diagnostics tabs. Install BleachBit and use it regularly. And finally, the most important: download and install Tor. Always surf using Tor. Some websites ban Tor nodes, in which case you click on the onion icon and repeatedly use the “New Tor circuit for this site” option till you get through — that is, if you are too desperate to read a Lousie paper that has Cypher content.

So you realise I am more than one Idiot? Touche. ;)

Just a quick comment — believe me when I say, I do NOT believe that is ‘best practice’. I actually have a decent idea of what ‘best practice’ is but given what I’ve read of peoples’ reticence to do anything but let an installer run in the background, I was going for simplicity first — get them on a better operating system first, then MAYBE we can convince people to do something more proper with their data and take more precautions.

The journey of a thousand miles begins with a single step, as it goes.

Well shit, I couldn’t figure out why my comments weren’t posting here and elsewhere — I forgot I was more than one idiot.

Just a quick comment that I don’t by any means think my suggestion is ‘best practice’ — I actually know what best practice is, but I’ve come to realise that, at least at first, the most you can expect people to do is start an installation up in the background while they’re watching Netflix (or what have you) or playing a video game in another room. That’s who my suggestion is really for. After a bit of acclimation, one can then push for better security. There’s a whole lot more that needs to be done than merely running an operating system with stronger, open source crypto.

As the saying goes, the journey of a thousand miles begins with a single step — and so it should.

BTW, not sure why you’re so strongly advocating Ubuntu. It runs a ton of stuff out of the box and is way too connecty. Better to get people on something user-friendly but less service-intensive out of the box. (And that’s not even going into the fact that there have been a lot more escalations/0-days in Ubuntu’s non-kernel-land but privileged subsystems than on most other distros.) I’m not going to bash Ubuntu, but in general, if you’re trying to get people away from something ‘bad’ to something ‘better’, it seems best to first figure out what’s easiest for them to ‘get’ — and there’s nothing at ALL intuitive, especially for a Windows user, about Ubuntu’s default UI. Advocating they ‘install GNOME later’ is something a geek would do, know to do, or know how to do — not most people’s parents (or even most people, period).

Better, if you want an Ubuntu-based system, to go with something like Mint (as I said previously) and pick a user environment they’d find comfortable right off the bat; the default that comes with (non-spin) Mint is generally a LOT more familiar to most people who’ve used Windows.

Ubuntu was pretty bad for a while after they introduced Unity, but now after a few updates once you customize it it’s pretty decent. And once you get rid of apport and tweak the privacy settings, it should be good. Yes, it does require a fair bit of tweaking, like adding and configuring firewalld, rkhunter, clamav, etc., but on the whole someone could figure out how to do it.

It’s for Ubuntu’s support base that I would recommend it for folks here, as they would continuously need help in the initial stages. I did try out the other distros but ditched them for one reason or another. Fedora I ditched the day they started asking money for MS codecs, and I never tried to find out if they still do. Mint was okay, but did not allow me to customize much.

Also, I am addicted to Compiz with rotating workspaces… But yes, I see Microsoft as a company getting involved with Ubuntu’s Canonical, and that’s not good. Mark Shuttleworth seems to be growing too rich for someone peddling free software and, like Zuckerberg, may turn out to have been receiving the patronage of GCHQ or the NSA. Maybe someday I will migrate to Mint, should Microsoft take over the development of Ubuntu. I have Mint as a virtual drive and use it sometimes in preparation for an eventual migration, but let’s see.

Also, I am saving these initial versions of the Tor Browser files for later use. Right now they appear pristine, but who knows how long they will stay that way. Then I will dig out the archives. I suspect a lot of the security patches are more for the benefit of the NSA than anybody else on the planet.

You seem like someone willing to do a bit of work for your OS. I’d highly suggest you try Fedora and use Rawhide — Fedora is *way* faster at updates and patches, especially at pushing kernel-level fixes, and (imho) more secure on the whole. Korora, based on Fedora, is pretty clean and user-friendly and has good spins. It would be better, though, if everyone weren’t going to the interminably terrible, horrible, very bad systemd model (which is pretty much taking over, to the detriment of all). And (highly, highly recommended) if you’re not already, I’d suggest you look into grsec. There are some decent guides out there. Or just do hardened Gentoo, which incorporates it a bit more seamlessly (though the recompiles can be a bitch).

I see the recommendations for TrueCrypt here, yet their SourceForge page clearly states that it is not secure. I’m looking for options, but not sure whom to trust. Any help is appreciated.

VeraCrypt, a fork of TrueCrypt.

Safest is to trust no one, including your own ability to stay anonymous or to keep secrets. All software gets compromised sooner or later, either because of some inherent fault, or because an update is pushed through in the guise of a security patch but is actually designed to make your system transparent. Your hard disk constantly interacts with the bad world outside, so however you try, it will be difficult to keep it out of someone’s reach electronically. On the other hand, if you keep your stuff on removable disks that you hook up for short durations, preferably offline, and encrypt those removable disks, then you are a lot safer, though someone can still get to them physically. And if physical access to you and your stuff is possible, then anything is possible, including some enhanced interrogation techniques that even the most hardened terrorists cannot cope with.

In short, physical isolation is your best bet till someone finds out how to jump the air-gap. Perhaps some day we will be able to harness the power of quantum mechanics and devise data systems that destroy themselves when observed by any unauthorized person!

For all intents and purposes, 7.1a is plenty secure — and certainly the best (and only) open-source, fully audited solution for Windows users that exists.

General Hercules

This website keeps trying to extract HTML5 canvas data from the browser. Could you please disable this mischief?

I have no idea what’s causing it but I noticed it too. I’ll try to figure out what it is and stop it.

What is this, guys? What are the implications?

When you visit The Intercept in Tor Browser it shows this warning:

Of course, HTML5 canvases are used for plenty of things other than uniquely tracking people. Since I know exactly what types of tracking we have on this website, I know that it’s not being used for tracking. However, I still don’t like that it pops up a sketchy warning for Tor users, so I’ll figure out what’s causing it and get rid of it.


Apparently Tor shows that warning because canvas data is used to track people. That ProPublica article explains things pretty well for the layperson.
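As background on the mechanism: canvas fingerprinting scripts draw hidden text or graphics, read back the rendered pixels, and hash them. Because rendering varies subtly across GPU, driver, and font stacks, the hash doubles as a quasi-stable browser ID. A toy sketch of just the hashing step (the pixel bytes below are made up; real trackers read them out via JavaScript’s toDataURL()):

```python
import hashlib

def canvas_fingerprint(pixel_bytes):
    """Hash rendered canvas pixel data into a short identifier."""
    return hashlib.sha256(pixel_bytes).hexdigest()[:16]

# Hypothetical pixel dumps from two different GPU/font stacks:
pixels_machine_a = b"rendered-pixels-variant-a"
pixels_machine_b = b"rendered-pixels-variant-b"

id_a = canvas_fingerprint(pixels_machine_a)
id_b = canvas_fingerprint(pixels_machine_b)

# The same machine re-renders identically (stable ID), while
# different machines diverge (distinguishable IDs):
assert id_a == canvas_fingerprint(pixels_machine_a)
assert id_a != id_b
```

That stability-plus-divergence property is the whole trick: no cookie is stored, yet the site can recognize a returning browser.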

I didn’t think you were trying to track people with it (although there was that unfortunate fling with Google in the beginning), but since you brought it up :) could you explain exactly what types of tracking you are using on this website?

It’s outlined in our privacy policy here but the short answer is we host our own Piwik server, which puts a js file in each page and logs information about visitors (IP, geolocation, browser, OS, etc.). If you have the DoNotTrack header set, we don’t track you at all. And actually, I checked, Piwik doesn’t seem to do any HTML5 canvas tracking, so it must be something else. (We may change how we do analytics in the future, but this is how it works now.)
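For readers unfamiliar with how DNT works mechanically: honoring it amounts to checking a single request header before logging anything. A minimal, hypothetical sketch (not Piwik’s actual code):

```python
def should_track(headers):
    """Honor Do Not Track: skip analytics when the DNT header is '1'."""
    return headers.get("DNT") != "1"

assert should_track({}) is True             # no preference stated: tracked
assert should_track({"DNT": "1"}) is False  # opted out: not tracked
```

Note this is purely voluntary on the server’s part, which is exactly the objection raised further down the thread.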

Your site hosts some sort of script-requiring content on AWS. It also asks for twitter to allow js. Putting aside the bad idea of using AWS in and of itself, are you sure it’s not originating from one of those? I don’t have time to look.

I am pretty certain that someone is tracking all visitors to your site, either while it transits through Amazon servers or by one of the third parties. Which is why 1) I always use Tor browser, and 2) I supply a fictitious email that I never even try to log into to see if they exist. If you are not using HTML5 canvas, then it worries me – Who is? While Tor screens out HTML5 data requests, the other browsers don’t, so it’s a matter of concern for every visitor to your site.


Instead of making tracking opt-out, which penalizes not being a privacy geek, why not make tracking opt-in? Why should normal people have to recite some obscure incantation just for a little privacy? And let’s be honest, ‘do not track’ is nothing but “pretty please.” Just one more data element that can be used to fingerprint us. It is actually insulting when you think about it. The forced verbalization of weakness and submission in order to avoid even worse punishment is an old-time bully tactic.

I assume that you aren’t tracking people who submit things through securedrop right? So, the deal is either we leak information to you or you will track us–or does everyone get tracked–leaks or not?

The difference between how you treat leakers and how you treat readers should give great pause to potential whistle blowers.

I would much rather leak to a place that values privacy itself, rather than a place that only sees privacy as an enticement to leak. Something to be traded for information.

It seems one way or another–only the powerful will have privacy.


Since I doubt that they are redirecting their webserver’s logs to /dev/null (grossly oversimplifying), I think it’s safe to say it’s all tracked anyway on their end, just a matter of degree. Guessing you know this, but maybe not all visitors here do. Anyway it’s probably safe to assume all traffic to/from TI is monitored externally to the actual server. My guess was AWS as well.

@Useful Idiots

Yeah, “do not track” is a term of art that, contrary to the plain meaning of those words, somehow excludes webserver logs and all the derivative data extrapolated from that. That is the art part. I think the logic is–we were doing this before you started complaining, so it must be ok.

And I have no idea why they are using Amazon, or why the comment section wants to pull crap from wherever it does. I get the feeling that TI doesn’t either. I would love to see websites ranked by the number of third parties surreptitiously allowed to spray executable code and web bugs – with bonus points for those sites that are paying for the privilege.

This free TI service model with no advertisement to collect money isn’t exactly the kind of philanthropy that I would associate with a businessman like Pierre Omidyar. An occasional Ukraine article or a couple of bogus ones on night-vision goggles would not be the ultimate objective of this expensive adventure. Neither would advocacy of blacks and Muslims fetch any long-term monetary benefits, given the kind of adverse comments that show up.

If so, what is the catch?

@General Hercules and lastnamechosen:

I don’t think it’s per se malicious so much as convenient. There’s a certain trade-off that comes with merely reading this site: it probably gets you on *a* list, just like WL visitors were (and probably still are). That list is probably not consulted much except in association with other lists, if you go by selector-like criteria. Then there’s probably a separate list for people who post/reply, divided into lists for supporting or protesting various factions.

To assume that TI is some sort of attempt to do something malicious in and of itself is (imho) over the top. But that doesn’t change the fact that any site that discusses topics that TI discusses can be *used* as a watering hole to track interest and people (external to TI’s servers, and unrelated to TI even being aware of it). One needn’t use third party sites when you own the pipe, and one can (given capabilities) easily sideswipe the connection in the first place, depending on the geographical location(s) of the visitor(s).

All this said, I think it’s safe to assume that not everyone knows this stuff. But I’m not sure it shouldn’t be (more or less) common sense to know it’s a decision to possibly wind up with your IP on some list (or not) by posting here and stating any beliefs or convictions with any degree of tenacity. So there’s really a few things at play, here. I personally would be much happier if there were no webserver logs on the server, but I recognise that’s probably wishful thinking (and it makes it very difficult to troubleshoot the server or locate security violations, on a server that probably has several big fat targets on itself already).

Third party sites though, with no ability to see those logs or that information, really don’t (to me) feel like a safe bet for TI. I don’t recall them being used up until fairly recently and when I noticed it I was surprised. The twitter one bugs me more than AWS in a way because that’s sort of like saying ‘okay, twitter, you can track people who visit my site’, which brings up the government vs corporate spying stuff, and third party legal procedure (what there is of it) also comes into play as well. IMHO the only way to minimise risk is to minimise external reliances and lower one’s attack footprint. With external reliances, there is no transparency for the user or the site, and there’s no real way to ascertain that what you’re seeing is what everyone else is seeing — it’s easy to inject code anywhere along the line, and SSL isn’t really a protection against that.

Do I think that this concern applies to the vast majority of visitors? Generally, no (though it might, for all I know). I’m more concerned about people who comment, and people who might have already been seen as ‘interesting’ by governments due to their political opinions or leanings, which surely includes at least a slice of the pie as far as visitors here are concerned. That’s exponentially more true if you’ve had your browser fingerprinted by some third party and later decide to leak, I’d think, but that’s just a guess.

@Useful Idiots & General Hercules

I certainly don’t think there is some conspiracy at The Intercept. All of this crap is just, at this point, standard corporate IT bullshit. For some reason a very large number of people in the computer industry think that it is their job to track everyone’s every move. It just seems to come naturally to some people, as if computers were invented for this reason.

Even at the Intercept, absent commercial and advertising interests, this shit keeps coming back. Look at the stark dichotomy between securedrop and the TI website. They both can’t be true at the same time. The idea that you can tack privacy for whistle blowers onto geolocation and analytics is an exercise in marketing and self deception. Celebrity spokes-cadavers for privacy and javascript from twitter. It’s a fucking party. Securedrop is pushed as a product, not a philosophy. There is no passion.

The internet is so balls deep in analytics and trading hidden bathroom cam video with third parties that it seems fundamentally incapable of reporting on privacy and spying.

@Mr Idiots:
I agree it’s nothing malicious, just a convenience that someone sold them. Probably NSA uses Amazon for its own purposes? I see the same thing happening at the site as well. If so, NSA is just wasting time and resources.

BTW the only reason javascript from twitter is getting loaded on this page is because this specific article embeds a tweet. Other articles that don’t embed tweets don’t load any content from twitter. Likewise, if an article embeds a youtube video, it loads content from youtube.

The HTML5 canvas data culprit is WordPress 4.2 trying to support emoji. It’s not a privacy issue at all, but I’m working on getting it removed anyway.

@Micah, I’d suggest you do the more privacy-conscious thing and do a screenshot (including timestamps, hashes if need be) of tweets. That also makes it a bit easier to stop back-pedaling (though admittedly I am not very well-versed on the Twitter API and in fact I avoid Twitter as much as possible — it’s one reason I was so quick to point out the twitter issue — I assume that by embedding tweets, it will remove any embedding if a tweet gets deleted).

I’ve never understood the WP thing. There’ve been wayyyy too many WP exploits. I noticed it in your source, not a site fingerprint, some time ago. I’m actually kind of surprised that hasn’t been a problem (yet?).

Sorry, I’ve been leaving a lot of scent around the trash cans. Wild Weasel out to get SAM to show us his cards.

Why do you sit on trash cans before discarding them? No wonder they pick up your crappy stink.

Louise Cypher loves your comments, so henceforth please inundate her with lots of comments.

This is so much bullshit… it will/may protect you from ordinary crackers, but the fact that MS will not comment on lawful requests to decrypt hard drives says it all: there is a back door, and we’ve known that since encryption was treated as a weapon…. “I asked Microsoft if the company would be able to comply with unlocking a BitLocker disk, given a legitimate legal request to do so. The spokesperson told me they could not answer that question.”

General Hercules

Remove the hard disk altogether and use a TAILS USB to boot up. Store all personal files in multi-layered encrypted folders on removable disks, and along with them add some really malicious viruses and rootkits with “very-suspicious-names.doc” such that if any other person tries to read the files they would wreck the removable disk and simultaneously their own system.

General Hercules

In order to boot up with a TAILS USB drive, do you have to enable the legacy boot option on newer machines?

Here is what I have learned about BitLocker. When you encrypt a drive the system will reboot first to test. If you have the Ethernet cable unplugged, that test will fail. This makes me think the password is being sent to Microsoft. What I do is unplug the Ethernet cable, disable the test, and encrypt the drive. When it is finished I shut the system down, and let my RAM go dark before I reboot.

This is a great point – and it is probably not the password being sent to Microsoft but the encryption key – for the U.S. government (and their own use…other govts). This is why Bitlocker is not a viable option for privacy / security.

Talk about a honey pot though, the bad guys would definitely want this database of keys.

Micah, could you respond to Marcus’ assertion? Is it true? If it is, what _exactly_ is being sent over the wire as part of the test?

For those keeping score, here’s how a report of sending unwanted data played out with Chromium browser (on which Google Chrome is based):!msg/chromium-discuss/V0x5LyOrL_0/1MgA8vIfuooJ

* by the 4th response, Alexey posts the offending code that sends requests to Google. Since that code can be removed and the entire program re-compiled, anyone in the world with a computer and internet connection can test the problem and resolution for themselves.
* Google calls the code a “regression”. You can judge the veracity of their “regression” claim for yourself. But the point is, there was a clear conflict between financial incentives (mining data from the user) and user privacy (not doing this). User privacy won, at least in this case. (Chromium is a huge code base and there could be other problems.)
* in 3 days the issue had been fixed. If it had taken 3 months (or even 3 weeks), you can bet someone would have threatened to fork Chromium. Successful forks (like Joomla) have occurred over less pressing issues.
* because we can see the code, the community itself controlled the timing of the report, confirmation, analysis, and even suggested resolution of the bug. For that reason, the timing of the report was as fast as humanly possible. (Again, look at the time stamps on the posts.)

Very curious how this accusation of the proprietary software Bitlocker sending data to Microsoft will compare to that one.

Micah, could you respond to Marcus’ assertion? Is it true? If it is, what _exactly_ is being sent over the wire as part of the test?

I really don’t think Marcus’ assertion is true (especially since BitLocker doesn’t use encryption passwords, but rather stores the key in the TPM), and he didn’t provide any evidence other than just posting a comment. But it should be easy enough to test, and if it were true it would be easy to collect damning evidence. Do you have a link to a technical write-up about this?

If it’s “easy enough to test”, then please tell me– is there anything going over the wire when you install Bitlocker and sniff with Wireshark? You’ve suggested Bitlocker in two different articles, so I assume you have a machine on which to test the claim. (I don’t have a Windows machine.)

I know that sounds like I’m being a snoot. But with the Chromium example I cited, “easy enough to test” meant that an unverified claim by a frustrated user was tracked to specific lines of offending code in a few hours… by that _same_ frustrated user! That was step #1. Chromium’s quick response was step #2. In this case we can’t even arrive at the end of step #1. Without that, BitLocker isn’t even in the same ballpark as the FLOSS software mentioned in your article.

I do think Bitlocker is a good idea in the same sense that opportunistic encryption is: both are better than not doing anything at all. But don’t tell the user a single word about security, privacy, or anything else. Just suggest it as a default part of a _minimally_ functioning system. Is that not the necessary conclusion of the Schneier quote in your article?

The only solution for personal files is physical isolation using a Chinese-made, removable disk with hidden, encrypted folders and files. Individually, none of us have the resources or time to keep up with the cat and mouse game.

The other day I noticed my keys had slowed down to a crawl. Obviously, a key logger was active despite all the precautions I take and the root-kit checks I regularly perform. So I just re-formatted the entire disk, tweaked the partition sizes and re-installed the operating system. Now the keys are running like Usain Bolt.

Publius (C5315762-7D4C-42C9-9649-8147EFF194C0)

I suggest that people form their own opinions, guided by articles like this and their own research.

A coupla points:

1) This article talks about Microsoft collaborating with the government. It seems that all companies and individuals are subject to some sort of government / national state control (including the legal power to force them to lie and keep silent). So your choice, arguably, is which government do you want to maybe have influence on your software? US, UK, Russia, China, Australia, India, Zimbabwe, Bhutan… Recent UN suggestions may change that, eventually.

2) Open source is touted as a great benefit. POODLE, Heartbleed and the rest suggest that this can be a meaningless protection. Arguably declaring the encryption in use helps the bad guys.

Great article, thanks. (Though it has its biases, it is a valuable part of the debate.)

Security through obscurity does not work, it is trivial to find out what encryption is in use.

Furthermore, vulnerabilities like Heartbleed affect commercial vendors too; they often just use the FOSS stuff in their own stack. There is zero benefit to going commercial security-wise; it is impossible to trust anything whose source code you can’t inspect. Corporate and state interests cannot be trusted at their word.

You did some nice research here. Yet, the ending is disappointing. Commenter “Anon” at 12:56pm sums up good reasons not to trust anything Microsoft builds for crypto with a long list of their failures and misdeeds. That should be the end of the whole discussion of whatever they built. From there, you have an alternative that’s endorsed by a well-known cryptographer. That’s the best recommendation for Windows crypto. Then you finished with the best recommendation of all: get off Windows.

Problem is that you put way too much emphasis on the positives of Bitlocker toward the end. It reads like an advertisement. Here we have a company constantly deceiving us and covertly working with NSA to compromise their customers. They share source code with Russia and China with likely the same effect. There’s at least one good alternative to their product without such a history. Yet, you spend so little time on the alternative and so much time pushing the sneaky offer. Makes little sense.

It also hurts The Intercept’s credibility in INFOSEC. Leakers are expected to trust you to expose misdeeds of companies and governments while your organization tells them that trusting such corrupt groups’ offerings is perfectly fine for the average person. Anyone believing in the “vote with your wallet” concept should know that using them instead of better alternatives is financially aiding the types of people your site writes exposés about. Not “fine” at all. Avoid BitLocker and use BestCrypt if possible.

Nick P
Security Engineer/Researcher
(High assurance focus)

Ha, I definitely didn’t intend for this to be like an advertisement for BitLocker.

There are no perfect options for Windows disk encryption. Given the list of options (BitLocker, TrueCrypt and its derivatives, BestCrypt, DiskCryptor, and Symantec Endpoint Encryption), and given what I’ve learned, it seems like BitLocker is a decent option. BestCrypt may be better, but it also doesn’t come with Windows, costs $100, and it’s just a maybe. I haven’t heard any glowing reviews of Symantec. And TrueCrypt, its derivatives, and DiskCryptor all have compatibility issues with UEFI and secure boot.

And Truecrypt, its derivatives, and DiskCryptor all have compatibility issues with UEFI and secure boot.

… and that’s why we need open hardware.

Or you need to make the necessary sacrifices to not depend on such vendors. There are crowds that have been using RISC alternatives for quite some time: PPC Linux, Amiga, Solaris on SPARC, Linux/BSD on many, and so on. Much work in porting useful stuff (e.g. browsers) to such systems. Yet, if you do, you can get more hardware assurance by (a) diversity, (b) using ISA’s like SPARC with open specs (standard) or hardware (see Aeroflex Gaisler), and/or (c) using ISA’s like RISC-V with 1.4GHz 48nm SOI implementations with source available. People just need to roll up their sleeves and start a massive porting effort. Also, app developers need to keep making things very portable.

Note that China already took steps in this direction with their Loongson MIPS-based processors (open to *them*), open source kernel, and open source apps. RISC-V and OpenRISC communities are at least trying but we’d need a similarly government-sponsored effort. DARPA, most likely.

Not accusing you, pal, as much as saying it comes off that way. Just to be clear. So, it really comes down to trusting a subversive company’s product (Bitlocker) or paying $100 to have the same with maybe less risk. That’s how you illustrate it to readers. My experience fighting TLA-type threats is that the solution that reduces your risk against them usually involves one or more painful sacrifices. One rule to live by though: never trust their close partners to provide you something that won’t betray you. Anyone critical of the U.S. government or law enforcement should avoid any product they’ve likely subverted and integrated into mass collection. Microsoft software is *high risk* and you never know if you might land on something worth leaking. Best to keep snoops out of one’s life where possible so all have little to nothing to work with in any conceivable situation. Avoid Bitlocker or avoid anything after Win7 while you can. Besides, Win7 is still the best one and gets support. :)

Thumbs up on Win7 + FDE (TrueCrypt) if you absolutely MUST run Windows. I don’t know many people that MUST run it as their main OS, but anything after 7 is asking for much more trouble than just using 7. And 7 is much faster and picked through by now, while still being maintained.

Still wouldn’t advocate Windows, but least worst of available evils.

Thanks Nick P. Had the same thoughts yesterday but decided not to comment. Microsoft’s primary interest is in capturing markets and excluding alternatives to their products, not creating reliable code. If the problem is trust, then there is none (of either).

“Microsoft says the diffuser was too slow and kept BitLocker from being activated by certain users, including government contractors and agencies that must comply with Federal Information Processing Standards, or FIPS.”

This is a very unconvincing explanation. Why not then make Elephant Diffuser optional?

To weaken its security, just like their NSA masters order them to do.

Yes, and solid state drives (not popular when Vista and Windows 7 came out with the diffuser) are common now and much, much faster than the old spinning drives – so hardware performance has been drastically increasing.

The logical explanation is Microsoft was asked to make their disk encryption more susceptible to brute forcing and did so (despite the fact that Bitlocker phones home when turned on – sending encryption key?).

Solid state drives are also almost impossible to really completely wipe — something I think most people don’t know or think about when they go from non-encrypted to encrypted but keep the same physical drive. And (unless they have magical mystery methods that make it go super-mega-fast) as far as I know, to make matters worse, most encryption solutions, including Bitlocker (but not at all exclusive to it) don’t fill the entire disk with random data during formatting. If one were being pedantic the smarter move is to buy a brand new, unformatted solid state drive to install to if you’re moving from non-encrypted to encrypted operating system.
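The “fill the entire disk with random data” step mentioned above can be sketched as follows. This writes to an ordinary scratch file for safety; on a real setup you would point it at the block device itself (a hypothetical name like /dev/sdb), which requires root and destroys the device’s contents:

```python
import os
import tempfile

def fill_with_random(path, size, chunk=1024 * 1024):
    """Overwrite `size` bytes at `path` with random data, so that
    unused space is indistinguishable from encrypted data."""
    written = 0
    with open(path, "wb") as f:
        while written < size:
            n = min(chunk, size - written)
            f.write(os.urandom(n))
            written += n
    return written

# Demo against a small scratch file; a real drive would be a block
# device like /dev/sdb (hypothetical), run as root.
scratch = os.path.join(tempfile.gettempdir(), "scratch.bin")
assert fill_with_random(scratch, 3 * 1024) == 3 * 1024
os.remove(scratch)
```

Note that on SSDs even this is imperfect, for the wear-leveling reasons the comment describes: the controller may keep old blocks around that no host-side write can reach.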

Micah, thank you for reading the comments on the previous article and following up.

As for your final recommendation for the general citizenry to use Bitlocker – I believe that is very flawed as it gives the user the impression they have a “secured / private” hard drive when they most definitely don’t. Tell people to use your other choices or don’t use anything at all (so they don’t fool themselves into thinking they have privacy from their (and other) govts).

As several users in this comment thread have pointed out – Bitlocker phones home to Microsoft when you turn it on (if it can’t make the network connection it won’t be enabled successfully… think about that) – the obvious thing happening here is that it’s sending the encryption key back to Microsoft for storage and passing out to whatever world government security apparatus requests it. Removing the elephant diffuser, and not keeping it as an option, is an obvious way to make Windows 8 and higher BitLocker more susceptible to brute forcing – especially when solid state drives are now much, much faster in performance than when the elephant diffuser was in use. The Security States of the world are certainly looking forward to the massive transition from Windows Vista / 7 BitLocker to Windows 10; getting the encryption key from Microsoft may require an NSL (and a paper trail, secret, but existing), but brute forcing certainly wouldn’t.

And very importantly, Microsoft (which we know is an active partner with the NSA) was instrumental / the prime mover in creating and pushing the UEFI specification as the BIOS replacement that you tout as being so secure – the popular PC version was finalized in 2007 (v2.1 I believe), long after the NSA declared war on citizen privacy and system security – it is closed source with large input from Microsoft and Intel (both NSA “partners”) and should, at best, be viewed as suspect technology… thank goodness we can turn it off at this point (although that option is being eliminated in newer systems now).

All MS recent OSes try to connect out on first boot and in any reinstall — it’s not just with Bitlocker. What’s worse they basically have wireless enabled from the get-go on that boot (shouldn’t any secure OS have it disabled unless explicitly enabled? Probably knocks off almost all OSes, there). Android is bad for ‘requiring’ connectivity too on afaik the majority of devices. I think you were the one talking about testing this — did you try tossing a machine with Wireshark or tcpdump or the like onto your network (promiscuous mode) and try to capture the datastream? I’d be curious what the pcap would show it’s doing.

The article you linked to for the assertion that Linux/LUKS is not as secure as BitLocker does not hold up for that very broad assertion. Instead, more precisely the article demonstrates that:

1. for older versions (not current) of cryptsetup (i.e., before 1.6 or 2013)
2. for known installation types (e.g., Ubuntu 12.04 with crypt & LVM)
3. then an attacker can reasonably determine the location of a known plaintext and use that to inject code into the system.
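The attack in that list exploits CBC malleability: with no diffuser or authentication, flipping bit i of ciphertext block n garbles plaintext block n but flips exactly bit i of plaintext block n+1, letting an attacker make a surgical edit at a known plaintext location. A runnable toy illustration follows; the “cipher” is a 2-round SHA-256 Feistel standing in for AES, which exists only to keep the demo self-contained and is in no way secure:

```python
import hashlib

BS = 16  # toy block size in bytes

def _f(round_key, half):
    # Round function: truncated SHA-256 used as a keyed mixing function.
    return hashlib.sha256(round_key + half).digest()[:BS // 2]

def enc_block(key, block):
    # Toy 2-round Feistel "block cipher" standing in for AES. NOT secure.
    l, r = block[:BS // 2], block[BS // 2:]
    for rk in (key + b"1", key + b"2"):
        l, r = r, bytes(a ^ b for a, b in zip(l, _f(rk, r)))
    return l + r

def dec_block(key, block):
    l, r = block[:BS // 2], block[BS // 2:]
    for rk in (key + b"2", key + b"1"):
        l, r = bytes(a ^ b for a, b in zip(r, _f(rk, l))), l
    return l + r

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(key, iv, pt):
    out, prev = b"", iv
    for i in range(0, len(pt), BS):
        prev = enc_block(key, xor(pt[i:i + BS], prev))
        out += prev
    return out

def cbc_decrypt(key, iv, ct):
    out, prev = b"", iv
    for i in range(0, len(ct), BS):
        out += xor(dec_block(key, ct[i:i + BS]), prev)
        prev = ct[i:i + BS]
    return out

key, iv = b"toy-key", bytes(BS)
pt = b"transfer to acctamount: 00000100"  # two 16-byte blocks
ct = cbc_encrypt(key, iv, pt)
assert cbc_decrypt(key, iv, ct) == pt

# Attacker flips bits in one byte of ciphertext block 0 ('1' -> '9')...
evil = bytearray(ct)
evil[13] ^= ord("1") ^ ord("9")
out = cbc_decrypt(key, iv, bytes(evil))

# ...block 0 decrypts to garbage, but block 1 carries the chosen edit:
assert out[BS:] == b"amount: 00000900"
assert out[:BS] != b"transfer to acct"
```

This is the malleability that BitLocker’s Elephant diffuser was designed to blunt, and that XTS-style modes constrain by scoping the damage differently per sector.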

Now, cryptsetup has for at least 2 years used aes-xts-plain64 as its default, which should be sufficient to negate this particular, specific type of vulnerability.

It’s also pretty trivial to change the installation process to negate this by denying an attacker the opportunity to predict the location of known plaintext. Two obvious ways to modify the installation process:

1. Encrypt the partition, create an LVM physical volume, create the logical volume, encrypt the LV, create the filesystem. Double encryption takes away the ability of an attacker to predict plaintext on the device. The downside is that two encryptions can be slower.

2. Encrypt the partition, create TWO LVM physical volumes on the partition (using --dataalignmentoffset) of roughly equal (but unpredictably unequal, within 10%) size, stripe the logical volume across the two PVs, and create the filesystem. The location of plaintext should be much more difficult if not impossible to predict, even for two installations of exactly the same distribution and version.

Finally – this illustrates the value of having free software over proprietary software like MS/Windows and BitLocker. It is easy enough to defeat the vulnerability described in your linked article; Linux and LUKS are therefore NOT inherently insecure. On the other hand, it is not possible to fork BitLocker to ensure using a particular PRNG, nor re-enable the diffuser, because MS/Windows and BitLocker are proprietary. Therefore you can’t reliably assert that BitLocker is protecting your data, and you can’t fix its vulnerabilities.

You should really be suggesting Linux and LUKS as a primary solution, and BitLocker only if you can’t migrate off MS/Windows.

I’d choose LUKS over BitLocker any day. It’s open source, it’s transparent, I trust it much more. But this article is about disk encryption options for Windows, so obviously LUKS isn’t an option without people switching from Windows to Linux, which most Windows users aren’t going to do. But they still need disk encryption.

I said that LUKS was vulnerable to ciphertext-modification attacks “by default” — is this not true anymore? I knew that you could configure LUKS to use XTS instead of AES-CBC, but when you’re installing Debian and choose “Guided – use entire disk and set up encrypted LVM”, I don’t think it does this. Is this not true for popular distros?

As of cryptsetup version 1.6.0, aes-xts-plain64:sha256 with 512-bit keys is the default.

I have confirmed this with Fedora, Ubuntu, and Arch Linux. Cannot speak to Debian, but it stands to reason that Debian uses at least version 1.6.0 of cryptsetup, since 1.6.0 was released 14 Jan 2013.

This is great news — I’ll look into it more myself.

From cryptsetup’s latest version man page:

“cryptsetup --help shows the compiled-in defaults. The current default in the distributed sources is “aes-cbc-essiv:sha256” for plain dm-crypt and “aes-xts-plain64” for LUKS.

If a hash is part of the cipher specification, then it is used as part of the IV generation. For example, ESSIV needs a hash function, while “plain64” does not and hence none is specified.

For XTS mode you can optionally set a key size of 512 bits with the -s option. Key size for XTS mode is twice that for other modes for the same security level.”

In other words, not only isn’t it vulnerable to the AES-CBC attack, but it can use a 512-bit key size, which in XTS mode amounts to two independent 256-bit AES keys. Not only is it immune to the attack you mentioned, it is also safer, cryptographically speaking.
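To unpack the key sizes: XTS takes a double-length key and splits it into two independent AES keys, one for encrypting sector data and one for computing the per-sector tweak, so a 512-bit XTS key corresponds to AES-256 security, not 512-bit. A sketch of the split:

```python
import os

xts_key = os.urandom(64)  # 512 bits, as cryptsetup's "-s 512" configures
data_key, tweak_key = xts_key[:32], xts_key[32:]  # two independent halves

assert len(data_key) * 8 == 256   # AES-256 key encrypting sector data
assert len(tweak_key) * 8 == 256  # AES-256 key generating sector tweaks
```

This is why the man page says the XTS key size is “twice that for other modes for the same security level.”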

Also, Linux has many more evil maid protections than Microsoft has. BitLocker relies on Secure Boot and the TPM only. On Linux you can have many other things, including those, and even roll your own checks. If you are lucky enough to run coreboot, you can have an almost (discounting firmware blobs) complete chain of trust over your computer. This is a much higher degree of control and security than Microsoft BitLocker offers. I don’t know if you personally use Windows, Micah (I hope not), but this article really struck me as more of a sales pitch than anything else. Your footnote mention of Linux isn’t convincing enough. I can see a Windows user reading this and thinking to himself: “Heck, this thing is secure.” The fact is that we don’t truly know. The sole fact that Microsoft maintains a compromised PRNG, even if not enabled by default, is highly suspicious to me, if not deliberate. I think that an update regarding LUKS security is in order, because you didn’t get your facts straight in the article.


Your points are well taken, Gra (post below), but I think you may have glossed over an important point Micah made above. Personally, I would not rely on Microsoft for security and privacy if they paid me to use their software, much less if I have to do the paying. That being said, an enormous fraction of endusers run Windows. While I agree that the article has an overly optimistic analysis of bitlocker, and that it could have done a much better job in describing LUKS, the fact is that it is simply not feasible to part the Sea of Proprietarity and lead a mass exodus to UNIX-like systems. Nor would such an exodus necessarily be desirable. Regardless of whether you agree with Micah’s conclusion in recommending bitlocker (I do not), for most users LUKS is not a viable alternative, and thus if we wish to challenge Micah about his conclusion, we must do so in the context of providing a suitable alternative WITHIN the Windows ecosystem. Ending the conversation at LUKS is a futile masturbatory effort that will not help us reach average users with cryptography.


I believe that your entire premise is wrong. The assumption that Windows users couldn’t or wouldn’t migrate to Linux or other Unix systems is a presumption many people share. But it’s a faulty one. If you give your average person a machine and both Windows and Linux to install, they will have trouble with both. Perhaps they won’t even manage to boot their installation media. And in the context of the first article, and now this one, which undeniably focus on journalists protecting their sources, whistleblowers and others, suggesting that they keep using Windows is not ideal. The fact is that Windows still:
a) uses plain AES-CBC even though they know there is a viable attack against it;
b) ships a PRNG that is compromised, even though they claim it’s not used by default;
c) couldn’t answer whether, if compelled by a LEA, they could unlock a BitLocker-encrypted hard disk.

These alone should be sufficient to end the conversation.

Understood, Gra. Your argument, if I understand you correctly, is that BitLocker is so thoroughly broken, or at least backdoored, that, especially given the average end user’s ignorance of both Linux and Windows internals, staying with Windows and using BitLocker cannot be warranted.

I would challenge you on a few points. You may be right that if I gave, for example, my mother a Linux Mint machine and a Windows machine, her ability to navigate each would be roughly equivalent. However, OS decisions are not made in such a contextless vacuum. Consider a business whose enterprise infrastructure is already Windows-based. Consider people who already use Windows and, for whatever reason, do not want to switch to a Unix-like system. Consider those whose work requires software that does not work with WINE and has no suitable open source alternative. When choosing an OS there are myriad possible motivating factors, some rational and some emotional. I think Linux is the best option, so I run Linux. But if, despite any arguments presented and despite understanding the security and privacy implications, an end user still chooses Windows, what then? Should no attempt be made to assist them with information security because they made a choice we may consider unwise?

Thanks for pointing out that LUKS hasn’t used AES-CBC since cryptsetup 1.6. I have posted a correction.

I’ve never said that BitLocker IS broken. I’ve said that we can never know whether it CAN be broken by Microsoft if compelled by a LEA. Even if it used the latest developments in cryptographic algorithms (which it doesn’t), we couldn’t trust it completely because its code can’t be audited. And now I’m talking about the context of these articles, which is undeniably journalists, sources, whistleblowers, etc. Their security threat model can’t accommodate these uncertainties. And this is where I disagree with the whole piece. Instead of trying to make the best of Windows, The Intercept should write more pieces about Tails, OpenBSD, etc. Even Linux itself isn’t the most secure OS choice, but it’s way better than Windows and better than Mac OS X. For these people, the difference between a good and a bad choice can mean their incarceration. Sure, for the regular Joe who doesn’t have a choice, doesn’t know better or doesn’t even want to know better, suggesting that he use BitLocker or BestCrypt is better than nothing. And I think it’s nice that Micah got Microsoft to at least answer some of the issues raised by him and many others, and I also believe it’s newsworthy material. But the way the piece was written can, as I already mentioned, make the user believe that it’s a good option when, again, in the context I described above, it clearly isn’t.

I would just assume it is compromised; why waste time debating it?
In addition, Microsoft’s sudden support of OpenSSH may be a good thing, or may be something else. In any case, why would anyone trust a large corporation, much less a large American corporation?

OK now here’s why you should disregard this article:

If something really bad happens that prevents your bootloader from loading properly, the software meant to keep your data safe becomes your data’s tomb. You’re locked out and there’s no way back in. Your Linux-using friend MIGHT be able to rescue some files, but in my experience TrueCrypt and BitLocker have bricked some perfectly good hard disks.

If something happens that prevents your bootloader from working, you can still recover your data as long as you have a copy of your recovery key.

One of the notable cases that made people take notice of TrueCrypt was the case of the Brazilian banker where no-one was able to forcibly decrypt his data.

Based on that, if BitLocker were to contain a ‘backdoor’ that would allow data to be forcibly decrypted, wouldn’t that become public knowledge pretty quickly just by looking through court documents of similar cases?

No, not at all, and in fact the opposite is true — the less you hear them beating the drum about not being able to get into a machine, the more you can generally assume they have gotten into it. They just don’t put it in the documents at all. It’s been demonstrated, anyway, that most people’s passphrases are ridiculously easy to crack with the FBI’s (not the NSA’s) level of resources. And most people appear to just give up their passphrases long before it even comes to that.

I use encryption to protect my personal information, most of which the government already has (SS# et al.), from identity thieves who usually shoot for low-hanging fruit.

There are enough people out there that use no encryption at all and that means that BitLocker (On by default on my Win8.1 machine) makes me less of a target. That is enough for the data I am currently protecting.

I am very glad that there are others out there pushing for stronger encryption and speaking out on this, but BitLocker on Windows and FileVault on OS X meet a lot of people’s needs with very little investment of time and money. Thanks for the information.

It’s like the old joke about two men in the woods and one asks, “What happens if we meet a bear?”. The second replies, “I’ll just run away”. “But you can’t outrun a bear”, the first fellow tells him. “I know”, his companion answers, “But I can outrun you”.

This article sounds like a paid ad from Microsoft – all the jibber-jabber about how Windows would never ever use the NSA-backdoored Dual_EC_DRBG because it’s not the default, and instead uses this other RNG called CTR_DRBG, with no further explanation of this mysterious new RNG I have never even heard of, and I follow this field rather closely.
The best part is the entirely ludicrous conclusion that since it doesn’t use the backdoored RNG but instead uses this other method, of which all we know is its name, “your keys are generated in an entirely secure way.” LOL, alrighty then, thanks for clearing that up!
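For what it’s worth, CTR_DRBG isn’t mysterious: it is specified in NIST SP 800-90A, and is essentially a block cipher run in counter mode over an internal key and counter, with both replaced after every generate call. Below is a sketch of that construction only. The “block cipher” is a stand-in built from SHA-256 (Python’s standard library has no AES), and details like the derivation function, reseeding, and Microsoft’s actual implementation are omitted, so treat this as an illustration of the structure, not the real primitive:

```python
import hashlib

BLOCKLEN = 16   # AES block size in bytes
KEYLEN = 32     # AES-256 key size in bytes

def block_encrypt(key: bytes, block: bytes) -> bytes:
    """Stand-in for AES-256 (NOT the real cipher): a fixed-width PRF
    built from SHA-256 so the DRBG structure can run with stdlib only."""
    return hashlib.sha256(key + block).digest()[:BLOCKLEN]

class CtrDrbg:
    def __init__(self, entropy: bytes):
        self.key = bytes(KEYLEN)
        self.v = 0
        seed = entropy.ljust(KEYLEN + BLOCKLEN, b"\x00")[:KEYLEN + BLOCKLEN]
        self._update(seed)

    def _inc(self):
        self.v = (self.v + 1) % (1 << (8 * BLOCKLEN))

    def _update(self, provided: bytes):
        # Derive a fresh key and counter from the current state.
        temp = b""
        while len(temp) < KEYLEN + BLOCKLEN:
            self._inc()
            temp += block_encrypt(self.key, self.v.to_bytes(BLOCKLEN, "big"))
        temp = bytes(a ^ b for a, b in zip(temp, provided))
        self.key = temp[:KEYLEN]
        self.v = int.from_bytes(temp[KEYLEN:KEYLEN + BLOCKLEN], "big")

    def generate(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            self._inc()
            out += block_encrypt(self.key, self.v.to_bytes(BLOCKLEN, "big"))
        # Backtracking resistance: replace key and counter after output.
        self._update(bytes(KEYLEN + BLOCKLEN))
        return out[:n]
```

The notable design property, unlike Dual_EC_DRBG, is that there are no magic constants in it: the security rests entirely on the block cipher and the entropy input.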

I have a great idea! Let’s encrypt the entire operating system! That way the NSA will have a copy of these files before encryption and after encryption (the files are in the public domain; everyone has a copy on their computer) and a big head start on breaking the encryption!

yeah right.

Put all your stuff in one folder, or on one USB stick, and encrypt the heck out of it. No need to encrypt files that everyone already has an unencrypted copy of.
Besides, I really don’t think anyone cares how you scored in World of Warcraft this season. The number of files needing encryption is going to be few.

What you’re talking about is a known-plaintext attack, and modern ciphers aren’t vulnerable to it.
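To make the distinction concrete: a known-plaintext attack means recovering the key (or further plaintext) from plaintext/ciphertext pairs. A weak scheme like repeating-key XOR falls instantly, while modern ciphers such as AES are designed so that even enormous numbers of known pairs reveal nothing about the key. A toy illustration of the weak case (the key and message here are obviously made up):

```python
def xor_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy repeating-key XOR 'cipher' -- utterly broken, for illustration."""
    return bytes(p ^ key[i % len(key)] for i, p in enumerate(plaintext))

def recover_key(known_plain: bytes, ciphertext: bytes, keylen: int) -> bytes:
    """Known-plaintext attack: XOR the known plaintext with the ciphertext."""
    return bytes(known_plain[i] ^ ciphertext[i] for i in range(keylen))

secret_key = b"swordfish"
message = b"Meet at the usual place at midnight. Bring the documents."
ct = xor_encrypt(secret_key, message)

# The attacker knows only the first few words of the message...
key = recover_key(b"Meet at the usual", ct, len(secret_key))
# ...and can now decrypt everything.
assert key == secret_key
assert xor_encrypt(key, ct) == message
```

Knowing that Windows system files are identical on every machine gives an attacker exactly this kind of known plaintext, and disk encryption schemes are built on ciphers chosen precisely because that knowledge doesn’t help.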

I get so tired of “security researchers” invoking the many-eyes fallacy about software. If open source were so great for security, why have so many flaws that had been there for *years* been discovered in OSS software lately? Also, Microsoft’s “cozy relationship with the government”? Really? As opposed to Google’s? One of those companies was sued by the DOJ; one was not. Microsoft goes to extraordinary lengths to protect your data. Google monetizes it.

Someone needs to go over there and review the code and put this nonsense to bed.

“Microsoft goes to extraordinary lengths to protect your data.”

What evidence do you have to support that? What is your response to this:

My response is the same one that Microsoft has already been very public about: if there is a lawful request, they are obligated by those laws to comply. When the law is unclear or nonexistent, they push back, just like in the Ireland case. I’d also point out that that case had nothing to do with backdoor access or BitLocker.

If the only fear you can legitimately have is a government backdoor, however, my advice: don’t break the law, and there’s no reason for them to snoop. In the meantime, fight for legislation that alters the parameters of how those governments operate!

And it’s not a matter of trusting the government… because I don’t. It’s a matter of not being an asshat about things.

Again, you’re just taking their word for it (while admittedly presenting one single point of rather superficial counter-evidence), and Microsoft’s word is insufficient for my trust. And no, I don’t trust my government, nor my government’s symbiotic corporate entities and other “partners,” any more than I trust Microsoft.

Finally, your “if you have nothing to hide, you have nothing to fear” admonition is a little stale at this point, don’t you agree?

“if there is a lawful request, they are obligated by those laws to comply.”
Then there is the fact that we know the NSA’s interpretation of what is “lawful” is often highly dubious.

“my advice: don’t break the law, and there’s no reason for them to snoop.”

What do you think everyone’s been screaming about? Which one of the myriad “secret” laws is it you advise us not to break?

I smell sophism here. If the government wants to legally access YOUR data, it can get it from you only if you possess it. If it wants to legally obtain YOUR data from somebody else, it can do so only if that somebody else is in POSSESSION of the data. Questions: Is Microsoft in possession of your data? Is the government in possession of the hardware, but not the (meaningful) data until it is decrypted? Who owns the data on your computer, Microsoft or you? If you do, you can legally refuse to cooperate as a suspect and they will never get your data. If MS owns your data, then privacy is IMPOSSIBLE, since what you want to encrypt is NOT your data. These basic constitutional questions about the legality of government seizure of data are never discussed, since the status quo is convenient for the government.
Sadly, in freedom-loving America there seems to be no private ownership of anything, of which rampant civil forfeitures are living proof. Exactly like in the Soviet Union: no privacy, no private property.

You’re asserting that open source software cannot be secure because we have found flaws in open source software, yet you attempt to argue that Microsoft will protect your data because they are not Google? I’m sorry, but Microsoft monetizes your data just like Google does. Google might be better known for doing it more efficiently and aggressively, but Microsoft still does it.

Open source software is praised by security researchers as more secure because it gives other security researchers the ability to inspect the code for security holes and inefficient code. That alone offers a level of transparency that isn’t available in the proprietary software Microsoft has prided itself on for so many years. Hell, even Microsoft has been hinting lately that it is becoming more incentivized to work in the open source domain.

I didn’t mean to give the impression that open source software is magically more secure than proprietary software, just that it’s more transparent, which is extremely important when evaluating trust. In the end, I recommend BitLocker despite its transparency issues.

The many-eyes fallacy is certainly a fallacy, but that doesn’t mean open source lacks legitimate security properties that proprietary software will never have, such as the possibility of reproducible builds. I also explicitly point out that BitLocker protects against evil maid attacks, whereas the open source LUKS doesn’t, and that when BitLocker removed the Elephant diffuser it reduced its security to the level of LUKS. I didn’t mention Google in this article.

No, it is not a fallacy: with everything else being equal, open will always be more secure than closed. Security through obscurity, THAT is a proven fallacy!

Has anyone looked back at the Microsoft antitrust case? The DOJ closed it a whopping 5 working days after the enactment of the Patriot Act.

What about the trade act’s apportionment of our liberty to global telephonies? TALKTALK WANTS TO TALK! Who’s on that piece of crap? This is a protection racket, and all entities are in on the grift. Just bear it? I don’t think so.

One nation’s crime is just another’s spying? Sure we want to enshrine that shite?

I am so pissed. Slipknot’s knucklehead ripped off my rant about aliens steering clear of US because our entertainment sucks. I said once they got a load of what we thought we needed to know in Prism’s database, they sped off to leave us to contemplate more navels. Yeast has not advanced enough yet. Watson, go fuck yourself. Bake ON!

As one of the people who criticized you in the original article, I’m really glad you read the comments and made an honest follow up. Thank you and keep that up.

My opinion of Bitlocker however remains unchanged. Let me share why:
1) The kind of reassurances I hear from Microsoft I have heard before. In 1975 the new encryption standard of the day was DES, the Data Encryption Standard. The creators of the algorithm ran it by the NSA, and the NSA offered a few suggested changes. First, they fiddled with the S-boxes, making the algorithm more secure. Unknown at the time was that the NSA had developed a method called differential cryptanalysis, which would have rendered the algorithm useless with poorly selected S-boxes. Chalk one up for the NSA. However, they also suggested reducing the 64-bit key size down to 48 bits. Ultimately a compromise of 56 bits was agreed upon. The algorithm was thus 256x easier to brute force as a result (i.e., if it took a year for 64 bits, it would take less than 2 days for 56 bits). These kinds of suggestions are often given for “performance” or “compatibility” reasons.
2) There have been many instances in modern times where NIST (the standards body with close ties to the NSA) has suggested “improvements” to algorithms, using shorter keys and simplifying algorithms to encrypt less, citing better performance. More security costs performance. Prioritizing performance sends a message.
3) Unless I am very much mistaken, the selection of which PRNG is used in Windows is set somewhere deep within the registry. Simply setting a registry key is enough to silently enable that broken PRNG, and BitLocker will carry on with the user being none the wiser. Isn’t it safer to use disk encryption software that does not have a broken standard in it just waiting to be exploited?
4) 128-bit keys are quite short by today’s standards. For symmetric keys, I think to be future-proof they should be at least 256 bits (AES-256).
5) Software like TrueCrypt allows the user to select one or more encryption and hashing algorithms. The FAT file system volumes you mention as a detriment, while small and unsuitable for encrypting entire disks, allow for a feature that is not present in BitLocker: a hidden volume inside the encrypted volume whose existence no one without the password can prove. This allows for plausible deniability even if you are coerced into unlocking the volume.
6) TrueCrypt has passed an external security audit; BitLocker has not.
7) When TrueCrypt suddenly and mysteriously shut down, its site recommended BitLocker as an alternative. People who have followed encryption, and TrueCrypt in particular, found these statements so ludicrous, given the well-known problems and the TrueCrypt authors’ past stances, that they immediately started suspecting a warrant canary. Maybe they looked too hard, but they did find a hidden message: “uti nsa im cu si”, which is bad Latin for “Don’t use TrueCrypt because it is under the control of the NSA.”
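The key-size arithmetic in point 1 is worth spelling out, since every bit shaved off a key halves the brute-force work:

```python
# Each key bit doubles the keyspace, so dropping from 64 to 56 bits
# divides the brute-force effort by 2**(64 - 56) = 256.
factor = 2 ** (64 - 56)
assert factor == 256

# If exhausting a 64-bit keyspace took one year...
days_for_56_bits = 365 / factor
assert days_for_56_bits < 2   # ...56 bits falls in under two days.
```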

Given all this — Microsoft being in bed with the NSA, the official explanation of _NSAKEY being laughable, Skype calls reportedly becoming accessible to the NSA shortly after Microsoft purchased Skype, and a bunch of other factors — I think it’s foolish to trust Microsoft to secure you against the US government. Sure, it’s the ideal product to foil the criminal who steals your laptop, but for someone like Edward Snowden, I would not recommend it.

Unfortunately, TrueCrypt (or VeraCrypt/CipherShed) isn’t an option in Windows 8. I think the key point here is there are no great options for full disk encryption in modern Windows right now. If you really distrust Microsoft, don’t use Windows.

SkypeIsNotPrivate Anon

If you need a gauge or a compass to navigate which companies are private and which aren’t, there’s an easy test. If a company is dependent on the USG, has deep contracts, deep research or grant agreements, or takes their money or subsidies, it won’t be able to stay private, at least not for long.

There’s a kind of unspoken rule, a tacit agreement in business, to provide customer service amenities to the breadwinner. If a company values USG business more than it values the independent consumer, its product is not going to be private by consumer standards. The standards are driven indirectly by the bigger buyer. In MS’s case, given their deep history with the DoD etc., we know things are not expected to get much more privacy-oriented toward the consumer.

MS really needs to create a separate company segmented to serve the emerging privacy-ware and consumer privacy markets. Consumer privacy demands deliverables with a promise that THIS COMPANY won’t be manipulated by the national security state. A compromise may be necessary. This is just an idea, but I think MS could redeem its privacy clout by purchasing or buying out the Tor Project’s USG development contracts from the Dept of State.

That would make a lot of consumers feel better because the USG’s direct involvement would be divested from Tor and MS would finally get the professional attention on privacy it actually should have always had.

It’s a winning strategy because we know how deplorable the privacy outlook is (no pun intended) for MS, and has been, due to clandestine concessions made to USG interests. The irony is that the one project funded by the USG stun-blocked NSA snoops. That gave Appelbaum whistleblower status because he’s working for the government. I would prefer that the Tor Project not have to be a whistleblower while it works for the government. Many would prefer that MS and the Tor Project quit the government and help the private sector be private.

MS needs to leave the government sector to be private (but they won’t). Tor needs to leave the government to keep from getting snakebit all the time and to stop the sword hanging over its head (but it hasn’t yet). Tor has business history with the USG, so regulators already know what and who they are dealing with. Same with MS.

Consumers trust Tor developers. MS has a locked in consumer structure to deliver to the markets. Both came from Seattle. It would be a boon to the emerging privacy market if they would consider this kind of move.


Micah, thanks for the post. As one of those who protested previously in the comments at your recommendation of Bitlocker, I found this article informative but unconvincing.

You write “With the release of Windows 8, TrueCrypt became painful to use for full-disk encryption. If you’ve bought a PC since 2012, when Windows 8 came out, chances are you can’t use TrueCrypt to encrypt your hard disk without jumping through quite a few hoops. You may need to figure out how to open your boot settings and disable security features to get it working, or format your hard disk to use a different, older system for organizing the disk that’s compatible with TrueCrypt. To put it bluntly, TrueCrypt is Windows XP-era software. As modern PCs and the Windows operating system evolved, TrueCrypt stayed in the past.”

But there are TWO Windows versions between XP and 8, namely 7 and Vista. And they are still widely used.

I have multiple computers running Windows 7, all full-disk encrypted using VeraCrypt. In addition, I encrypt particularly sensitive folders as a hidden volume inside this, with a different password. So far my experience with VeraCrypt has been great, and I recommend it to anyone. And at least for Windows 7 or below, I was unable to find anything in your post to suggest BitLocker would be superior, so thanks for that!

If you use Windows 7, VeraCrypt is an excellent option.

While it still has a lot of users, I’m just not focusing on it very much because it’s a 6-year-old OS and there will be fewer and fewer users over time, especially with the release of Windows 10 this summer. I’m mostly looking at what the disk encryption options will be in the coming years.

Isn’t this all just propaganda and BS on the part of Microsoft? Since we know how willingly Microsoft has worked with the NSA et al., even volunteering to help the NSA gather and store all Skype calls, why would anyone believe what they now say? If they are sincere, they should release the code, or at least part of it and the relevant algorithms. Nothing Microsoft, or any other transnational corporation (especially tech transnationals like Microsoft), says should be accepted as true or correct without documented evidence.

It is not reassuring that Microsoft is using a “good” random number generator if they will not reveal the algorithm. Any algorithm that depends on a series of mathematical operations is only pseudo-random, and may have a very short period. Take a look at chapter 3 of Knuth (“The Art of Computer Programming,” Vol. 2).

Some day, some clever entrepreneur is going to come up with a key generator based on a physical process, something as simple as measuring the current flow through a resistor, and package it in a USB device that can be plugged in and used to encrypt and decrypt. Until that day comes, all of my private material will remain stored on a computer that has no network card, no modem, and a good line filter between the power supply and the wall outlet. Want to see what’s on it? Get a warrant.
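Hardware generators along those lines do exist (sampling diode or resistor noise), and the classic trick for turning a biased-but-independent physical bit stream into fair bits is the von Neumann extractor: read the stream in pairs, emit 0 for a 01 pair, 1 for a 10 pair, and discard 00 and 11. A sketch, with the noise source simulated rather than measured:

```python
import random

def von_neumann(bits):
    """Debias an independent-but-biased bit stream: 01 -> 0, 10 -> 1,
    equal pairs discarded. Output is unbiased if input bits are independent,
    because P(01) = P(10) = p*(1-p) regardless of the bias p."""
    out = []
    it = iter(bits)
    for a, b in zip(it, it):   # consume the stream two bits at a time
        if a != b:
            out.append(a)      # (0,1) -> 0 ; (1,0) -> 1
    return out

# Simulate a badly biased hardware source: roughly 80% ones.
rng = random.Random(42)
raw = [1 if rng.random() < 0.8 else 0 for _ in range(100_000)]

fair = von_neumann(raw)
ones = sum(fair) / len(fair)
assert abs(ones - 0.5) < 0.03   # close to fair despite the 80% bias
```

The cost is throughput: with bias p, only 2·p·(1−p) of the raw bits survive, which is why real hardware RNGs typically follow the noise source with a cryptographic conditioner instead.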

Actually, both Dual_EC_DRBG and CTR_DRBG have completely public algorithms. They’re NIST standards.

Not quite correct. Actually the standard for CTR_DRBG is SP800-90A, which is the same algorithm on the elliptic curve that Shumow and Ferguson criticized in 2007. It took me about 4 hours to read through the BitLocker part of your piece, fact-checking. Among other things, we don’t actually know that the NSA corrupted the algorithm on purpose, and we specifically have no idea what BULLRUN refers to because, contrary to what the Guardian published (the NYT article cited even says so), Snowden didn’t have or gain access to BULLRUN.

I’m not a cryptographer, but I do understand the algorithm and the math behind it. I also understand where the flaw is: in Appendix A of SP800-90A. It specifically says that other choices for “P” and “Q” can be made, and refers to ANSI X9.62 2005 for standards for making sure the values don’t degenerate in the chosen finite field.
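The Appendix A issue can be demonstrated end to end with toy numbers. Below is a miniature Dual_EC-style generator over the small textbook curve y² = x³ + 2x + 2 over GF(17), which has 19 points (real Dual_EC uses a huge NIST curve and truncates its output; the structure and the backdoor are the same). Q was deliberately chosen so that P = d·Q for a “d” the attacker keeps; one output is then enough to recover the internal state and predict everything that follows. All parameters here are illustrative toys:

```python
# Toy curve: y^2 = x^3 + 2x + 2 over GF(17); the group has prime order 19.
P_MOD, A, B = 17, 2, 2

def ec_add(pt1, pt2):
    """Point addition on the curve (None is the point at infinity)."""
    if pt1 is None: return pt2
    if pt2 is None: return pt1
    (x1, y1), (x2, y2) = pt1, pt2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if pt1 == pt2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, pt):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

def lift_x(x):
    """Find a curve point with the given x (brute-force sqrt; toy field)."""
    rhs = (x**3 + A * x + B) % P_MOD
    for y in range(P_MOD):
        if y * y % P_MOD == rhs:
            return (x, y)
    return None

Q = (5, 1)            # the "standardized" point everyone uses
d = 7                 # the designer's secret: P = d*Q
P = ec_mul(d, Q)

def dual_ec_step(state):
    """One round: output x(state*Q); the next state is x(state*P)."""
    return ec_mul(state, Q)[0], ec_mul(state, P)[0]

out1, state2 = dual_ec_step(13)           # victim's secret state is 13
out2, _ = dual_ec_step(state2)

# Attacker sees only out1, lifts it back to a point (+/- state*Q gives
# the same x either way), and multiplies by the secret d:
R = lift_x(out1)
recovered = ec_mul(d, R)[0]               # x(d*state*Q) = x(state*P)
assert recovered == state2                # internal state recovered
assert dual_ec_step(recovered)[0] == out2 # all future output predicted
```

This is exactly why the appendix matters: anyone who generates their own verifiably random P and Q cannot know such a d, while whoever supplied the standard constants might.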

People make programming decisions based on convenience, market, and speed that aren’t necessarily the best choices, and ISO standards in particular are susceptible to a lot of forces, not least the egos of the corporations and government entities participating; likewise with many other standards organizations. This atmosphere of dark conspiracy surrounding this particular algorithmic flaw runs counter to how code and chips are approved for sale by the private companies that build them to fit the standards. Most importantly, the very documents you cite, on which the Guardian and NYT (and presumably ProPublica) articles are based, show a lot of long-term effort at code breaking, a lot of expense (in the budget document), and a lot of anxiety that something that took a lot of work, time, money, and mathematics would be disclosed. That simply and honestly isn’t a description of a “backdoor” in which NIST violated the same appendix in which it planted the chosen “Q”.

Maybe they did plant it that way, but that couldn’t possibly be the core of BULLRUN, and Microsoft is still using the faulty Q in CTR_DRBG unless they’ve told you otherwise. Given the compatibility and computational reasons they gave you for removing the diffuser, there’s probably no way they implemented SP800-90A Appendix A.2.2; apparently in the same document the NSA had total control over and pushed this backdoor into, the same organization published the specs for defeating that backdoor.

>Actually the standard for CTR_DRBG is SP800-90A which is the same algorithm on the elliptic curve that Shumow and Ferguson criticized in 2007.

This is not true. DUAL_EC_DRBG is one of FOUR algorithms included in NIST SP800-90 and SP800-90A, and the only one alleged to have been backdoored. The Shumow and Ferguson paper, which we link, makes no swipes at CTR or any algorithm other than DUAL_EC_DRBG. CTR remains in the draft revised SP800-90A Rev. 1 that removed DUAL_EC_DRBG in the wake of the Snowden revelations. Links:

While I’m not claiming CTR is NOT backdoored, I am saying you’re wrong to say there’s some kind of stain on it from simply being in NIST SP-800-90 alongside DUAL_EC.

>we don’t actually know that the NSA corrupted the algorithm on purpose, and we specifically have no idea what BULLRUN refers to

I’m not sure the reporting on PRNGs was ever tied to the Bullrun documents; they just appeared in the same stories. The stories talked about Bullrun and an effort to compromise PRNGs, but I don’t see an explicit connection made.

NYT, Guardian, and ProPublica all published quotes from the documents where the NSA brags about having edited the spec, so I’m unclear how you can allege anything happened accidentally.

>Microsoft is still using the faulty Q in CTR_DRBG unless they’ve told you otherwise.

This is intriguing; can you email one of us? [email protected] and/or [email protected]. PGP keys are under the “staff” section of the site or on the usual keyservers.

My bad, I appear not to have been working off the latest version (Rev. 1) of SP800-90A. Nevertheless, at the core of CTR_DRBG the reseed method is left optional, and the two options shown in the reference model are both effectively low-bit keys (AES_ECB, TDEA). The one I was able to see implemented was AES_ECB. Which leaves open what reseed they are using, and whether they are using some form of discrete logarithm algorithm, which would include the ECC you mention is still in their distribution.

As for the NSA “bragging,” the NYT article publishes the two docs about having decryption methods (a budget and a notice about the sanctity of BULLRUN), and the other docs published (Guardian) are slideshows. I reiterate that the budget figures and the anxiety about BULLRUN (which we don’t know anything about) are at odds with the notion that the so-called “bragging” is about this particular weakness, which was introduced in 2006 and spotted in 2007; the decryption coup happened in 2010, and the 2013 budget indicates they are still working to enable it. They may or may not have implemented the Q thing, but it most likely isn’t what they’re talking about in 2013.

I think that’s what I said the first time; I lost a better reply trying to look something up.

I am sorry, but have you noticed that MS will release their next OS version on July 10th? They need some good PR right now. Isn’t MS the company that is scanning pictures and handing over data very quickly?

I wouldn’t trust them for a minute.

This story did not originate from Microsoft but from our own interest and reporting. As stated in the article, they gave us a “no comment” initially.

Also, I’d hardly call this post good PR for Microsoft. While it alleviates concerns about a random number generator bundled with Windows, it points out that BitLocker has been weakened by the removal of the Elephant diffuser, and points out that Microsoft won’t rule out backdoors as normally defined.

““It has never been the default, and it requires an administrator action to turn it on,” a Microsoft spokesperson told me.”

The story may not have originated from Microsoft’s pen, but the details aren’t from a technical person; they’re from a spokesperson, which means their PR people. Granted, you got Schneier to comment, and that’s cool, but you do sound a bit credulous (and Schneier is recommending crypto based on his ‘good feeling’ about it, but that’s a whole other concern).

The reason I suspect Schneier is trusting BestCrypt based on “good feelings” is because for Windows there are no perfect options. The open source options, TrueCrypt, VeraCrypt, CipherShed, and DiskCryptor have bad support in Windows 8, making the only real options proprietary products like BitLocker and BestCrypt, which have good support but aren’t very transparent.

Assuming you want to use Windows 8, what disk encryption do you recommend?

Microsoft is on the verge of manufacturing cheap microcomputers built on $9 motherboards. It would be a massive global boon for the PC, and it would bring the entire globe closer to computing the entire planet. While that’s an amazing, reachable aim to increase education and communication for humanity, it would also be the largest, most stunning global panopticon in human history.

For Microsoft to persist in presenting inferior web connections, including “privacy” with backdoor receiving areas you could drive a Mack truck through, is now really a human rights issue. Microsoft’s version of private is not private. That’s an open “secret.” No matter how much PR they throw at this, it won’t make their technology private until it actually becomes private.

THINK of all of the Microsoft dependent users on this planet right now. They have little to no privacy options. It is time to marry Microsoft to someone who can deliver privacy instead of just talking about it.

Well, if you don’t want to go for proprietary solutions like BitLocker or BestCrypt, I would go for VeraCrypt. It’s not that hard, really: you basically have to disable Secure Boot in the UEFI/BIOS and then install Windows 8 the MBR way. You can do this by booting your Windows install USB in MBR mode (i.e., select the boot option that does NOT contain “UEFI” in it). To be clear: you select this in your UEFI/BIOS at bootup (when booting from the USB), not in the Windows installer itself.

I just use BestCrypt. It may not be fully NSA-proof, but I don’t need it to be.

“For Windows there are no perfect options”

But there are *better* options. At the same time, if you started parroting that BestCrypt were the “best” option, I’d say “maybe a better option.” TrueCrypt is probably the best available option. There are no perfect options; for Windows there absolutely cannot be a perfect option, and that doesn’t even bring in usability factors. I agree it could be made easier for the user to drop back to MBR. You’re unnecessarily biased against Win7, btw — just because something is 7 years old doesn’t make it ‘bad’. Windows in and of itself: not good. But there are always going to be people who insist on using it, apparently. I’m obviously not going to waste time addressing that — I’d fail (anyone would fail, short of a personal attack on their lives or families requiring it), and of course, people get complacent.

BTW, there are other good reasons not to trust your closed-source OS provider (or even an open-source OS provider) to be your one-stop shop. One is the dreadful password issue: if you have to boot up your computer to get through customs, for instance, you’re basically giving them your data anyway. This is probably THE biggest threat the “normal person” has to deal with. Swapping out drives is one solution. So is reformatting. A side-by-side TrueCrypt installation is probably a “better” option.

But all other things (governmental or otherwise) aside, I’d never call an option that opens you up the moment you boot up and log in a good one, period. Especially in countries that define their borders as extending dozens of miles inland, where search and seizure doesn’t require a warrant, and where (honestly, for most people) risk of embarrassment or loss of equipment is the primary concern. That concern is likely going to expand more and more to include other personal risks, given the way things are going: it’s easy to end up in an out-group, and what’s “OK” as an opinion can change in a split second (ask an American what they thought about Muslims on Sept. 10, 2001, and ask again a day later).

It’s better to put in a couple of hours to make yourself a bit more resilient to your own complacency later, anyway. The user is often the biggest threat to his own data. But yeah.

When I said “I’d never call an option that opens you up by booting up and logging in, period, a good option,” it reminded me that I think I’ve addressed something else suggested before, which is (IIRC) you or someone else claiming there’s no real case to be made for separately encrypting /home on Linux. Not entirely accurate. I don’t remember who said it, just that it was during your “NSA-proof” article, and it may not have been you. It’s better to ditch LVM and/or keep your data partitions totally separate, with separate keys. Windows does NOT do this, cannot, and will not; actually, none of the Windows options do, as far as I can recall.
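For readers unfamiliar with the layout this commenter is describing, a separately keyed /home on Linux is typically wired up in /etc/crypttab. The device paths and mapper names below are hypothetical placeholders for illustration, not a recommendation for any specific machine:

```
# /etc/crypttab: hypothetical example; device paths are placeholders
# <mapper name>   <underlying device>   <key>   <options>
cr_root           /dev/sda2             none    luks
cr_home           /dev/sda3             none    luks
```

With `none` in the key column, each volume prompts for its own passphrase at boot, so /home is protected by a different key than the root filesystem.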

(Yes, this means I am suggesting there’s some overcredulousness here toward sources that offer no evidence, which matters, a LOT, when it comes to crypto especially.)

“I asked Microsoft if the company would be able to comply with unlocking a BitLocker disk, given a legitimate legal request to do so. The spokesperson told me they could not answer that question.”

Surely that means they’re gagged about telling the truth that they can unlock a BitLocker disk? What other explanation could there be for such a response?

Good catch.

Micah… No. Or more specifically, how would you know? For that matter, how would they? It’s not like there’s just ONE way to compromise encryption. :/

Heading: “They fear BitLocker’s encryption keys are compromised by default. They’re not.”
Explanation: “Microsoft told me that while the backdoored algorithm is included with Windows, it is not used by BitLocker, nor is it used by other parts of the Windows operating system by default. According to Microsoft, the default PRNG for Windows is an algorithm known as CTR_DRBG, not Dual_EC_DRBG, and when BitLocker generates a new key it uses the Windows default.
“It has never been the default, and it requires an administrator action to turn it on,” a Microsoft spokesperson told me.”
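For context on what “the Windows default” means in practice: applications generally don’t select a DRBG by name; they ask the operating system for random bytes and receive whatever the platform default produces. Here is a minimal Python sketch of that pattern (my illustration; the claim that Windows routes such requests through its default system generator comes from the quoted passage, not from anything this code can prove):

```python
# Illustration only: an application drawing a key from the OS default
# CSPRNG, without ever naming an algorithm such as Dual_EC_DRBG.
import os

def new_128_bit_key() -> bytes:
    # os.urandom requests bytes from the operating system's CSPRNG;
    # the caller never picks a generator, so it gets the system default.
    return os.urandom(16)  # 16 bytes == 128 bits

key = new_128_bit_key()
print(len(key) * 8)  # 128
```

The point is simply that selecting a non-default generator takes deliberate action, consistent with the spokesperson’s “requires an administrator action to turn it on.”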

Microsoft told you Windows doesn’t use Dual_EC_DRBG, but how can this be confirmed? If they are even capable of complying with a “lawful” request to decrypt a hard drive encrypted with BitLocker, there has to be a backdoor.

‘A company spokesperson told me that they don’t include backdoors in their products, explaining that they don’t consider building methods to bypass their security in order to comply with legitimate legal requests “backdoors.”’

And I don’t eat sandwiches, though I do eat snacks made of sliced bread and fillings.

Why that’s not a sandwich, that’s just part of a LUNCH!

So true! In fact, if someone wants security he shouldn’t even use Windows.

Still have that IBM? ;)

“You can think of a 128-bit key, the kind used by BitLocker by default, as a random number between 0 and 2^128″

Not to be pedantic (great article), but I think it should be between 2^127 and 2^128. The number “10”, for example, isn’t a 128-bit key…

No, it is. There are no requirements on leading zeroes.

Actually, “10” is part of the key space, it just has a lot of leading 0s in binary: (000000000….1010).

It’s actually somewhere in between. While the number doesn’t have to be greater than 2^127, there is a set of numbers that are disallowed.

I checked with a few real-life cryptographers. A 128-bit key is a list of 128 bits with no restrictions on leading 0s. If you want to express it as a number, the number is between 0 and 2^128 - 1 (so, not including 2^128). A number between 2^127 and 2^128 isn’t a 128-bit key, but rather a 127-bit key, because you’re limiting it to half the keyspace of a 128-bit key, and every time you add a bit to the key length you double the key space.
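That arithmetic is easy to verify directly; a short Python sketch (mine, not from the thread):

```python
# Check the keyspace arithmetic from the comment above.
KEY_BITS = 128

# Expressed as a number, a 128-bit key lies in [0, 2**128 - 1].
keyspace = 2 ** KEY_BITS
assert keyspace == 340282366920938463463374607431768211456

# "10" is a valid 128-bit key: it simply has many leading zero bits.
key = (10).to_bytes(KEY_BITS // 8, "big")
assert key.hex() == "0000000000000000000000000000000a"

# Restricting keys to [2**127, 2**128) leaves exactly half the keyspace,
# i.e. a 127-bit key: each additional bit doubles the key space.
assert 2 ** KEY_BITS - 2 ** (KEY_BITS - 1) == 2 ** (KEY_BITS - 1)

print("all checks pass")
```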