Gather around, young’uns: Back in the antediluvian early 90s, when the digital world was young, a motley group of technologists and privacy advocates fought what are now, somewhat melodramatically, known as the Crypto Wars. There were many distinct battlefields, but the overarching question over which the Crypto Wars were fought was this: Would ordinary citizens be free to protect their communications and private files using strong, truly secure cryptography, or would governments force programmers and computer makers to build in backdoors that would enable any scheme of encryption to be broken by the authorities? Happily for both global privacy and the burgeoning digital economy—which depends critically on strong encryption—the American government, at least, ultimately saw the folly of seeking to control this new technology. Today, you are free to lock up your e-mails, chats, or hard drives without providing the government with a spare key. (The conflict was featured on the front page of Wired magazine’s second issue, and later detailed in Steven Levy’s lively book Crypto.)
Fast forward to 2014: Apple has announced that the new version of its mobile operating system, iOS, features full disk encryption to protect users’ data—and that, unlike earlier versions of iOS, it leaves Apple no backdoor of the sort that previously allowed the company to access at least some of a phone owner’s encrypted information. The announcement has been greeted with alarm by cyberlaw professor Orin Kerr, in a series of Washington Post blog entries that seem designed to prove Santayana’s hoary dictum about the perils of ignoring history. Apple, Kerr avers, is playing a “dangerous game” by implementing “a policy that only thwarts lawful search warrants.” Police investigations, he fears, will now be stymied by criminals who refuse to unlock their phones, rendering search warrants to access those devices little more than “a nice piece of paper with a judge’s signature.”
Normally, Kerr’s writing on electronic privacy is marked by an understanding of modern telecommunications technology nearly as impressive as his legal erudition, but in this case, I fear, he has succumbed to an uncharacteristic fit of technopanic. While he writes as though the corporate anarchists at Apple are brazenly thumbing their noses at police with a radical new policy, the truth is more nearly the opposite: It is Apple’s backdoor access that was the aberration, even for Apple. If you encrypt your MacBook’s hard drive with Apple’s FileVault, or your Windows computer with Microsoft’s BitLocker, then unless the user chooses to send either company a backup copy of her encryption key, they can no more unlock those encrypted files than a bookbinder can decipher the private code you employ in your personal diary. Strong encryption is not even new to smartphones: Google’s Android operating system—the world’s most popular mobile platform, running on twice as many devices as iOS—has featured full-device encryption since 2011, and Google has never had backdoor access to those encrypted files. And, of course, there has always been a wide array of third-party apps and services offering users the ability to encrypt their sensitive files and messages, with the promise that nobody else would hold the keys. Does encryption occasionally stymie legitimate law enforcement investigations? Of course—though far less often than you might think. The point to remember here, though, is that criminals had access to backdoor-free encryption for many years before Apple announced its new policy, without ushering in a terrifying new age of unstoppable criminals and impotent police.
Still, Kerr is right that encryption will now be far easier and more prevalent: Unbreakable encryption is not novel, but the decision to make iOS and Android devices encrypted by default is. Previously, at least, criminals had to be savvy enough to make the choice to use encryption consistently—and many weren’t. Encryption by default, because it protects average crooks as well as sophisticated cybercriminals, is likely to be a practical impediment in many more investigations. Criminals can still be punished for refusing a court order to unlock their devices, but may escape more serious charges that would be provable only with that encrypted evidence. Does this strengthen the case, as Kerr suggests, for legislation requiring device manufacturers to build in backdoors or retain sensitive data? It does not, for several reasons.
First, as Kerr belatedly acknowledges in a follow-up post, there are excellent security reasons not to mandate backdoors. Indeed, had he looked to the original Crypto Wars of the 90s, he would have seen that this was one of the primary reasons similar schemes were almost uniformly rejected by technologists and security experts. More or less by definition, a backdoor for law enforcement is a deliberately introduced security vulnerability: It requires a system to be designed to permit access to a user’s data against the user’s wishes, and such a system is necessarily less secure than one designed without such a feature. As computer scientist Matthew Green explains in a recent Slate column (and, with several eminent colleagues, in a longer 2013 paper), it is damn near impossible to create a security vulnerability that can only be exploited by “the good guys.” Activist Eva Galperin puts the point pithily: “Once you build a back door, you rarely get to decide who walks through it.” Even if your noble intention is only to make criminals more vulnerable to police, the unavoidable cost of doing so in practice is making the overwhelming majority of law-abiding users more vulnerable to criminals.
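To make the architectural point concrete, here is a minimal sketch (in Python, and emphatically not Apple's actual design) of why a device without a backdoor leaves the vendor nothing to hand over: the encryption key is derived on the device from the user's passcode and a per-device hardware secret, so no copy exists anywhere else. The function and parameter names are illustrative assumptions.

```python
import hashlib
import secrets

def derive_device_key(passcode: str, hardware_uid: bytes) -> bytes:
    # PBKDF2 makes brute-forcing the passcode expensive; mixing in a
    # secret burned into the device's hardware forces any guessing
    # attack to run on the device itself, not on a data-center farm.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), hardware_uid, 100_000)

hardware_uid = secrets.token_bytes(16)  # stand-in for a per-device secret set at manufacture
key = derive_device_key("1234", hardware_uid)

# Only the correct passcode, on this device, reproduces the key:
assert derive_device_key("1234", hardware_uid) == key
assert derive_device_key("9999", hardware_uid) != key
```

A backdoor, by contrast, would mean keeping a second, escrowed copy of that key (or a master key able to recover it) with the vendor—an additional secret whose theft or compelled disclosure compromises every device at once, which is exactly the deliberately introduced vulnerability described above.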
Second, and at the risk of belaboring the obvious, there are lots of governments out there that no freedom-loving person would classify as “the good guys.” Let’s pretend—for the sake of argument, and despite everything the experts tell us—that somehow it were possible to design a backdoor that would open for Apple or Google without being exploitable by hackers and criminals. Even then, it would be awfully myopic to forget that our own government is not the only one that would predictably come to these companies with legal demands. Yahoo, for instance, was roundly denounced by American legislators for coughing up data the Chinese government used to convict poet and dissident Shi Tao, released just last year after nearly a decade in prison. Authoritarian governments, of course, will do their best to prevent truly secure digital technologies from entering their countries, but they’ll be hard pressed to do so when secure devices are being mass-produced for western markets. An iPhone that Apple can’t unlock when American cops come knocking for good reasons is also an iPhone they can’t unlock when the Chinese government comes knocking for bad ones. A backdoor mandate, by contrast, makes life easy for oppressive regimes by guaranteeing that consumer devices are exploitable by default—presenting U.S. companies with a presence in those countries with a horrific choice between enabling repression and endangering their foreign employees.