(A shorter version of this post originally appeared on Vice.com. It focuses on how mandating backdoors in cryptography will diminish users’ security; for more information on the panoply of other problems cryptography regulation raises, check out our post on the Nine Epic Failures of Regulating Cryptography.)
Since Apple announced three weeks ago that it was expanding the types of data encrypted by default on devices running iOS 8, law enforcement has been ablaze with indignation. When Google followed suit and announced that Android L would also ship with encryption on by default, it only added fuel to the fire.
Law enforcement officials of every stripe have angrily decried Apple and Google’s decisions, deploying all sorts of arguments against the idea of default encryption (including that old chestnut, the “think of the children” argument). One former DHS and NSA official even suggested that because China might forbid Apple from selling a device with encryption by default, the US should sink to the same level and forbid Apple from doing so here, in some sort of strange privacy race to the bottom.[1] The common thread among all of this hysteria is that encryption will put vital evidence beyond the reach of law enforcement.
But the fact that Apple will no longer be in a position to retrieve a device’s contents on behalf of law enforcement is only a side effect. Apple’s decision, first and foremost, is about protecting the security of its customers. Before this change, if your iPhone was stolen or lost, a criminal could break into it with relative ease, accessing your private information through the same backdoor available to law enforcement. Now that Apple has sealed that backdoor, you no longer have to worry. In an era when our mobile devices contain incredibly private information, these companies have listened to their customers. They have made a sound engineering decision to make mobile security as strong as they know how by bringing it in line with laptop and desktop security.
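To make that concrete, here is a minimal sketch in Python of why passcode-entangled encryption locks out everyone, including the vendor. This is illustrative only: the PBKDF2 derivation, the key sizes, and the function names are our assumptions, standing in for Apple’s actual key hierarchy (which is built around a dedicated hardware AES engine). The core idea is that the encryption key is derived from the user’s passcode combined with a secret fused into the device’s hardware, so it can only ever be computed on the device, by someone who knows the passcode.

```python
import hashlib
import os

def derive_disk_key(passcode: str, hardware_uid: bytes, salt: bytes) -> bytes:
    """Derive the encryption key from the passcode entangled with a
    device-unique hardware secret. A deliberately slow KDF makes each
    passcode guess expensive, and guessing can only happen on the
    device itself, because hardware_uid never leaves it."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode() + hardware_uid,
        salt,
        iterations=200_000,
    )

hardware_uid = os.urandom(32)  # fused into the chip at manufacture
salt = os.urandom(16)          # stored on the device; need not be secret

key = derive_disk_key("123456", hardware_uid, salt)
# Neither the vendor nor anyone else holds a copy of `key`; without the
# passcode and this particular device, it cannot be recomputed.
print(key.hex())
```

The point of this design is that there is no master secret sitting on a server somewhere. The only way in is the front door, with the user’s passcode.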
Nothing in this change will stop law enforcement from seeking a warrant for the contents of an encrypted phone, just as they can seek a warrant for the contents of an encrypted laptop or desktop.[2] Apple’s decision simply sets the privacy standard for mobile devices at the same level as non-mobile ones. It’s only a fluke that this hasn’t been the default all along.
It’s also important to note that the amount of information available to law enforcement about someone’s electronic communications, movements, transactions, and relationships is staggering, even if they never analyze a suspect’s mobile device. Law enforcement can still seek a phone company’s call records about a suspect, any text messages stored by the phone company, and a suspect’s email accounts or any other information stored in the cloud, which for most of us these days is a lot, as Hollywood stars recently learned. While EFF thinks that some of those investigative tools have insufficient protections and go too far, turning on encryption by default on devices hasn’t changed any of this.
Unfortunately, that hasn’t stopped law enforcement from twisting the nature of Apple’s announcement in order to convince the public that encryption on mobile devices will bring about a doomsday scenario of criminals using “technological fortresses” to hide from the law. Sadly, some of the public seems to be buying this propaganda. Just last Friday, the Washington Post’s Editorial Board published an Op-Ed calling for Apple and Google to use “their wizardry” to “invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.”
Many on social media found the Post’s suggestion that sufficiently advanced cryptography was equivalent to magic very amusing. We at EFF had another reason to find it amusing: in 1996 we ran a “golden key” campaign against the government’s demand for backdoors into strong cryptography, complete with a GIF for websites to post in solidarity against backdoors.
All joking aside, while the Washington Post’s Editorial Board may think technologists can bend cryptography to their every whim, unfortunately it just isn’t so. Cryptography is math, and math doesn’t care who holds a key: any key, even a golden one, can be stolen by ne’er-do-wells, and not even the geniuses at Apple and Google can change that. Simply put, there is no such thing as a key that only law enforcement can use; any universal key is a new backdoor, and that backdoor becomes a target for criminals, industrial spies, and foreign adversaries. Since everyone from the Post’s Editorial Board to the current Attorney General seems not to understand this basic technical fact, let’s emphasize it again:
There is no way to put in a backdoor or magic key for law enforcement that malevolent actors won’t also be able to abuse.
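To see why, consider a toy sketch of what a “golden key” scheme structurally looks like. This is illustrative Python, not any real proposal; actual escrow designs would use public-key wrapping rather than the toy XOR wrapping below, but they share the same single point of failure: every device’s key is also wrapped under one vendor-held secret, so stealing that one secret unlocks every device at once.

```python
import os
from hashlib import sha256

GOLDEN_KEY = os.urandom(32)  # the single vendor-held "golden key"

def wrap(disk_key: bytes, wrapping_key: bytes) -> bytes:
    """Toy key-wrapping: XOR against a keystream derived from the
    wrapping key. Wrapping and unwrapping are the same operation,
    so this function doubles as the attacker's decryption routine."""
    stream = sha256(wrapping_key).digest()
    return bytes(a ^ b for a, b in zip(disk_key, stream))

# Millions of devices, each with its own random disk key...
device_keys = [os.urandom(32) for _ in range(1_000)]
# ...and an escrow database wrapping every one under the golden key.
escrow_db = [wrap(k, GOLDEN_KEY) for k in device_keys]

# One stolen secret, and every device's key falls out of the database.
stolen = GOLDEN_KEY
assert [wrap(record, stolen) for record in escrow_db] == device_keys
```

Per-device keys derived on the device leave no such database to steal; the golden-key design concentrates all of the risk in one place.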
So the next time you hear a law enforcement official angrily demand that Apple and Google put backdoors back into their products, remember what they’re really demanding: that everyone’s security be sacrificed in order to make their job marginally easier. Given that decreased security is only one of the nine problems raised by the prospect of cryptography regulation, you should ask yourself: is that trade worth making? We certainly don’t think so, and we applaud Apple and Google for standing up for their customers’ security even if law enforcement doesn’t like it.
- [1] To see a former high-ranking American security official claim the US should match China in terms of restricting the use of privacy-enhancing technology is disconcerting, to put it mildly.
- [2] Whether a person can be required to unlock a device is a complicated question, one tightly intertwined with the basic rights to due process and against self-incrimination, and it ought to be carefully considered by a court in the context of each individual situation.