Earlier this summer, when FBI Director James Comey made his case for backdooring strong encryption, he told us that he wanted to hash out the policy considerations surrounding encryption, law enforcement, and security in public: “Democracies resolve such tensions through robust debate.” This week, we learned what Comey apparently meant: he wanted the debate resolved in secret, before a judge known only to the government, by way of a sealed wiretap order.
A brief article in Monday’s New York Times confirmed a rumor that has been circulating since the beginning of this summer: Apple has been locked in a dispute with the U.S. Department of Justice over iMessage encryption, with the DOJ demanding that Apple hand over plaintext copies of iMessages in real time, pursuant to a wiretap order. Because iMessage uses end-to-end encryption, where only the users hold the keys, Apple cannot comply with such an order unless it re-engineers its system and builds a backdoor for the U.S. government, compromising the security of every iMessage user. That is something Apple has steadfastly refused to do.
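To see why an end-to-end design leaves a provider with nothing useful to hand over, consider a minimal sketch in Python using the cryptography library. This is purely illustrative and is not Apple’s actual iMessage protocol: the two endpoints derive a shared message key from a key exchange, and the service in the middle only ever relays public keys and ciphertext.

```python
# Illustrative sketch only (not Apple's actual iMessage protocol): with
# end-to-end encryption, the message key exists only at the two endpoints,
# so the service in the middle has nothing but ciphertext to hand over.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_key(own_private, peer_public):
    """Both endpoints derive the same symmetric key from a Diffie-Hellman exchange."""
    shared_secret = own_private.exchange(peer_public)
    return HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None, info=b"e2e-demo"
    ).derive(shared_secret)


alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# The server relays only public keys and ciphertext; it never sees these keys.
key_at_alice = derive_key(alice, bob.public_key())
key_at_bob = derive_key(bob, alice.public_key())
assert key_at_alice == key_at_bob

nonce = os.urandom(12)
ciphertext = AESGCM(key_at_alice).encrypt(nonce, b"meet at noon", None)
print(AESGCM(key_at_bob).decrypt(nonce, ciphertext, None))  # b'meet at noon'
```

In a design like this, the only way for the provider to produce plaintext is to change the system itself, which is exactly what the government is asking for.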
According to the Times report, the DOJ obtained a sealed order from an unknown federal district court (not the Foreign Intelligence Surveillance Court) ordering Apple to turn over a suspect’s iMessages. After Apple informed the government that it couldn’t comply with the order, the government backed down rather than seek sanctions or an order holding Apple in contempt.
There’s still much we don’t know about this showdown, but it is far from the robust public debate Director Comey promised. It’s no answer to point to the secret or urgent nature of the case that led the DOJ to seek its wiretap order. Even in the context of ultra-sensitive drug or national security investigations, courts can partially unseal dockets in order to inform the public and allow the participation of interested parties (like EFF!). It’s a technique we’ve seen in important cases concerning cell phone location tracking and even NSA surveillance.
The first “Crypto Wars” of the 90s were largely fought in public over specific technical and policy proposals, such as the Clipper Chip. These proposals couldn’t bear scrutiny, and the pro-backdoor crowd lost. Today, rather than accepting that outcome, the government has instead relied on scary anecdotes and vague calls for the companies themselves to engineer a “golden key” that allows “exceptional access” to encrypted communications—a backdoor, in other words. This allows the government to simply raise the specter of widespread encryption as an overall threat to society without subjecting its demands to the very public debate that Director Comey claimed to welcome. The government’s endgame is worrying: either a secret court order forcing companies to reengineer their systems and never speak about it, or possibly worse, a backroom deal with the government that achieves the same thing.
In addition to the questionable legality of any "exceptional access" requirement, experts in cryptography have raised major concerns about whether such a system could be implemented at all without putting the public at significant risk. Since the DOJ has proposed no concrete technical requirements, at this point we can only guess at how exceptional access might be granted to law enforcement.
Mandated key escrow seems like one real possibility. Key escrow is a system by which a message is encrypted not only to a key belonging to the intended recipient (as with classic public-key cryptography), but also to a key held by law enforcement (or potentially to one key held by law enforcement and one by the company, with both necessary for decryption—known as a “split key” system). But as a group of computer security experts noted, such a scheme would expose the public to greater risk in two ways. First, mandating escrow capabilities increases the complexity of software, and more lines of code mean more opportunities for security vulnerabilities to creep in. Second, as the expert report continues, building out a centralized store of escrowed keys creates a prime target for hackers:
Building in exceptional access would substantially increase system complexity. Security researchers inside and outside government agree that complexity is the enemy of security - every new feature can interact with others to create vulnerabilities.
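To make the escrow idea concrete, here is a minimal sketch of what a split-key mandate might look like, again in Python with the cryptography library. The key names and parameters are hypothetical; this does not describe any system actually proposed by the DOJ. The message key is wrapped to the recipient as usual, and an extra copy is split into two shares, one encrypted to a law-enforcement key and one to a company key, so that both are needed to decrypt.

```python
# Hypothetical split-key escrow sketch (no real system is described here):
# the message key is wrapped to the recipient as usual, and an extra copy is
# split into two XOR shares, one for a law-enforcement key and one for a
# company key, so that decrypting escrowed traffic requires both parties.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def wrap(public_key, secret):
    """Encrypt a small secret to an RSA public key with OAEP."""
    return public_key.encrypt(
        secret,
        padding.OAEP(
            mgf=padding.MGF1(algorithm=hashes.SHA256()),
            algorithm=hashes.SHA256(),
            label=None,
        ),
    )


# Keys for the recipient and the two escrow holders (generated here for illustration).
recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
law_enforcement = rsa.generate_private_key(public_exponent=65537, key_size=2048)
company = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Ordinary end-to-end step: random message key, ciphertext, key wrapped to the recipient.
message_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(message_key).encrypt(nonce, b"hello", None)
wrapped_for_recipient = wrap(recipient.public_key(), message_key)

# The escrow mandate: split the same message key into two XOR shares.
# Either share alone reveals nothing; both together reconstruct the key.
share_a = os.urandom(len(message_key))
share_b = bytes(a ^ b for a, b in zip(message_key, share_a))
escrowed_for_law_enforcement = wrap(law_enforcement.public_key(), share_a)
escrowed_for_company = wrap(company.public_key(), share_b)
```

Even in this toy version, every message now depends on two additional long-lived keys and on the escrow infrastructure that manages them—new, permanent points of failure that simply don’t exist in a purely end-to-end design.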
We’ve seen this in the past with the buildout of wiretapping capabilities: in a notorious case in 2004 and 2005, a hundred top Greek government officials were illegally surveilled for ten months by parties unknown after Greece implemented a lawful access program that was subsequently compromised. And the U.S. is no exception. As described in the expert report, an NSA audit found security flaws in all of the telephone switches built to comply with government wiretapping requirements.
These lawful interception capabilities went wrong even within well-funded, centralized, and highly organized telecommunications infrastructure. We have every reason to believe that the problem will only get worse when the onus is put on the plethora of end-to-end encryption providers, often startups, small teams, or independent developers, to build mandatory backdoors into their software.
Finally, government demands for access to users’ private communications raise major policy and regulatory questions. Can any government force a U.S. corporation to fork over user data stored outside the U.S.? Will an app developer in the United States with users in Russia be forced to build in a Russian backdoor? Will companies operating in multiple countries be forced to build in separate decryption keys for each of the countries where they operate?
These questions should be discussed in a public forum with public participation before any such system is built, not settled by secret court decisions issued under gag orders. We applaud Apple’s resolve in standing firm, and we strongly urge the government to bring this debate out into the open where it belongs.