Of course, at this point, we don't even know if the Paris attackers used encryption. There's speculation they did, because reports suggest that no intelligence agency has found any traffic by them. But right now it's just that: speculation.
Still, even if the perpetrators of Friday's attack did indeed use encryption to hide their plans, there are good reasons why creating a system of so-called exceptional access for governments is not just impractical, but risky.
Cryptography has many important functions beyond the obvious one of securing our communications, including authenticating data. And the security of all modern cryptosystems rests entirely on keeping a key -- a very long number -- absolutely secret. (It's a myth, incidentally, that the design itself must be kept secret; only the key must be.) Generally speaking, every message uses a different key, though there are long-lived keys for things like websites. Such keys are used by virtually all of us, every day -- the password or PIN you use to unlock your laptop or phone, for example, serves as a cryptographic key.
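To make that concrete, here is a toy sketch in Python -- an illustration only, not how real messaging apps work: a one-time pad, in which the key really is just a very long secret number, and the algorithm itself can be completely public.

```python
import secrets

def encrypt(key: bytes, message: bytes) -> bytes:
    """XOR each message byte with a key byte (a one-time pad).
    The algorithm is public; only the key is secret."""
    assert len(key) >= len(message), "a one-time pad key must be at least as long as the message"
    return bytes(k ^ m for k, m in zip(key, message))

# The key is just a very long random number, kept absolutely secret.
key = secrets.token_bytes(32)
ciphertext = encrypt(key, b"attack plans")

# XOR is its own inverse, so anyone holding the key can decrypt.
assert encrypt(key, ciphertext) == b"attack plans"
```

Everything here -- the code, the method -- can be published to the world; the system's entire security lives in that one secret number.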
It turns out, though, that cryptosystems are really, really hard to get right. For example, one of the earliest modern designs, published in 1978, had a flaw that went unnoticed until 1996. In modern notation, it was only three lines long -- three lines, yet it took 18 years and automated tools to find a flaw that in retrospect was blindingly obvious. Cryptosystems are also delicate, meaning that even small, seemingly innocuous changes can utterly ruin one.
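That delicacy is easy to demonstrate. As a hypothetical sketch (not the 1978 flaw itself), consider reusing a one-time pad key -- an apparently harmless shortcut that completely destroys the system, because an eavesdropper can XOR the two ciphertexts together and the key cancels out:

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key = secrets.token_bytes(32)          # secret key, mistakenly used twice
c1 = xor(key, b"meet at dawn ")
c2 = xor(key, b"meet at noon ")

# The attacker never sees the key, yet XORing the two ciphertexts
# cancels it, exposing the XOR of the two plaintexts:
leak = xor(c1, c2)
assert leak == xor(b"meet at dawn ", b"meet at noon ")

# Wherever a leaked byte is zero, the two messages agree:
assert leak[:8] == bytes(8)            # "meet at " matches in both
```

One reused number, and messages that were perfectly protected a moment ago start leaking their contents.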
Such examples should underscore why exceptional access is so dangerous. Cryptosystems, which are so hard to get right in the first place, are designed to protect keys. To provide exceptional access, cryptographers would need to design new systems that protect keys from everyone -- except when they're supposed to hand out a secure extra copy of that key. Given how difficult these systems are to design correctly, the odds of it being done properly, by every one of the 101 apps you have on your phone, are very, very low.
And even if the cryptographers do their job, the programmers can get it wrong. If the exceptional access software is buggy -- and almost every time a website won't load, an app crashes or you have to reboot your computer, buggy software is to blame, so you can see how common a problem it is -- the secret key can leak out that way. Even minor coding errors can result in a key being shared insecurely.
Alternate designs, where some intermediate party can decrypt the content, suffer from a different but equally serious flaw: If that party is hacked, all messages are exposed.
But let's assume that, against all odds, the cryptographers and programmers get it right. Even then, there are still potential problems, especially when it comes to working with other countries. For example, how does the U.S. government handle exceptional access requests from other countries?
Britain seems an easy case, but what about France? It wasn't all that long ago that the French were being accused of spying on American corporations. China? Does it want to fight terrorists or dissidents? Or to spy on Americans' traffic? Besides, the U.S. government has condemned China's demands for its own exceptional access mechanisms. Russia? U.S. government officials have claimed that Russia regularly engages in commercial espionage against U.S. companies.
Then there is organized crime. A former prosecutor I knew told me she was terrified of exceptional access, warning that drug gangs were powerful enough and vicious enough that they could steal keys, simply by threatening or bribing the custodians.
Ultimately, though, it won't matter whether we perfect exceptional access for law-abiding citizens, because the terrorists we're really interested in monitoring won't use software with such mechanisms built in. ISIS has already published warnings about apps it considers insecure, and it would certainly add any monitored apps to that list or get its code elsewhere. Remember, the United States has no monopoly on programmers.
Even if every country in the world had similarly strict requirements for encryption and access, it still wouldn't solve the problem of terrorists and others developing their own secure communications apps. Indeed, there's evidence they're already doing so. And failing this, they could just resort to couriers, as bin Laden did, or even in-person meetings -- massacres can be planned over the Internet, but they can also be concocted in training camps in Syria or an apartment in Paris.
It would be nice if we could safely and effectively change our cryptography to let us spy on the bad guys. Unfortunately, we can't. So if we insist on systems that allow exceptional access, we end up weakening our own security without enhancing our ability to monitor the terrorists -- and in the process, we may just make it easier for them to exploit our weakened cryptosystems and do us more harm.