I tweeted this article about the severe implications of the government's request for a backdoor in Apple's products and included the comment “Imagine the 1st missionary killed in a hostile land, found via an FBI mandated backdoor. This is why Apple is right.” A friend of mine asked me on Facebook why it is so crucial Apple not be forced to create a system that would allow the unlocking of the San Bernardino terror suspect's phone. I want to answer my friend's question by exploring two different parts of the problem.
This all starts with Apple creating an encryption system to which it did not hold the key. After Edward Snowden's revelations about the NSA, Apple built such a system for a simple reason: it had become clear that the government intended to vastly exceed its constitutional surveillance powers, and the only way a company like Apple could avoid becoming a collaborator was to remove itself from the key equation so that it genuinely could not access customer data. If a company holds the key, the government can demand it, not only to see what a terrorist has on his or her phone, but also for other, less desirable searches like the warrantless, broad data collection the NSA has conducted over the last decade. Worse, when the government exercises these unconstitutional powers, it imposes gag orders on the companies involved, so they cannot even say what is happening.
It bears repeating: while there is broad support for breaking into a terrorist's phone, the only way Apple can legally avoid being made a tool for the government against all of us, not just terrorists, is to make a product without a backdoor. So Apple did the logical thing: it created a product without any backdoors. Apple is now being asked not just to “unlock” the phone, but to create a new version of its software with an intentionally broken security system. Once that software exists, even installed on only this one phone, we are only a few secret FISA orders away from it being installed on thousands or millions of phones. If an iOS variant with that vulnerability exists, the NSA can simply contact Apple six months from now and order the same backdoor included in every iOS device the next time a software update goes out. And it could gag Apple so that the company could not warn anyone.
Consider how much data the average smartphone holds: it knows where we have walked and driven, it knows whom we call and write and much of what we say. Most recent phones can listen for an activation command like “Hey, Siri” or “OK, Google,” which means they are essentially listening to everything we do. Add a fitness tracker or smartwatch and they know when we sleep, how much exercise we get, even what our vital signs are.
Consider a not-so-far-fetched example: what if, as part of trying to predict crime using Big Data, the government wanted to know where everyone was and whether their heart rate had increased, on the assumption that someone near a likely crime location with an elevated heart rate might be perpetrating a crime? That would be doable with modern smartphones carrying government-mandated backdoors, but it would obviously strip away whatever modicum of privacy we might still think we have. Think about that for a moment and it becomes clear why we want the data on our phones encrypted in a way that only we, the creators of the information, can ever decrypt.
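The idea that only the data's creator can decrypt it rests on a simple design choice: derive the encryption key from a secret only the user knows, so the vendor never holds anything it could be compelled to hand over. A toy Python sketch of that principle (this is an illustration, not Apple's actual scheme, which entangles keys with hardware) might look like:

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # The key comes from the user's passcode; the vendor never sees it.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy counter-mode keystream built from SHA-256 (illustration only).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passcode: str, plaintext: bytes) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(16)
    ks = keystream(derive_key(passcode, salt), nonce, len(plaintext))
    return salt + nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(passcode: str, blob: bytes) -> bytes:
    salt, nonce, ct = blob[:16], blob[16:32], blob[32:]
    ks = keystream(derive_key(passcode, salt), nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))
```

The point of the sketch: without the passcode, decryption yields garbage, and there is no master key anywhere for a court order to reach. The only way to change that is to ship different software, which is exactly what the government is demanding.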
There is even a good, “patriotic,” economic reason for concern. If all American devices could carry government-mandated weakened security, it would kill Apple and other American electronics manufacturers overseas. Who wants to buy an Apple product in, say, Germany if its software might have been modified by order of the CIA?
Let's say, though, that I take an entirely non-cynical approach: I assume the American government always wants to do the right thing and that giving it what it wants here will not lead to any abuse of power. The second issue is the one the article I linked to identifies. If Apple creates a backdoor and activates it even on this one phone, what will stop a repressive government like China's, or even a hostile non-governmental entity like a terrorist group, from locating that security hole and then using it to track down and kill missionaries, pro-democracy activists, or even our own, well-intentioned government agents working overseas?
Over the years, I have seen many companies create a backdoor for one well-intentioned reason or another. Inevitably, a malicious actor discovers it and takes full advantage of it. Apple, by contrast, has built a tightly controlled, secure software distribution channel and a system that not even Apple can fully access, and it has aggressively patched any discovered holes that could be used to hack it, precisely to keep its users from being attacked.
That is why I framed my initial link the way I did: a lot of people are probably cheering on the government because no one wants to be on the side of terrorists and, at the superficial level, Apple's foes can try to make it look like the company cares more about a terrorist's privacy than our national security. However, Apple does not care about a terrorist's privacy — Apple realizes that to break the system in order to get into the terrorist's phone will break the system for everyone else, too.
This is as close to a Pandora's box scenario as you can get. If the government prevails, we will have no electronic privacy while using “legal” devices. The terrorists, on the other hand, will migrate to open source solutions; sure, those systems might be forced to implement the same government-mandated backdoors, but the terrorists will simply remove them. They will enjoy complete privacy while law-abiding citizens find themselves under an ever-growing police state.
What was it that Ben Franklin said about security and freedom again?