I tweeted this article about the severe implications of the government's request for a backdoor in Apple's products and included the comment “Imagine the 1st missionary killed in a hostile land, found via an FBI mandated backdoor. This is why Apple is right.” A friend of mine asked me on Facebook why it is so crucial that Apple not be forced to create a system that would allow the unlocking of the San Bernardino terror suspect's phone. I want to answer my friend's question by exploring two different parts of the problem.
It all starts with Apple creating an encryption system to which it does not hold the key. After the revelations Edward Snowden made about the NSA, Apple built such a system for a very simple reason: it had become clear that the government intended to vastly exceed its constitutional surveillance powers, and the only way a company like Apple could avoid becoming a collaborator was to remove itself from the key equation so that it genuinely could not access customer data. If a company holds the key, the government can demand it not only to see what a terrorist has on his or her phone, but also for other, less desirable searches, like the warrantless, broad data collection the NSA has conducted over the last decade. Worse, when the government exercises these unconstitutional powers, it imposes gag orders on the companies it deals with so they cannot even say anything about what is happening.
It bears repeating: while there is broad support for breaking into a terrorist's phone, the only way Apple can legally avoid being made a tool for the government against all of us, not just terrorists, is to make a product that has no backdoor. So Apple did the logical thing and built exactly that. Apple is now being asked not just to “unlock” the phone, but to create a new version of its software with an intentionally broken security system. Once that version exists, even if it were installed on only this one phone, we would be only a few secret FISA orders away from it being installed on thousands or millions of phones. If an iOS variant with that vulnerability exists, the NSA can simply contact Apple six months from now and order that the same backdoor be included in every iOS device the next time a software update goes out. And it could gag Apple so that the company could not warn anyone.
A fascinating op-ed in the New York Times from Malte Spitz:
In Germany, whenever the government begins to infringe on individual freedom, society stands up. Given our history, we Germans are not willing to trade in our liberty for potentially better security. Germans have experienced firsthand what happens when the government knows too much about someone. In the past 80 years, Germans have felt the betrayal of neighbors who informed for the Gestapo and the fear that best friends might be potential informants for the Stasi. Homes were tapped. Millions were monitored.
Do those last two short sentences sound familiar?
Three weeks ago, when the news broke about the National Security Agency’s collection of metadata in the United States, I knew exactly what it meant. My records revealed the movements of a single individual; now imagine if you had access to millions of similar data sets. You could easily draw maps, tracing communication and movement. You could see which individuals, families or groups were communicating with one another. You could identify any social group and determine its major actors.
All of this is possible without knowing the specific content of a conversation, just technical information — the sender and recipient, the time and duration of the call and the geolocation data.