This is a really perplexing question for me. I read all of the articles and sat and pondered for a while, and still, no solution is overwhelmingly clear to me.
The government’s primary argument is national security. They posit that Apple ought to cooperate and build in a feature allowing simple entry into the locked phone, so that investigators can access data that will further their investigation of a terrorist attack.
On one hand, the government has been doing this for years – collecting data from people and corporations to aid in their investigations. For the most part, people have gone along with it. And for the most part, the government seems to handle the evidence responsibly.
On the other hand, the changing presence of technology in our society has morphed the playing field – it’s not as cut and dried as testifying that you saw someone do something. In order to turn the evidence over to the government, Apple would have to develop a modified operating system that would allow anyone in possession of it to break into essentially any phone.
Of course, that software would likely be kept as secure as possible in Apple’s safe, but one of the biggest worries is that once the precedent is set, the government could run this procedure whenever it wants.
If we trust our government to use that power and the evidence that comes with it responsibly, it seems to me that it is an okay precedent. Just like we’re willing to let the government come search our homes (an invasion of privacy) if they have a search warrant, we ought to be willing to let them read our emails or look at our pictures.
But that’s a big if.
The whole dispute/investigation/case is a bit dramatic, in my opinion. I think this is an important debate to have (privacy vs. security) but framing it as “encryption is the reason the Paris attacks happened” vs. “giving the backdoor to the government is the end of your privacy forever” is overblown.
With regard to the government’s melodrama: regardless of whether or not the terrorists in Paris achieved their ends with encryption, there shouldn’t be a reasonable expectation that the intelligence powers of the world can bust every potential attacker. People can be secretive with or without encryption (steganography, for instance).
In the truck driver sexual assault case that the government cites (in which an incriminating video on the truck driver’s phone led to a conviction), an investigation stalled by a lack of access to the phone would proceed much the way it would have 30 years ago, when the truck driver wouldn’t have been carrying a smartphone at all. Sure, there are obvious uses for subpoenaing smartphones, and it is vitally important for the government to do its best to keep up with the technology curve, but relying solely on evidence from smartphones would be unsafe hyper-dependence.
With regard to Apple’s comments, it seems that 100% secure, end-to-end encryption might be too much if it conflicts with the government’s ability to access warranted evidence. Apple may feel compelled to offer customers the finest encryption service in the land, but they should also feel morally obligated to prevent evil acts from happening when it’s in their power.
I’m not suggesting they snoop around in people’s data, but if it is determined that the FBI has warranted power to get the information, Apple (and other tech providers) should be able to comply.
I’ve never felt strongly about the “nothing to hide, nothing to fear” argument, but maybe that’s just because I haven’t seen it exploited before…