Thursday, February 18, 2016

The Encryption Debate

          Previously, I wrote about legislation under review by Congress that would bar individual states from banning encryption. If passed, it would mean that encryption could only be banned at the national level, not state by state. However, while encryption is not yet banned in any state, that has not stopped the government and the FBI from pressuring tech giants such as Apple and Google to create a "backdoor" into their phones: a mechanism that would, in theory, let only the FBI or the government access data locked behind a password or fingerprint. This has created a large debate over whether these tech companies should bend to the will of the government and potentially sacrifice the security of user data.
          The main reason this pressure has arisen is the terrorist attack in San Bernardino, California. On December 2, 2015, 14 people were killed and 22 were seriously injured there in a mass shooting and attempted bombing. While the government has suspects in custody, it has little ground to prosecute them without more evidence, and it has reason to believe that more people behind the shooting and attempted bombing are still out on the streets, roaming free and preparing for more terror. The FBI believes that the key to finding those remaining suspects, and to convicting the ones in custody, lies in the encrypted data on one terrorist's iPhone. But the phone is locked, and Apple has said it cannot access the phone's data, since it holds neither the password nor the fingerprint. Because neither the FBI nor Apple can get into a locked phone, the FBI argues that Apple should build a backdoor into all future iPhones (it could even reach older iPhones through a simple software update), allowing Apple and any government agency to access a device's data whenever its owner has been accused of breaking the law or committing an act of terror.
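          To make the technical claim concrete, here is a minimal sketch, in Python, of the general idea behind passcode-based encryption. It is an illustration of the principle, not Apple's actual scheme (which also entangles keys with device hardware); the passcodes and parameters are hypothetical. The point is that the decryption key is derived from the passcode itself, so a company that never stores the passcode cannot recompute the key:

    # Minimal sketch of passcode-derived encryption keys (illustrative
    # only; not Apple's real implementation). Standard library only.
    import hashlib
    import hmac
    import os

    def derive_key(passcode: str, salt: bytes) -> bytes:
        # Stretch the passcode into a 256-bit key; the high iteration
        # count makes every brute-force guess expensive.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

    salt = os.urandom(16)            # stored on the device; useless alone
    real_key = derive_key("1234", salt)
    guess = derive_key("0000", salt)

    # A wrong passcode yields a completely different key, so the encrypted
    # data stays opaque; compare in constant time, as real systems do.
    print(hmac.compare_digest(real_key, guess))   # False

Without the passcode there is simply nothing for Apple to hand over; the key cannot be recomputed from anything stored on the phone.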
           This sounds like a smart move in terms of putting terrorists in jail, but others argue that by pressuring Apple like this, the government is abusing the relationship between government and business. Some believe it is undemocratic to force a business to change its practices and compromise the data of millions, if not billions, of users for the benefit of one nation. Opponents of such a policy also argue that it violates our rights as protected citizens with protected property, but as I pointed out in my previous blog, the legal game has become much more complex with the introduction of encryption. The amendments were written in a time when the hardest evidence to reach sat in a tiny safe hidden under the floorboards, a safe that could always be cut open; that framework no longer applies to data on a phone locked down with 1s and 0s.
           What has changed since my last blog post are Apple's statements in this continuing debate. A public letter, signed by Apple CEO Tim Cook and published Tuesday, warns:

"A backdoor to the iPhone would be something we consider too dangerous to create. The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers -- including tens of millions of American citizens -- from sophisticated hackers and cyber criminals," the letter said. "Such a move would be an unprecedented step, threatening the security of Apple's customers. No reasonable person would find that acceptable."
           Clearly, Apple has made its decision: it has sided with protecting users' privacy from its own government and from hackers, rather than with protecting the world from terrorist attacks. Well, it's clearly not as simple as that, but I side with Apple on this one. While I have nothing to hide, I don't think we have come this far as a society only for one person to exploit billions of devices through a single backdoor. I would rather one terrorist escape than have the whole world's data compromised because the United States government wanted to read someone's emails.
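           To illustrate why a single backdoor endangers every device at once, here is a toy sketch, again in Python, of key escrow: every device's data is also locked under one vendor-held master key. The XOR construction and the MASTER_KEY value are hypothetical stand-ins, not real cryptography; the point is only that one leaked secret opens everything.

    # Toy illustration of key escrow (NOT real cryptography): if every
    # device is also wrapped under one master key, whoever steals that
    # single key can read every device at once.
    import hashlib

    MASTER_KEY = b"hypothetical-escrowed-master-key"

    def keystream(key: bytes, n: int) -> bytes:
        # Derive n pseudo-random bytes from the key (toy construction).
        out = b""
        counter = 0
        while len(out) < n:
            out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:n]

    def xor_cipher(key: bytes, data: bytes) -> bytes:
        # XOR with the keystream; applying it twice decrypts.
        return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

    phones = [b"alice's messages", b"bob's photos", b"carol's email"]
    encrypted = [xor_cipher(MASTER_KEY, d) for d in phones]

    # An attacker who obtains the one master key decrypts all of them.
    print([xor_cipher(MASTER_KEY, c) for c in encrypted])

One secret, billions of lockers: that asymmetry is the heart of the argument against a backdoor, however well-intentioned the keyholder.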
          Google has also weighed in within the last 24 hours on Twitter, with CEO Sundar Pichai stating:
"Important post by @tim_cook. Forcing companies to enable hacking could compromise users’ privacy. We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism. We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders. But that’s wholly different than requiring companies to enable hacking of customer devices & data. Could be a troubling precedent. Looking forward to a thoughtful and open discussion on this important issue."
           I think both Apple and Google have taken admirable stands on this issue. If a government needs access to a device, the request is perfectly legal, and the company is able to comply, so be it: help the court case. But requiring a company to open up millions, if not billions, of devices is outrageous and dangerous, and asks us all to give up even more of our privacy.

-897 words
