After the horrific terrorist attack at a workplace holiday party in San Bernardino, California, the Federal Bureau of Investigation (FBI) began its work: what drove the suspects – a Muslim couple with an infant – to kill innocent civilians? Were they acting at the behest of ISIS?
Investigators found the iPhone of Syed Farook, but couldn’t get past the passcode to examine his contacts. Apple CEO Tim Cook refused a court order, sought by the FBI, to create a software “backdoor.”
Critics cried foul, accusing the FBI of seeking a case with which it could set a legal precedent. Cook held firm. Privacy protests erupted.
This week, the FBI announced it had used a third party to successfully hack Farook’s smartphone. The news raised obvious questions: why did U.S. authorities try to legally compel Apple to create a backdoor? Who wins in such cases? Are we safer when officials can force digital companies to make hackable products? Or must personal privacy always trump security?
The Apple vs. FBI Fight Is Over. Now We Can Have an Adult Conversation about Your iPhone.
Peter Weber – The Week
Apple’s battle with the FBI never settled the larger questions about the strength of encryption or the proper limits of government surveillance, and its abrupt end didn’t resolve the fight between tech companies and the government. Apple will eventually patch the hole found by the FBI’s hired hackers, tech companies will build more hack-proof devices and more secure methods of communication, and law enforcement agencies will demand access.
There are no easy answers — absolute privacy has real downsides, as does invasive surveillance — but waiting for the next crisis isn’t a good option. And the old way of coming up with viable solutions, bringing the biggest stakeholders together in a room to talk, isn’t working. President Obama has convened plenty of cybersecurity summits with technology powerhouses. But these companies feel burned by the Edward Snowden leaks, which revealed a previously cozy relationship between Silicon Valley and government eavesdroppers.
Nobody Won the Apple-FBI Standoff
Fred Kaplan – Slate
For years, Apple has crafted a brand based on its absolute commitment to security.
Unlike Google and other giants of Silicon Valley, Apple doesn’t sell your data. Its operating systems aren’t licensed to other companies and are, so it’s claimed, much more resistant to hacking. In the past month’s court fight, the FBI was polishing Apple’s luster by claiming that not even the Federal Bureau of Investigation could crack the iPhone’s code without the help of Tim Cook’s engineers.
And now, some hole-in-the-wall private hacking firm has done the undoable—hacked its way in. Apple may have dodged a costly court battle, but its brand has been bruised.
FBI Reversal on iPhone Raises Troubling Questions
Mike Hashimoto – Dallas Morning News
The FBI, remember, insisted it had exhausted all known techniques and still could not bypass Apple encryption to crack into the iPhone 5c used by Syed Farook of the husband-wife terrorist team that shot up an employee celebration in San Bernardino, California….
One phone, in the interest of national security, sounds OK, right? A key that unlocks all phones, regardless of anyone’s assurances, sounds less OK. Far less, especially if you’re an Apple customer who relies on the security of your stored information.
And what are FBI assurances worth? Just this week, after insisting to a federal court that this investigative task was impossible without Apple’s acquiescence, the FBI called a timeout. Shazam, it may have found a way to unlock that Farook phone, after all.
Apple CEO Tim Cook’s own words on the FBI request:
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists….
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
There’s No Such Thing as a Safe Backdoor
Denelle Dixon-Thayer – Time
Too many recent incidents have already undermined that trust and shown that modern communications tools are not as secure as they need to be. That is why we care about the outcome of this case….
And here is a dirty secret: as bad as this could be for Apple, it could be worse for others and their users. Many companies do not have the resources or the engineering know-how to safeguard the malware they may be forced to create if the government is allowed to prevail here. We also need to understand that any backdoor a company is asked to build is a door that can be opened not just by law enforcement, but by anyone roaming the Web who wants to do harm.
There is just no “safe” backdoor. You are either safe or you are not.