Thursday News: The one about Apple and the FBI
The Apple-FBI Fight Isn’t About Privacy vs. Security. Don’t Be Misled – The FBI’s demand that Apple create software that lets law enforcement (in this case, the FBI) into people’s phones marks an important crossroads in the tradeoff many people believe we must accept between privacy and security. USians may remember that it was exactly this logic — you must sacrifice your civil rights for safety — that the US government used to justify the mass unwarranted surveillance following 9/11. And Apple’s Tim Cook recently revealed that Apple did not even know about the FBI’s “request” until it was a mainstream news item. And while this issue is currently being litigated in US courts, the concern is hardly limited to USians or US law. So I’m going to go ahead and run several stories related to this issue today, rather than spacing them out across a week or so.
This piece from Wired challenges the logic that the government is using, which has historically been very effective (the zero sum privacy v. (inter)national security argument). As Brian Barrett points out (via interviews with experts from the EFF and elsewhere), security is actually not automatically, well, secured when you provide a law enforcement entity with a brand new technology they can use to break into anyone’s phone. Remember Jewel v. NSA? That was pre-Snowden, and it’s still in the courts, with plaintiffs only just given the right to conduct discovery, after a decade of ongoing litigation.
“It would be great if we could make a backdoor that only the FBI could walk through,” says Nate Cardozo, an attorney with the Electronic Frontier Foundation. “But that doesn’t exist. And literally every single mathematician, cryptographer, and computer scientist who’s looked at it has agreed.”
The current Apple case doesn’t involve a backdoor in the traditional sense. The FBI is asking Apple to create a tool that would circumvent a feature that deletes all of the information on the phone after 10 failed password attempts. “We don’t want to break anyone’s encryption or set a master key loose on the land,” Comey wrote. But the authority it would grant the FBI could be used again across a range of scenarios that weaken our privacy, sure, but our security as well. . . .
So far, buoyed by the specter of terrorism and the false duality of privacy and security, the public in general is buying what the FBI is selling. A recent Pew Research poll found that 51 percent of Americans think Apple “Should unlock the iPhone to assist the ongoing FBI investigation,” while 38 percent say Apple should not. (The rest had no opinion.) Even the survey itself shows how effective the FBI’s messaging has been. Apple is not being asked to unlock an iPhone; it’s being asked to create software that would help the FBI unlock it. After which, there’s every reason to expect Apple and every other tech company will be asked to create more software that could be used to diminish even more civil liberties. – Wired
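To make the mechanism at issue concrete, here is a minimal sketch — illustrative only, not Apple’s actual implementation — of the kind of erase-after-ten-failures policy the Wired piece describes, the safeguard the FBI is asking Apple to circumvent (the `Device` class, its `MAX_ATTEMPTS` limit, and the guesses are all invented for illustration):

```python
# Illustrative sketch only -- NOT Apple's actual implementation.
# It models the policy at issue: a device that erases its data after
# ten failed passcode attempts, which is the safeguard the FBI asked
# Apple to circumvent so passcodes could be brute-forced.

class Device:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False  # True once the auto-erase has fired

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # data already destroyed; nothing to unlock
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= self.MAX_ATTEMPTS:
            self.wiped = True  # auto-erase: discard the data/key material
        return False

# A brute-force attacker gets at most ten guesses:
d = Device("4321")
for n in range(10):
    d.try_unlock(f"{n:04d}")  # "0000" .. "0009", all wrong
print(d.wiped)  # prints True: the device has erased itself
```

Removing that `MAX_ATTEMPTS` check — which is in effect what the requested software would do — turns a ten-guess problem into an unlimited brute-force search, which is why the experts quoted above treat the request as a backdoor in effect even if not in the traditional sense.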
In Debate Over Apple-FBI Dispute, Gates And Zuckerberg Don’t Agree – Now here’s a surprise (not really): Bill Gates thinks that Apple should comply with the FBI’s order to carve a “backdoor” into the iPhone for law enforcement to use, unfettered, whenever they want — echoing that privacy v. safety logic the FBI wants everyone to uncritically digest. Zuckerberg has taken the opposite view, a view which, ironically, Microsoft as a corporate entity has taken as well:
“We’re sympathetic with Apple on this one. We believe in encryption,” Zuckerberg said, according to re/code. “I expect it’s not the right thing to try to block that from the mainstream products people want to use. And I think it’s not going to be the right regulatory or economic policy to put in place.”
Gates’ views set him apart from the public stance his company has taken. As part of the Reform Government Surveillance coalition — a group of technology giants that also includes Apple and Facebook — Microsoft joined a statement that acknowledges the challenges and goals of law enforcement, and also states, “But technology companies should not be required to build in backdoors to the technologies that keep their users’ information secure.” – NPR
I got hacked mid-air while writing an Apple-FBI story – As Steven Petrow was working on a story about Apple and the FBI, especially about whether the issue would affect all of us “average” people, a fellow passenger used the public wifi on the plane to hack his email and read all of his messages. The guy confessed this to Petrow after the flight, and it brought into sharp focus the dangers of lowering the bar for security on not only the iPhone, but every other technology that would inevitably be subject to any kind of “backdoor” entrance. Note the list of measures to protect your personal privacy and security at the end of the story (including links to Congress for USians who want to express their support for Apple, because Congress has legislative authority here, which it has not yet exercised).
ETA: This morning, USA Today posted another piece about the obvious dangers of working on public wifi, especially with an unencrypted email account. Also, check out Sunita’s excellent discussion of apps and productivity (wherein she reminds us that a VPN is essential these days).
One of my emails was pretty explicit about the focus of my story and I had emailed Bruce Schneier, a security expert who had previously written in the Washington Post about this very issue.
“The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises,” Schneier wrote.
“The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized. The FBI may be targeting the iPhone of the San Bernardino shooter, but its actions imperil us all.” – USA Today
Apple Is Said to Be Trying to Make It Harder to Hack iPhones – Yes, Apple is already working to boost its security, and once it does, the cycle currently playing out in the courts will repeat and repeat. Many have observed the obvious: that Congress can and should act. What Congress does, however, depends in part on what its members think their constituents want (especially in an election year). I don’t know whether anyone believes Apple is motivated by a principled desire to protect consumer information so much as it is appalled at being asked to change its technology to please (and give potentially free-for-all access to) the government. But whatever the primary motivation, the current situation has highlighted the need to boost the kind of security that protects people from unwarranted surveillance and other privacy violations, and if Apple is smart, it will lead the charge toward that important goal.
In many ways, Apple’s response continues a trend that has persisted in Silicon Valley since Mr. Snowden’s revelations. Yahoo, for instance, left its email service unencrypted for years. After Mr. Snowden revealed the National Security Agency surveillance, the company quickly announced plans to encrypt email. Google similarly moved to fix a vulnerability that the government was using to hack into company data centers.
Apple’s showdown with the Justice Department is different in one important way. Now that the government has tried to force Apple to hack its own code, security officials say, the company must view itself as the vulnerability.
“This is the first time that Apple has been included in their own threat model,” Mr. Zdziarski said. “I don’t think Apple ever considered becoming a compelled arm of the government.” – New York Times