Law Enforcement Freaks Out Over Apple & Google's Decision To Encrypt Phone Info By Default

by Mike Masnick
Techdirt
Sep. 24, 2014

Last week, we noted that it was good news to see both Apple and Google highlight plans to encrypt certain phone information by default on new versions of their mobile operating systems, making that information no longer obtainable by those companies and, by extension, governments and law enforcement showing up with warrants and court orders. Having giant tech companies competing on how well they protect your privacy? That's new... and awesome. Except, of course, if you're law enforcement. In those cases, these announcements are apparently cause for a general freakout about how we're all going to die. From the Wall Street Journal:
One Justice Department official said that if the new systems work as advertised, they will make it harder, if not impossible, to solve some cases. Another said the companies have promised customers "the equivalent of a house that can't be searched, or a car trunk that could never be opened."

Andrew Weissmann, a former Federal Bureau of Investigation general counsel, called Apple's announcement outrageous, because even a judge's decision that there is probable cause to suspect a crime has been committed won't get Apple to help retrieve potential evidence. Apple is "announcing to criminals, 'use this,'" he said. "You could have people who are defrauded, threatened, or even at the extreme, terrorists using it."

The level of privacy described by Apple and Google is "wonderful until it's your kid who is kidnapped and being abused, and because of the technology, we can't get to them," said Ronald Hosko, who left the FBI earlier this year as the head of its criminal-investigations division. "Who's going to get lost because of this, and we're not going to crack the case?"
That Hosko guy apparently gets around. Here he is freaking out in the Washington Post as well:
Ronald T. Hosko, the former head of the FBI's criminal investigative division, called the move by Apple "problematic," saying it will contribute to the steady decrease of law enforcement's ability to collect key evidence -- to solve crimes and prevent them. The agency long has publicly worried about the "going dark" problem, in which the rising use of encryption across a range of services has undermined government's ability to conduct surveillance, even when it is legally authorized.

"Our ability to act on data that does exist ."‰."‰. is critical to our success," Hosko said. He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information.
Think of the children! And the children killed by terrorists! And just be afraid! Of course, this is the usual refrain any time more privacy is added to products, or when laws are changed to better protect privacy. And it's almost always bogus. I'm reminded of all the fretting and worries by law enforcement types about how "free WiFi" and Tor would mean that criminals could get away with all sorts of stuff. Except, as we've seen, good old-fashioned police/detective work can still track down criminals. The information on the phone is not the only evidence, and criminals almost always leave other trails of information.

No one has any proactive obligation to make life easier for law enforcement.

Orin Kerr, who regularly writes on privacy, technology and "cybercrime" issues, announced that he was troubled by this move, though he later downgraded his concerns to "more information needed." His initial argument was that since the only thing these moves appeared to do was keep out law enforcement, he couldn't see how it was helpful:
If I understand how it works, the only time the new design matters is when the government has a search warrant, signed by a judge, based on a finding of probable cause. Under the old operating system, Apple could execute a lawful warrant and give law enforcement the data on the phone. Under the new operating system, that warrant is a nullity. It's just a nice piece of paper with a judge's signature. Because Apple demands a warrant to decrypt a phone when it is capable of doing so, the only time Apple's inability to do that makes a difference is when the government has a valid warrant. The policy switch doesn't stop hackers, trespassers, or rogue agents. It only stops lawful investigations with lawful warrants.

Apple's design change is one it is legally authorized to make, to be clear. Apple can't intentionally obstruct justice in a specific case, but it is generally up to Apple to design its operating system as it pleases. So it's lawful on Apple's part. But here's the question to consider: How is the public interest served by a policy that only thwarts lawful search warrants?
His "downgraded" concern comes after many people pointed out that by leaving backdoors in its technology, Apple (and others) are also leaving open security vulnerabilities for others to exploit. He says he was under the impression that the backdoors required physical access to the phones in question, but if there were remote capabilities, perhaps Apple's move is more reasonable.

Perhaps the best response (which covers everything I was going to say before I spotted this) comes from Mark Draughn, who details "the dangerous thinking" by those like Kerr who are concerned about this. He covers the issue above about how any vulnerability left by Apple or Google is a vulnerability open to being exploited, but then makes a further (and more important) point: this isn't about them, it's about us and protecting our privacy:
You know what? I don't give a damn what Apple thinks. Or their general counsel. The data stored on my phone isn't encrypted because Apple wants it encrypted. It's encrypted because I want it encrypted. I chose this phone, and I chose to use an operating system that encrypts my data. The reason Apple can't decrypt my data is because I installed an operating system that doesn't allow them to.

I'm writing this post on a couple of my computers that run versions of Microsoft Windows. Unsurprisingly, Apple can't decrypt the data on these computers either. That this operating system software is from Microsoft rather than Apple is beside the point. The fact is that Apple can't decrypt the data on these computers because I've chosen to use software that doesn't allow them to. The same would be true if I was posting from my iPhone. That Apple wrote the software doesn't change my decision to encrypt.
Furthermore, he notes that nothing Apple and Google are doing now on phones is any different from tons of software already available for desktop/laptop computers:
I've been using the encryption features in Microsoft Windows for years, and Microsoft makes it very clear that if I lose the pass code for my data, not even Microsoft can recover it. I created the encryption key, which is only stored on my computer, and I created the password that protects the key, which is only stored in my brain. Anyone that needs data on my computer has to go through me. (Actually, the practical implementation of this system has a few cracks, so it's not quite that secure, but I don't think that affects my argument. Neither does the possibility that the NSA has secretly compromised the algorithm.)

Microsoft is not the only player in Windows encryption. Symantec offers various encryption products, and there are off-brand tools like DiskCryptor and TrueCrypt (if it ever really comes back to life). You could also switch to Linux, which has several distributions that include whole-disk encryption. You can also find software to encrypt individual documents and databases.
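The arrangement Draughn describes (a data key stored on the device, protected by a key derived from a passphrase that lives only in your head) is a standard pattern. Here's a rough sketch of the idea in Python, using the third-party cryptography package. It's purely illustrative of the general technique, not how Apple, Google, or Microsoft actually implement their encryption, and the passphrase and helper names are made up for the example:

# Sketch of passphrase-protected key wrapping. Requires: pip install cryptography
import os
import base64
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def passphrase_key(passphrase: str, salt: bytes) -> bytes:
    # Derive a key-encryption key from the passphrase the user keeps in their head.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))

# 1. A random data key encrypts the actual contents of the device.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"the user's private data")

# 2. The data key itself is stored only in "wrapped" (encrypted) form,
#    protected by the passphrase-derived key. The wrapped blob and the salt
#    are all that ever sit on the device.
salt = os.urandom(16)
wrapped_data_key = Fernet(passphrase_key("correct horse battery staple", salt)).encrypt(data_key)

# 3. Decryption requires the passphrase; without it, neither the vendor nor
#    anyone else can unwrap the data key and read the ciphertext.
recovered_key = Fernet(passphrase_key("correct horse battery staple", salt)).decrypt(wrapped_data_key)
assert Fernet(recovered_key).decrypt(ciphertext) == b"the user's private data"

The property that matters is the one Draughn is pointing at: everything stored on the device is useless without the passphrase, so there is nothing for the vendor to hand over.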

In short, he points out, the choice of encrypting our data is ours to make. Apple or Google offering us yet another set of tools to do that sort of encryption is them offering a service that many users value. And shouldn't that be the primary reason they're doing it, rather than catering to the desires of FUD-spewing law enforcement folks?