Apple, FBI, and the Burden of Forensic Methodology

The best overview of the technical aspects of what the FBI is asking Apple to do is at Zdziarski’s blog, starting with his February 18 post “Apple, FBI, and the Burden of Forensic Methodology” (linked above) and continuing through subsequent follow-ups. The most frightening section there was:

FBI has asked to do this wirelessly (possibly remotely), which also means transit encryption, validation, certificate revocation, and so on.

I have seen virtually no commentary about this point, which I think is a big, big issue. With previous data extraction cases, Apple took extensive precautions, including requiring investigators to physically transport the iPhone to the Apple facility, and isolating the unit within a Faraday cage. In other words, law enforcement had to have physical possession of the device. As many security researchers have pointed out in the past, with physical access it is almost guaranteed that the attacker will find some way to read some or all of the data stored on a device.

With an over-the-air attack tool, anyone who finds a way to bypass the supposed safeguards of the tool could target anyone at any time; they would not need physical access to the device. That makes it significantly easier for an attacker to bypass the security features and unlock the targeted iPhone. And once that happens, they can do just about anything they want, including load malware, wipe the device, or do a data dump. With a sufficiently sophisticated tool paired with an over-the-air attack, the owner might not even know that their iPhone has been hacked.

We Could Not Look the Survivors in the Eye if We Did Not Follow this Lead

FBI Director James Comey’s post on the oddly-named Lawfare blog:

The San Bernardino litigation isn’t about trying to set a precedent or send any kind of message.

This is disingenuous at best. Tim Cook directly addressed the issue in his open letter: “Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.” And the New York Times reported that Apple had initially asked that the request be kept under seal. It seems pretty clear that FBI Director Comey is deliberately picking a public fight.

The whole piece is an appeal to emotion, starting from the second sentence:

It is about the victims and justice. Fourteen people were slaughtered and many more had their lives and bodies ruined.

And nothing in the remainder is more honest or less manipulative than the opening lines.

You could see this coming years ago. He and other authorities have just been looking for an excuse, choosing the time and battleground for the confrontation.

One of the FBI’s Major Claims in the iPhone Case Is Fraudulent

The ACLU on the FBI vs Apple encryption backdoor:

If this generally useful security feature is actually no threat to the FBI, why is it painting it in such a scary light that some commentators have even called it a “doomsday mechanism”? The FBI wants us to think that this case is about a single phone, used by a terrorist. But it's a power grab: law enforcement has dozens of other cases where they would love to be able to compel software and hardware providers to build, provide, and vouch for deliberately weakened code. The FBI wants to weaken the ecosystem we all depend on for maintenance of our all-too-vulnerable devices. If they win, future software updates will present users with a troubling dilemma. When we're asked to install a software update, we won’t know whether it was compelled by a government agency (foreign or domestic), or whether it truly represents the best engineering our chosen platform has to offer.

In short, they're asking the public to grant them significant new powers that could put all of our communications infrastructure at risk, and to trust them to not misuse these powers. But they're deliberately misleading the public (and the judiciary) to try to gain these powers. This is not how a trustworthy agency operates. We should not be fooled.

Possibly the most worrying thing about this mess is how blatant the FBI and other law enforcement agencies have been about trying to set this precedent. They’re barely even bothering to hide that it is, indeed, about all phones, not just this one.

And again, use my Worst Enemy Test: If you had to permit your worst enemy access to these powers, would you still support the legislation?

I suspect Director Comey wouldn’t be okay with his political opponents being able to compel Apple or Google to create their own backdoors to bypass the encryption on his phone. Would anyone be happy about President Trump having “sooper sekrit” access to anyone’s information?

Everything Old is New Again

No one seems to have learned anything from history, even recent history. Back in 1993 (a.k.a. the Dark Ages, in internet years), the NSA’s baby, the Clipper chip, was meant to provide a back door to any system it was installed on. At the same time, the US government classified strong encryption as a munition, and investigated the creator of PGP, Phil Zimmermann, for violating the export ban.

The Clipper chip program died in just a couple of years, and restrictions on encryption were relaxed in a similarly short time span. Why? Back doors are inherently insecure and technically untenable. Restricting a technology like encryption works only if you can actually keep it from being disseminated. The only reliable way to do that is to cut yourself off from the outside world and impose draconian central-authoritarian rules on your citizens.

Japan kept weapons under the exclusive control of the military by shutting its borders, confiscating weapons, and keeping those with the knowledge to create weapons under central authority. In the early days of firearms, the Japanese were actually more heavily armed than any other nation, and with the improvements Japanese smiths wrought on the samples traded from the Dutch and Portuguese, their weapons were probably the most technically advanced as well.

In Europe, those measures wouldn’t work because any one nation that tried to hunker down and disarm its populace would place itself at a strategic disadvantage to its neighbors. The end result of isolation and technical control was that Japan was at a severe disadvantage when on the receiving end of some “friendly” gunboat diplomacy from the good ol’ US of A back in the 1800s.

In more modern times, North Korea has done pretty much the same thing over the last 60 years with regard to communications and commerce, with the result that much of its post-industrial technology, particularly its computer technology, is laughably outdated.

If FBI Director Comey gets his way, and Apple is forced to either create a tool for the government to use to unlock devices or compromise its security to provide a back door into the system software, Americans are facing not just the loss of privacy, but a loss of competitiveness in the world market. Communication and device encryption is the backbone of internet commerce.

While it may start with Apple, it won’t end there. Any technology created by American companies will be regarded with suspicion because of the precedent set. Other countries where multinational corporations do business, knowing that a US-based company will be compelled to create skeleton keys for its devices, will make providing them with the same tools a prerequisite for doing business there.

Congratulations, you’ve just given every repressive regime in the world tools to break into anyone’s phones, and not just their own citizens’ either. It’s actually worse if the US tries to keep the key to itself: its very existence makes it much more likely that a foreign power or even criminal elements will find a way to steal or co-opt it, and if the tool is only installed on American versions of the phones, it can then be used to break into the phones of US citizens exclusively. If that happens, the responsible parties would have made the entire US into every nefarious agent’s online ass-bitch.

As we’ve seen with “secret” backdoor technology before, like the TSA keys, it will leak eventually. And when it does, someone will eventually exploit that security weakness to commit a serious crime or act of terrorism. The best way to protect people is to make security better to make it harder for anyone to break in — be it the FBI, terrorists, or criminals. Deliberately weakening security does not benefit either the public or, in the long run, the government.

NSA could crack the San Bernardino shooter’s phone

Clarke added that if he was still at the White House, he would have told FBI Director James Comey to "call Ft. Meade, and the NSA would have solved this problem…Every expert I know believes that NSA can crack this phone." But the FBI wasn't seeking that help, he said, because "they just want the precedent."

Yep, it's pretty obvious that what FBI Director Comey is really going for is the legal precedent, not the information.

Tech giants don’t want Obama to give police access to encrypted phone data

In a Washington Post article from last year:

Tech behemoths including Apple and Google and leading cryptologists are urging President Obama to reject any government proposal that alters the security of smartphones and other communications devices so that law enforcement can view decrypted data…

The letter is signed by three of the five members of a presidential review group appointed by Obama in 2013 to assess technology policies in the wake of leaks by former intelligence contractor Edward Snowden. The signatories urge Obama to follow the group’s unanimous recommendation that the government should “fully support and not undermine efforts to create encryption standards” and not “in any way subvert, undermine, weaken or make vulnerable” commercial software.

I've said before that it's a bad idea to weaken encryption for the sake of law enforcement. This current confrontation between the FBI and Apple has been years in the making.

Amazon’s Anti-Theft Practices

Josh Eidelson and Spencer Soper, writing under the oh-so-Gen-Y title “Amazon’s Story Time Is Kind of a Bummer”:

Security experts say Amazon’s anecdotal warnings are a natural extension of older corporate loss-prevention tactics, such as frisking employees as they leave a store. “There are people who will never steal. There’s a certain percentage of people that will always steal,” says Pat Murphy, the president of LPT Security Consulting. “You’re always trying to influence that middle group by reminding them there is a high probability they will get caught, and if I get caught, these are the consequences.” Murphy, who spent two decades in retail security after leaving the Dallas police force, says that while the psychology of Amazon’s flatscreen messages is familiar, he’s never heard of anything quite like them.

One of my first non-family jobs was working in the mall. In the year or so that I worked there, we had at least two “loss-prevention” seminars where we were threatened with firing and probable jail time if we stole, including anonymized examples of employees who had been caught. We were encouraged to inform on any employee we saw doing anything slightly dodgy. Almost as an adjunct, we were taught how to spot shoplifters. The reason for the scare tactics and their zealousness in prosecuting was that they reportedly lost more from employee theft than from outside theft.

I later worked for the post office, and during our training we were told about the postal inspectors, who reportedly had a conviction rate higher than 90 percent. The intended message was: if you do something wrong, you will get caught. This was partially because of universal and pervasive surveillance. Post office centers are built with private passageways[1] set with loopholes that allow inspectors to observe the entire building, especially the sorting floor, from various vantage points. Employees are never sure whether they are under observation or not. At both places, as a condition of employment, you had to sign an agreement permitting your person, your belongings, and your car to be searched at any time, for any reason.

I had these jobs about 20 years ago. Back then, they used posters tacked to a bulletin board. Now, Amazon uses monitors. This is nothing new. Honestly, I found the medievalesque post office observation practices far creepier than Amazon’s monitors seem.

  1. Oh, look, there’s a YouTube video of them!  ↩

iPhone Encryption Annoys Authorities

FBI blasts Apple, Google for locking police out of phones

FBI Director James B. Comey sharply criticized Apple and Google on Thursday for developing forms of smartphone encryption so secure that law enforcement officials cannot easily gain access to information stored on the devices — even when they have valid search warrants.

U.S. attorney general criticizes Apple, Google data encryption

Holder said quick access to phone data can help law enforcement officers find and protect victims, such as those targeted by kidnappers and sexual predators.

These reactions from law enforcement officials are excellent endorsements for communication devices if you value privacy and the rule of law over expedience. Arguing for encryption or security that is easier to break is reprehensible and indefensible. For the authoritarians among us, consider this: open doors let anyone in, not just “the proper authorities”. Would you still feel okay about putting in back doors or deliberately weak encryption if a foreign power could use them to spy on Americans or conduct corporate espionage?

Considering the extremely lenient approach both the Bush and Obama administrations have taken to overly-broad warrantless searches, the phrase “even when they have valid search warrants” is a telling note, albeit one supplied by the authors of the Washington Post article rather than quoted directly from FBI Director Comey.

Searching an encrypted phone is effectively no different from searching a locked safe or strongbox. Under existing laws and precedents, there must be a warrant that allows the search of the locked container and specifies what they expect to find in it. The container is itself considered property and in principle should not be damaged or destroyed in the search. According to the somewhat ambiguous case law so far, you may not be compelled to supply a combination to a safe or password to a computer that could lead to incriminating evidence.

(As far as I can tell, biometric locks, like Touch ID on recent iPhones, are an even more ambiguous case than passwords. Your finger might be regarded as a “key”, and you could be compelled to provide access to your finger in order to unlock your phone. That will have to be tested in the courts, I think. In the meantime, if you’re paranoid — or a criminal — you should probably not use Touch ID, and instead have a good passcode.)
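What counts as “a good passcode” comes down to simple arithmetic. Apple’s iOS security documentation says the hardware-entangled key derivation is calibrated so that one passcode attempt takes roughly 80 milliseconds on-device; assuming that figure, and assuming the escalating delays and erase-after-ten-failures protections are out of the picture (disabling them is exactly what the FBI’s order asks of Apple), a rough worst-case brute-force estimate looks like this:

```python
# Worst-case time to brute-force an iOS passcode on-device, assuming
# ~80 ms per guess for the hardware-entangled key derivation (figure
# from Apple's iOS security documentation) and no escalating-delay or
# erase-after-ten-failures safeguards. Illustrative arithmetic only.

SECONDS_PER_GUESS = 0.08

def brute_force_time(alphabet_size: int, length: int) -> float:
    """Worst-case seconds to try every possible passcode."""
    return alphabet_size ** length * SECONDS_PER_GUESS

def pretty(seconds: float) -> str:
    """Render a duration in the largest sensible unit."""
    for unit, span in (("years", 365 * 86400), ("days", 86400),
                       ("hours", 3600), ("minutes", 60)):
        if seconds >= span:
            return f"{seconds / span:.1f} {unit}"
    return f"{seconds:.0f} seconds"

for label, alphabet, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("6-char lowercase alphanumeric", 36, 6),
]:
    print(f"{label}: {pretty(brute_force_time(alphabet, length))}")
```

Under these assumptions a 4-digit PIN falls in about 13 minutes and a 6-digit PIN in under a day, while a six-character alphanumeric passcode holds out for more than five years — which is the whole argument for a strong passcode over a short PIN.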

People are becoming more cognizant of their behavior with regard to technology and are taking steps to bring it more into line with their “meat space” lives. Up to now, we’ve been sending postcards (email) with our return addresses written on them by effectively sticking them up on a public board (the unencrypted internet) for the next passer-by to deliver to its destination, trusting that no one will read the contents en route or backtrack us to our home address.

Using encryption for email is like putting a letter in an envelope. Locking your phone with a password is like keeping your private documents in a safe.

Now, law enforcement officials are complaining that it’s too hard to catch criminals when people lock their doors so that police can’t enter private homes freely when the owners are not present, use envelopes so no one can read private mail without tampering with it, and don’t use party lines so that no one else can listen in on their telephone conversations without a direct effort to do so. In short, law enforcement is losing easy access to a great deal of information that people were in many cases unaware they were sharing very publicly.

To those who argue that we should give up an expectation of privacy and abrogate some of our rights in order to make it easier to find and prosecute criminals, I reply with a quotation:

“Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety” - Benjamin Franklin