Apple Helped The Government Extract Data Before. Here’s Why Things Are Different Now

The government says Apple has helped it extract data from iPhones roughly 70 times in the past, but Apple has never done what a court is ordering it to do now: create software to crack its own security features for the FBI. An FBiOS, if you will.

For more than two months, FBI technicians have been locked out of Syed Rizwan Farook’s phone, and they're desperate for a key.

Standing in the way of federal law enforcement is an iPhone’s passcode, a string of numbers and letters, perhaps random, perhaps not, behind which lies the possibility of advancing the investigation into the mass shooting in San Bernardino that left 14 people dead.

With enough tries, and enough time, the FBI could unlock the device by guessing every possible passcode combination. But another security tool poses an obstacle: after 10 incorrect passcode attempts, the iPhone wipes itself clean, a self-destruct feature that would erase stored data and any hope of finding new leads or identifying co-conspirators. Whom did Farook speak to, and where had he been?
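Some back-of-the-envelope arithmetic (illustrative assumptions about passcode formats, not figures from the case) shows why that 10-attempt limit matters far more than the raw difficulty of guessing:

```python
# Illustrative passcode-space arithmetic (assumed passcode formats,
# not details from the government's filing).
numeric_4 = 10 ** 4   # 4-digit PIN: 10,000 combinations
numeric_6 = 10 ** 6   # 6-digit PIN: 1,000,000 combinations
alnum_6 = 36 ** 6     # 6 characters, digits plus lowercase letters: ~2.2 billion

for label, space in [("4-digit PIN", numeric_4),
                     ("6-digit PIN", numeric_6),
                     ("6-char alphanumeric", alnum_6)]:
    print(f"{label}: {space:,} possible passcodes")

# With the auto-erase feature on, investigators get only 10 guesses
# before the data is wiped -- a tiny fraction of any of these spaces.
```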

With its technical options seemingly exhausted, the federal government has turned to the law, and to previous cases in which Apple, the phone’s manufacturer, has helped retrieve information from confiscated devices. But those prior cases differ from the San Bernardino investigation in important ways: in what the government was demanding of Apple, and in the existing technology the company could use to satisfy those demands.

In one active case in New York, involving a person suspected of distributing methamphetamine, the government acknowledged that “Apple has repeatedly assisted law enforcement officers in federal criminal cases by extracting data from passcode-locked iPhones.” But what’s crucial here is the phrase “extracting data” and what that means in a practical sense.

Apple can, in fact, pull certain types of data from its phones, even when a device is locked, but those iPhones must be running older versions of its operating system. In the New York case, the device in question runs iOS 7, which enables Apple to pull information from the phone onto a separate hard drive — all without having to unlock it. In that case, and in previous cases where Apple has complied with a court order targeting an encrypted device, the company was able to access the data even while the device remained locked.

The iPhone from San Bernardino, however, runs iOS 9. And under the new operating system, the previous extraction process no longer works. It’s been designed away. As the encryption technology available to consumers has advanced, the areas of the iPhone protected by security tools have grown, leaving less data vulnerable to malicious attacks and making it more resistant to government surveillance, even when that surveillance is sanctioned by a judge.

What makes San Bernardino different from New York, and from previous cases, Apple has argued, is that the government is now asking the company not merely to extract information, but to create new software that suppresses security features in order to unlock the iPhone at issue — a sort of FBiOS.

In the New York case, a federal prosecutor estimated that Apple has previously assisted the government in accessing iPhones 70 times. But the company maintains that previous cases involved deploying existing software, and not manufacturing an encryption backdoor.

Tim Cook, Apple’s CEO, described the government’s demands in stark terms. “Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features,” he wrote in an open letter this week. “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.”

Government lawyers have asked Apple to help the FBI bypass the feature that wipes the phone after 10 failed passcode attempts, and to disable the escalating time delays that limit how quickly passcodes can be tried back to back. Then, with the iPhone’s defenses lowered, federal law enforcement could engage in a “brute force” attack, trying every passcode combination until the right one is discovered, using a powerful computer to dramatically speed up the process.
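In outline, the attack the government describes is simple. The sketch below is a hypothetical illustration of that brute-force loop, not Apple’s or the FBI’s actual tooling; the try_passcode function, the attempt limit, and the delay behavior are stand-ins for the device’s real protections.

```python
import itertools
import time

ATTEMPT_LIMIT = 10      # auto-erase threshold the FBI wants bypassed
DELAY_SECONDS = 0.08    # assumed per-guess cost; real devices impose escalating delays

def try_passcode(guess: str) -> bool:
    """Stand-in for submitting a guess to the locked device."""
    time.sleep(DELAY_SECONDS)   # simulate the time each attempt takes
    return False                # never succeeds in this sketch

def brute_force(length: int = 4, limit: int = ATTEMPT_LIMIT):
    """Try numeric passcodes in order, stopping at the attempt limit."""
    attempts = 0
    for digits in itertools.product("0123456789", repeat=length):
        guess = "".join(digits)
        attempts += 1
        if try_passcode(guess):
            return guess
        if attempts >= limit:
            # With default settings, the next failure risks erasing the phone;
            # this is the barrier the court order asks Apple to remove.
            return None
    return None

print(brute_force())  # None: the search stops long before 10,000 guesses
```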

According to the Justice Department’s request for the court order, an FBI technician examined the phone's most recent iCloud backup and found that the data-erase feature was turned on at the time; he suspects it remains active.

In the DOJ’s latest motion in the San Bernardino case, the government argued that Apple’s non-compliance is nothing more than a publicity stunt “based on its concern for its business model and public brand marketing strategy.”

While the Justice Department claims that FBiOS would be created for the purpose of accessing the San Bernardino iPhone alone, Apple insists it would be possible to use it on others, and indeed there are perhaps hundreds of encrypted devices in the government’s possession, and many beleaguered law enforcement officials eager to break into them.

For Apple, much of Silicon Valley, and civil liberties groups, this legal dispute extends far beyond one device. Security and privacy experts have argued that San Bernardino may set a precedent not only for U.S. law enforcement seeking privileged access, but for foreign governments looking to suppress free speech and hobble political dissent. (Apple’s allies say that if the U.S. government thinks this precedent wouldn’t be exploited by China and Russia, it is fooling itself. Democratic presidential candidate Hillary Clinton made similar remarks at a town hall event Thursday.)

From the perspective of law enforcement, facing violent criminal threats both at home and abroad, Apple’s opposition doesn’t represent a commitment to consumer protection so much as it reveals a kind of selfish, post-Snowden brand management. In their eyes, Apple seems fixated on its own image, on marketing itself as an entrepreneurial pillar of privacy, to the detriment of public safety.

The government also rejects the notion that designing new, security-suppressing software would impose a hardship on the company. “While the order in this case requires Apple to provide modified software, modifying an operating system — writing software code — is not an unreasonable burden for a company that writes software code as part of its regular business.”

But where the Justice Department sees grandstanding, Apple sees an unprecedented power grab, a crossing of the Rubicon of sorts: the government testing the limits of just how far it can push American tech. For Apple and its allies, this is what makes San Bernardino different from New York, and from every other case.
