Nearly four months after the FBI confiscated a locked iPhone used by the man behind the San Bernardino terrorist attack, investigators have found a way to access the data inside it.
The Justice Department has repeatedly refused to share details of how it got in, so the method it used remains a mystery, along with the identity of the outside party who showed the FBI how to penetrate the device. But even as the Justice Department has decided to keep these things secret, at least for now, the White House has recognized that disclosing such vulnerabilities can serve the public interest. Under the government’s own review process, the FBI may be obligated to share the details of the method with Apple.
Michael Daniel, a special assistant to the president and cybersecurity coordinator, laid out the benefits and drawbacks of disclosing vulnerabilities in a 2014 White House blog post. “Too little transparency and citizens can lose faith in their government and institutions,” he wrote, “while exposing too much can make it impossible to collect the intelligence we need to protect the nation.”
Daniel said the government “established a disciplined, rigorous, and high-level decision-making process for vulnerability disclosure.” Among the questions Daniel would ask an agency that wishes to keep a vulnerability secret:
“Does the vulnerability, if left unpatched, impose significant risk?”
“How badly do we need the intelligence we think we can get from exploiting the vulnerability?”
“How likely is it that someone else will discover the vulnerability?”
Daniel’s post gave the public a glimpse into the government’s internal review process for disclosure, a process that privacy experts say lacks transparency and public accountability.
Known as an equities review, the process was designed to balance the competing interests of government agencies after a new vulnerability has been discovered. In some cases, withholding a vulnerability allows the government to conduct counterintelligence or prevent criminal activity, Daniel wrote. But keeping exploits a secret can also leave the American public at risk, with consumer products and computer networks vulnerable to intruders or manipulation.
“This administration takes seriously its commitment to an open and interoperable, secure, and reliable Internet,” Daniel wrote, “and in the majority of cases, responsibly disclosing a newly discovered vulnerability is clearly in the national interest.”
In a call with reporters Monday, a law enforcement official declined to comment on the risk that the San Bernardino method may pose to other iPhone owners. He also declined to say if the vulnerability would be subject to the equities review process. But experts outside the government agree that it should, and that the iPhone security breach ought to trigger disclosure, so that the millions of American iPhone owners won’t be exposed to the same vulnerability.
“By keeping this a secret, the FBI is essentially gambling that no one else will independently discover it,” Christopher Soghoian, the principal technologist at the ACLU, told BuzzFeed News. “It’s unlikely that they will remain the only entity that knows about this flaw forever.”
Soghoian also expressed concern about the equities review process itself. The interagency review group, overseen by the president’s National Security Council, he said, is stacked with people who are inclined to keep vulnerabilities secret, in order to use exploits to conduct surveillance, hacking, counterintelligence, and law enforcement. “You have a bunch of foxes deciding how the hen house should be built,” he said.
“There’s not a lot we know about that equities process,” Alan Butler, the senior counsel of the Electronic Privacy Information Center, told BuzzFeed News. The 2014 White House blog post is one of the few public documents about the review process. Another, from January of this year, was made public by the government only after the Electronic Frontier Foundation pursued a public records lawsuit compelling its release.
EPIC’s Butler believes the government is obligated to disclose the San Bernardino iPhone method, especially given the abundance of sensitive data iPhones contain beyond personal information. “In many situations, these phones are used as keys and authenticators for other sensitive material, including critical infrastructure,” Butler said.
Riana Pfefferkorn, the cryptography fellow at Stanford’s Center for Internet and Society, told BuzzFeed News that responsible disclosures can enable companies like Apple “to alert their users, come up with a fix, and push it out to their users through software updates.” But from the Justice Department’s perspective, a successful security patch can also represent the loss of a law enforcement tool. “The key thing is that Apple can’t fix what they don’t know about, so the DOJ wouldn’t lose this method if they keep it secret,” Pfefferkorn said.
But Pfefferkorn believes the tradeoff works in favor of disclosure. Keeping the method secret would mean leaving everyone’s devices less secure, she said, “so that law enforcement potentially can get access in some instances to some mobile devices used by the tiny percentage of the population who are criminals.”
The law enforcement official on the press call declined to say whether the Justice Department would share details of the secret method with Apple, or whether the method would be used on additional iPhones in different investigations.
In an active New York drug case, for instance, a federal judge rejected the government’s application for a court order that would force Apple to extract information from a confiscated iPhone; the Justice Department has appealed. Court documents from the New York case also revealed that 12 additional cases are pending throughout the U.S., all of which feature the government requesting that Apple pull information from encrypted devices.
Because little is known about the method outside the government, it’s difficult for outside experts to gauge the risk that it may pose to the public, Jay Kaplan, a former NSA analyst and CEO of Synack, a cybersecurity firm, told BuzzFeed News. “There might not be anything for Apple to fix,” he said.
“There's a lot between the conclusion that this particular phone could be accessed by a particular contractor and everyone's phone being vulnerable,” said Joseph DeMarco, a former federal prosecutor who also represented law enforcement groups in support of the Justice Department in the San Bernardino case. “I think there's a lot of assumptions and steps in between that.”
It’s also not clear whether a nondisclosure agreement between the FBI and the unidentified third party would trump the government's obligation to disclose the vulnerability.
Ordinarily, DeMarco told BuzzFeed News, the contract between the government and the third party would govern whether the method could be shared with other parties, including additional law enforcement agencies and Apple.
Following the Snowden revelations, the Obama administration convened a special review group to assess the government’s intelligence agencies. The review group recommended that the government “not in any way subvert, undermine, weaken, or make vulnerable generally available commercial encryption,” and it also advised the government to disclose vulnerabilities as a general rule.
“In almost all instances, for widely used code, it is in the national interest to eliminate software vulnerabilities rather than to use them for US intelligence collection,” the 2013 review group report states. The 2014 White House blog post echoed that principle: “Disclosing vulnerabilities usually makes sense.”
Andrew Crocker, a staff attorney for the EFF, told BuzzFeed News that the requirements to trigger the review process — a newly discovered vulnerability that is not publicly known — are clearly present in the San Bernardino case.
What’s unclear, he said, is whether the government will actually follow through with the process, or weigh the public interest case for disclosure fairly. “You can imagine all kinds of workarounds. And we’ve found that the government plays all kinds of word games related to the intelligence context,” he said.
“It’s just a policy adopted by the government; there’s not a lot of transparency or rules around this.”
In the same way that the Apple vs. FBI dispute has made concrete what was once a more abstract disagreement over encryption, Crocker believes this case has also drawn attention to how the government cloaks what ought to be in public view.
“Everyone who is innocent and is walking around with a phone is at the same risk as a target of surveillance or hacking or whatever the government might want to engage in.”
Hamza Shaban is a technology policy reporter for BuzzFeed News and is based in Washington, DC.
Contact Hamza Shaban at Hamza.Shaban@buzzfeed.com.