The FBI wants Apple’s help to investigate a terrorist attack and has seized the iPhone used by shooter Syed Rizwan Farook.
Due to iOS encryption, the FBI has not been able to unlock the iPhone 5C Farook used. In a court order, it has requested "technical assistance," asking Apple to turn off a specific feature so that investigators can try to brute force the key. Is it appropriate to force Apple to disable a key feature that is designed to protect someone's privacy?
Techdirt reports: So… have you heard the story about how a magistrate judge in California has ordered Apple to help the FBI disable encryption on the iPhone of one of the San Bernardino shooters? You may have, because it's showing up everywhere. Here's NBC News reporting on it:
A federal judge on Tuesday ordered Apple to give investigators access to encrypted data on the iPhone used by one of the San Bernardino shooters, assistance the computer giant “declined to provide voluntarily,” according to court papers.
In a 40-page filing, the U.S. Attorney’s Office in Los Angeles argued that it needed Apple to help it find the password and access “relevant, critical … data” on the locked cellphone of Syed Farook, who with his wife Tashfeen Malik murdered 14 people in San Bernardino, California on December 2.
And you’d be forgiven for believing that the court has now ordered Apple to do the impossible. After all, for well over a year, the DOJ has been arguing that the All Writs Act of 1789 can be used to force Apple to help unlock encrypted phones. And that’s an argument it has continued to make in multiple cases.
Many people are now mocking this ruling, pointing out that with end-to-end encryption it’s actually impossible for Apple to do very much to help the FBI, which makes the order seem ridiculous. But that’s because much of the reporting on this story appears to be wrong. Ellen Nakashima, at the Washington Post, has a more detailed report that notes that Apple is actually required to do something a little different:
The order does not ask Apple to break the phone’s encryption, but rather to disable the feature that wipes the data on the phone after 10 incorrect tries at entering a password. That way, the government can try to crack the password using “brute force” — attempting tens of millions of combinations without risking the deletion of the data.
The order, signed by a magistrate judge in Los Angeles, comes a week after FBI Director James B. Comey told Congress that the bureau has not been able to open one of the killers’ phones. “It has been two months now, and we are still working on it,” he said.
In other words, the order does not tell Apple to crack the encryption when Apple does not have the key. Rather, it is asking Apple to turn off a specific feature so that the FBI can try to brute force the key — and we can still argue over whether or not it’s appropriate to force Apple to disable a key feature that is designed to protect someone’s privacy. It also raises questions about whether or not Apple can just turn off that feature or if it will have to do development work to obey the court’s order. In fact, the same report notes that there is no way for Apple to actually do this:
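To see why the 10-tries-and-wipe feature is the whole ballgame, consider a minimal sketch (not Apple's actual code; the passcode, check function, and 4-digit assumption are all illustrative). With the wipe enabled, an attacker gets at most 10 of the 10,000 possible 4-digit guesses; with it disabled, exhaustive search is trivial:

```python
# Illustrative sketch: brute-forcing a 4-digit passcode with and
# without a "wipe after N failed tries" limit. Not Apple's implementation.
from itertools import product

def brute_force(check_passcode, max_tries=None):
    """Try every 4-digit passcode in order; give up if the wipe limit is hit."""
    for attempt, digits in enumerate(product("0123456789", repeat=4), start=1):
        if max_tries is not None and attempt > max_tries:
            return None  # the device would have erased itself by now
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None

secret = "7294"  # hypothetical passcode for the demonstration
# Wipe feature on: only 10 guesses are possible, so the attack almost surely fails.
assert brute_force(lambda g: g == secret, max_tries=10) is None
# Wipe feature off: all 10,000 combinations can be tried, guaranteeing success.
assert brute_force(lambda g: g == secret) == secret
```

This is why the order targets the wipe feature rather than the encryption itself: the cryptography stays intact, but the practical barrier to guessing the key is removed.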
According to industry officials, Apple cannot unilaterally dismantle or override the 10-tries-and-wipe feature. Only the user or person who controls the phone’s settings can do so. The company could theoretically write new software to bypass the feature, but likely would see that as a “backdoor” or a weakening of device security and would resist it, said the officials, who spoke on the condition of anonymity to discuss a sensitive matter.
So you could argue that this is effectively the same thing as asking Apple to break the encryption, since it (apparently) has no direct access to turning off that feature. However, the specifics do matter — and most of the kneejerk responses to the order (and the reporting on it) are suggesting something very different than what the court order seems to say.
I think it's still perfectly reasonable to argue that this order is highly problematic, and not legally sound. However, it is still quite different from what most are claiming. It also seems like something that could be quite dangerous. Apple is being pressured to write code that undermines an important security feature, and will probably have little time to debug or test it, meaning the feature it is being ordered to build will almost certainly put more users at risk.
Update: Okay, we’ve got the full order and it is, indeed, troubling. Here’s the key part:
Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.
Apple’s reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.
If Apple determines that it can achieve the three functions stated above in paragraph 2, as well as the functionality set forth in paragraph 3, using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this Order in that way.
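Function (3), removing software-imposed delays, matters as much as disabling the wipe. A back-of-the-envelope calculation shows why (the ~80 ms hardware floor is a figure from public reporting on this case, and the 1-hour lockout is the maximum delay iOS imposes after repeated failures; both are assumptions here, not specifications):

```python
# Rough worst-case time to exhaust a 4-digit passcode keyspace under
# different per-attempt delays. Figures are assumptions from public
# reporting, not Apple's documented specifications.
keyspace = 10_000            # possible 4-digit numeric passcodes
hardware_delay_s = 0.08      # ~80 ms per try: reported hardware key-derivation floor
software_delay_s = 60 * 60   # 1-hour lockout iOS can impose after repeated failures

def worst_case_hours(per_attempt_s):
    """Hours to try every passcode at a fixed delay per attempt."""
    return keyspace * per_attempt_s / 3600

print(f"hardware floor only: {worst_case_hours(hardware_delay_s):.2f} hours")
print(f"with 1-hour lockouts: {worst_case_hours(software_delay_s):.0f} hours")
```

Under these assumed numbers, stripping the software delays turns a multi-year slog into an attack that finishes in well under a day, which is exactly the capability the order demands.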
The order also sets out that:
To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.
I would imagine that Apple will be taking the court up on that…
Read the full Court Order here:
Apple’s Tim Cook Posts, ‘A Message To Our Customers’