Such a move would signal a race to the bottom of the slippery slope that has haunted privacy advocates: a world where companies can be forced to sign code developed by the government to facilitate surveillance. In this case, that means software to brute-force a passcode, but it could just as easily apply to remote exploits targeting any networked device that relies on developer credentials to authenticate trusted updates. Which is to say, nearly any modern networked device. It entails, quite literally, handing the government the keys to the kingdom.
What’s particularly worrying is that, while this approach is massively more troubling from a security perspective than funneling such requests through the company itself on a case‐by‐case basis, it would likely rest on firmer legal footing.
Apple’s arguments throughout this case have stressed the unprecedented nature of the FBI’s attempt to conscript the firm’s engineers, noting that the All‐Writs Act invoked by the government was meant to enable only the particular types of orders familiar from common law, not to grant an all‐purpose power to “order private parties to do virtually anything the Justice Department and FBI can dream up.” The trouble is, an order to turn over information in the “possession, custody, or control” of a private party is just such a traditional order. Such demands are routinely made, for instance, via a subpoena duces tecum requiring a person or company to produce documents.
It’s likely that Apple’s developer keys are stored in a Hardware Security Module that would make it difficult or impossible to produce a copy of its firmware signing key directly to the government. But that might not be much legal help. In a separate iPhone unlocking case in New York, Magistrate Judge James Orenstein recently rejected the government’s argument that a previous All‐Writs Act case, New York Telephone Co., required Apple’s compliance. In that case, Orenstein noted, the government’s