The Washington Post has a poorly thought-through editorial today on the Justice Department’s “CALEA for the Cloud” initiative. That’s the formative proposal to require all Internet services to open back doors to their systems for court‐ordered government surveillance.
“Some privacy advocates and technology experts have sounded alarms,” says the Post, “arguing that such changes would make programs more vulnerable to hackers.”
Those advocates—of privacy and security both—are right. Julian Sanchez recently described here how unknown hackers exploited surveillance software to eavesdrop on high government officials in Greece.
“Some argue that because the vast majority of users are law‐abiding citizens, the government must accept the risk that a few criminals or terrorists may rely on the same secure networks.”
That view is also correct. The many benefits of giving the vast majority of law‐abiding people secure communications outstrip the cost of allowing law‐breakers to have secure communications as well.
But the Post editorial goes on, sounding in certainty but exhibiting befuddlement.
The policy question is not difficult: The FBI should be able to quickly obtain court‐approved information, particularly data related to a national security probe. Companies should work with the FBI to determine whether there are safe ways to provide access without inviting unwanted intrusions. In the end, there may not be a way to perfectly protect both interests — and the current state of technology may prove an impenetrable obstacle.
The policy question, which the Post piece begs, is actually very difficult. Would we be better off overall if most or all of the information that traverses the Internet were partially insecure so that the FBI could obtain court‐approved information? What about protocols and communications that aren’t owned or controlled by the business sector—indeed, not controlled by anyone?
The Tahoe‐LAFS secure online storage project, for example—an open‐source project, not controlled by anyone—recently announced its intention not to compromise the security of the system by opening back doors.
The government could require the signatories to the statement to change the code they’re working on, but thousands of others would continue to work with versions of the code that are secure. As long as people are free to write their own code—and that will not change—there is no way to achieve selective government access that is also secure.
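That point can be made concrete. The sketch below, using only Python’s standard library, implements a one-time pad: a handful of lines that anyone can write, producing ciphertext the service provider (or anyone without the key) cannot read. It is an illustration of how little code secure communication requires, not a recommendation for production use.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte of data with the key.

    Information-theoretically secure when the key is truly random,
    at least as long as the message, and never reused.
    """
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

# XOR is its own inverse, so the same function decrypts.
message = b"meet at noon"
key = secrets.token_bytes(len(message))   # random key, same length

ciphertext = xor_cipher(message, key)
recovered = xor_cipher(ciphertext, key)

assert recovered == message
assert ciphertext != message
```

Anyone holding the ciphertext but not the key — a storage provider, a network operator, or a government with a court order served on either — learns nothing about the message. No mandate on companies changes that arithmetic.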
The current state of technology, thankfully, is an impenetrable obstacle to compromised security in the interest of government surveillance. The only conclusion here, which happily increases our security and liberty overall, is that everyone should have access to fully secure communications.