Topic: Telecom, Internet & Information Policy

Congress’s Archaic Information Practices

There have been more than 2,700 bills introduced so far in the current Congress. That’s more than 30 bills per day, every day this year, weekends included. Ordinary Americans have a hard time keeping up, of course. Congress does, too.

The controversy around the anti-sex-trafficking bill in the Senate last week illustrates this well. Debate around the formerly non-controversial bill fell into disarray when Democrats discovered language in the bill that would apply the Hyde Amendment to fines collected and disbursed by the government. (The Hyde Amendment bars government spending on abortion. Democrats argue that it has only applied in the past to appropriated funds, not disbursement of fines.)

How is it that it took until late March for Democrats to discover controversial language in a bill that was introduced in January?

Well, Congress is awash in archaic practices. For one, bills are written in “cut and bite” style—change this line, change that word, change another—rather than in a form that lays out what the law would look like if the bill were passed. That makes bills unreadable—a situation Rep. Justin Amash (R-MI) has sought to remedy.

“BitLicense” Foolishness

When New York’s Superintendent of Financial Services first encountered Bitcoin, he evidently thought it was a way to build his reputation as a hangin’ superintendent of financial services. (Doesn’t quite roll off the tongue like “hangin’ judge,” does it…) He sent subpoenas to everyone in the Bitcoin world and went on TV talking about “narcoterrorists.” That was foolishness.

Unfortunately, he also hatched the idea of creating a thing called a “BitLicense” for firms wanting to provide Bitcoin-based financial services in New York. That program is now hanging like an albatross around his neck.

I know nothing of the details, but a couple of decades in public policy make the probable outlines of what happened pretty clear. The press seized on the “BitLicense” idea. Lobbyists and business people came around to Superintendent Lawsky to fawn over the “BitLicense” idea, each hoping not to get cut too deeply by its inartful sharp edges. And Lawsky, having come around to favoring Bitcoin (it’s fairly evident from his speeches), found himself committed to coming up with this “BitLicense” thing.

When the first draft came out in July of last year, it was pretty universally panned. The Bitcoin community savaged it. Bitcoin businesses said they would not do business in New York. A second round of proposal and comment was quickly on offer.

But the second draft isn’t that much better. When comments close at the end of this week (how to comment), the “BitLicense” will again have received strong criticism. There’s always that contingent whose stock in trade is—always—to play ball. And to others the “BitLicense” saga has gotten boring…

But the outcome is very probably set. In order to avoid backtracking, which would look foolish, the Department of Financial Services will probably continue forward on the errant path of creating a peculiar special license for Bitcoin-based financial services providers in New York.

The Fatal Conceit of the “Right to be Forgotten”

Intelligence Squared hosted a lively debate last week over the so-called “Right to be Forgotten” embraced by European courts—which, as tech executive Andrew McLaughlin aptly noted, would be more honestly described as a “right to force others to forget.”  A primary consequence of this “right” thus far has been that citizens are entitled to demand that search engines like Google censor the results that are returned for a search on the person’s name, provided those results are “inadequate, irrelevant, or no longer relevant.”  In other words, if you’re unhappy that an unflattering item—such as a news story—shows up as a prominent result for your name, you can declare it “irrelevant” even if entirely truthful and ask Google to stop showing it as a result for such searches, with ultimate recourse to the courts if the company refuses.  Within two months of the ruling establishing the “right,” the company received more than 70,000 such requests.

Hearteningly, the opponents of importing this “right” to the United States won the debate by a large margin, but it occurred to me that one absolutely essential reason for rejecting this kind of censorship process was only indirectly and obliquely invoked. As even the defenders of the Right to be Forgotten conceded, it would be inappropriate to allow a person to suppress search results that were of some legitimate public value: search engines are obligated to honor suppression requests only when linking some piece of truthful information to a person’s name would be embarrassing or harmful to that person without some compensating benefit to those who would receive the information. Frequent comparison was made to the familiar legal standards that have been applied to newspapers publishing (lawfully obtained) private information about non-public figures. In those cases, of course, the person seeking to suppress the information is typically opposed in court by the entity publishing the information—such as a newspaper—which is at least in a position to articulate why it believes there is some public interest in that information at the time of publication.

New Hampshire Ends Brief Flirtation with National ID Compliance

When the REAL ID Act passed in 2005, Senator Joe Lieberman (D-CT), no civil libertarian, called the national ID law “unworkable” for good reason. It seeks to herd all Americans into a national ID system by coercing states into issuing driver’s licenses (and sharing information about their drivers) according to complex federal standards.

The hook REAL ID uses in seeking to dragoon states into compliance is the threat that TSA agents will refuse IDs from non-complying states at our nation’s airports. The threat is an empty one. Consistently over the years, every time a DHS-created compliance deadline has come around, state leaders with spines have backed the Department of Homeland Security down. I detailed the years-long saga of pushed-back deadlines last year in the Cato Policy Analysis, “REAL ID: A State-by-State Update.”

DHS has stopped publishing deadline changes in the Federal Register (perhaps the endless retreats were getting embarrassing) and now simply says on its website that TSA enforcement will begin sometime in 2016. But it’s evidently back-channeling threats to state officials. Those folks, unaware that REAL ID doesn’t work and uninterested in the allocation of state and federal power, are lobbying their state legislatures to get on board with the national ID program.

New York Proposes Special Bitcoin Regulation, But Won’t Say Why

Yesterday, the New York Department of Financial Services (NYDFS) issued the second draft of its “BitLicense” proposal, a special, technology-specific regulation for digital currencies like Bitcoin. For a second time, the NYDFS claims to have a strong rationale for such regulation, but it has not revealed that rationale to the public, even though it is required to do so by New York’s Freedom of Information Law.

If you’re just joining the “BitLicense” saga, the NYDFS welcomed Bitcoin in August 2013 by subpoenaing every important person in the Bitcoin world. A few months later, New York’s Superintendent of Financial Services announced his plan for a special “BitLicense,” which would be required of anyone wanting to provide Bitcoin-based services in New York.

About a year later, Superintendent Lawsky released the first draft of the “BitLicense” proposal, to strongly negative reviews from the Bitcoin community. It didn’t help that after a year’s work the NYDFS offered the statutory minimum of 45 days to comment. Relenting in the face of public demand, the NYDFS extended the comment period.

In announcing the regulation, the NYDFS cited “extensive research and analysis” that it said justifies placing unique regulatory burdens on Bitcoin businesses. On behalf of the Bitcoin Foundation, yours truly asked to see that “extensive research and analysis” under New York’s Freedom of Information Law. The agency quickly promised timely access, but in early September last year it reversed itself and said that it might not release its research until December.

What NSA Director Mike Rogers Doesn’t Get About Encryption

At a New America Foundation conference on cybersecurity Monday, NSA Director Mike Rogers gave an interview that—despite his best efforts to deal exclusively in uninformative platitudes—did produce a few lively moments. The most interesting of these came when techies in the audience—security guru Bruce Schneier and Yahoo’s chief information security officer Alex Stamos—challenged Rogers’ endorsement of a “legal framework” for requiring device manufacturers and telecommunications service providers to give the government backdoor access to their users’ encrypted communications. (Rogers repeatedly objected to the term “backdoor” on the grounds that it “sounds shady”—but that is quite clearly the correct technical term for what he’s seeking.) Rogers’ exchange with Stamos, transcribed by John Reed of Just Security, is particularly illuminating:

Alex Stamos (AS): “Thank you, Admiral. My name is Alex Stamos, I’m the CISO for Yahoo!. … So it sounds like you agree with Director Comey that we should be building defects into the encryption in our products so that the US government can decrypt…

Mike Rogers (MR): That would be your characterization. [laughing]

AS: No, I think Bruce Schneier and Ed Felten and all of the best public cryptographers in the world would agree that you can’t really build backdoors in crypto. That it’s like drilling a hole in the windshield.

MR: I’ve got a lot of world-class cryptographers at the National Security Agency.

AS: I’ve talked to some of those folks and some of them agree too, but…

MR: Oh, we agree that we don’t accept each other’s premise. [laughing]

AS: We’ll agree to disagree on that. So, if we’re going to build defects/backdoors or golden master keys for the US government, do you believe we should do so — we have about 1.3 billion users around the world — should we do for the Chinese government, the Russian government, the Saudi Arabian government, the Israeli government, the French government? Which of those countries should we give backdoors to?

MR: So, I’m not gonna… I mean, the way you framed the question isn’t designed to elicit a response.

AS: Well, do you believe we should build backdoors for other countries?

MR: My position is — hey look, I think that we’re lying that this isn’t technically feasible. Now, it needs to be done within a framework. I’m the first to acknowledge that. You don’t want the FBI and you don’t want the NSA unilaterally deciding, so, what are we going to access and what are we not going to access? That shouldn’t be for us. I just believe that this is achievable. We’ll have to work our way through it. And I’m the first to acknowledge there are international implications. I think we can work our way through this.

AS: So you do believe then, that we should build those for other countries if they pass laws?

MR: I think we can work our way through this.

AS: I’m sure the Chinese and Russians are going to have the same opinion.

MR: I said I think we can work through this.

I’ve written previously about why backdoor mandates are a horrible, horrible idea—and Stamos hits on some of the reasons I’ve pointed to in his question. What’s most obviously disturbing here is that the head of the NSA didn’t even seem to have a bad response prepared to such an obvious objection—he had no serious response at all. China and Russia may not be able to force American firms like Google and Apple to redesign their products to be more spy-friendly, but if the American government does their dirty work for them with some form of legal backdoor mandate, those firms will be hard pressed to resist demands from repressive regimes to hand over the keys. Rogers’ unreflective response seems like a symptom of what a senior intelligence official once described to me as the “tyranny of the inbox”: a mindset so myopically focused on solving one’s own immediate practical problems that the bigger picture—the dangerous long-term consequences of the easiest or most obvious quick-fix solution—is barely considered.

How the NSA Stole the Keys to Your Phone

A blockbuster story at The Intercept Thursday revealed that a joint team of hackers from the National Security Agency and its British counterpart, the Government Communications Headquarters (GCHQ), broke into the systems of one of the world’s largest manufacturers of cell phone SIM cards in order to steal the encryption keys that secure wireless communications for hundreds of mobile carriers—including companies like AT&T, T-Mobile, Verizon, and Sprint.  To effect the heist, the agencies targeted employees of the Dutch company Gemalto, scouring e-mails and Facebook messages for information that would enable them to compromise the SIM manufacturer’s networks in order to make surreptitious copies of the keys before they were transmitted to the carriers. Many aspects of this ought to be extremely disturbing.

First, this is a concrete reminder that, as former NSA director Michael Hayden recently acknowledged, intelligence agencies don’t spy on “bad people”; they spy on “interesting people.” In this case, they spied extensively on law-abiding technicians employed by a law-abiding foreign corporation, then hacked that corporation in apparent violation of Dutch law. We know this was hardly a unique case—one NSA hacker boasted about “hunting sysadmins” in Snowden documents disclosed nearly a year ago—but it seems particularly poetic coming on the heels of the recent Sony hack, properly condemned by the U.S. government. Dutch legislators quoted in the story are outraged, as well they should be. Peaceful private citizens and companies in allied nations, engaged in no wrongdoing, should not have to worry that the United States is trying to break into their computers.

Second, indiscriminate theft of mobile encryption keys bypasses one of the few checks on government surveillance by enabling wiretaps without the assistance of mobile carriers. Under the typical model for wiretaps, a government presents the carrier with some form of legal process specifying which accounts or lines are targeted for surveillance, and the company then provides those communications to the government. As the European telecom Vodafone disclosed last summer, however, some governments insist on being granted “direct access” to the stream of communications so that they can conduct their wiretaps without going through the carrier. The latter architecture, of course, is far more susceptible to abuse, because it removes the only truly independent, nongovernmental layer of review from the collection process. A spy agency that wished to abuse its power under the former model—by conducting wiretaps without legal authority or inventing pretexts to target political opponents—would at least have to worry that lawyers or technicians at the telecommunications provider might detect something amiss. But any entity armed with mobile encryption keys effectively enjoys direct access: it can vacuum cellular signals out of the air and listen to any or all of the calls it intercepts, subject only to internal checks or safeguards.
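The structural point—that whoever holds the long-term SIM key can decrypt intercepted traffic without ever involving the carrier—can be illustrated with a toy sketch. This is not the actual GSM key-derivation or A5 cipher family, just a hash-and-XOR stand-in; all names and values here are hypothetical:

```python
import hashlib

def derive_session_key(ki: bytes, nonce: bytes) -> bytes:
    # Both the SIM and the network derive a per-call session key
    # from the long-term key Ki and a random challenge (nonce).
    # (Real GSM uses COMP128-family algorithms; this is a toy model.)
    return hashlib.sha256(ki + nonce).digest()

def keystream(session_key: bytes, n: int) -> bytes:
    # Toy keystream generator: repeatedly hash the session key.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(session_key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

# Long-term key burned into the SIM and held by the carrier.
ki = b"long-term-subscriber-key-Ki"
nonce = b"random-challenge-from-network"  # sent over the air in cleartext

# The phone encrypts a call with the derived session key.
session = derive_session_key(ki, nonce)
ciphertext = xor(b"hello, carrier", keystream(session, 14))

# An eavesdropper who stole Ki observes the cleartext nonce on the air,
# re-derives the same session key, and decrypts entirely passively --
# no legal process served on the carrier, and no one the wiser.
stolen_session = derive_session_key(ki, nonce)
plaintext = xor(ciphertext, keystream(stolen_session, 14))
print(plaintext)  # b'hello, carrier'
```

The carrier never appears in the decryption path at all, which is exactly why key theft is equivalent to “direct access.”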

There are, to be sure, times when going to the target’s carrier with legal process is not a viable option—because the company is outside the jurisdiction of the United States or our allies. Stealing phone keys in bulk is certainly a much easier solution to that problem than crafting interception strategies tailored to either the specific target or specific uncooperative foreign carriers. Unfortunately, the most convenient solution in this case is also a solution that gives the United States (or at least its intelligence community) a vested interest in the systematic insecurity of global communications infrastructure. We hear a great deal lately about the value of information sharing in cybersecurity: Well, here’s a case where NSA had information that the technology American citizens and companies rely on to protect their communications was not only vulnerable, but had in fact been compromised. Their mission is supposed to be to help us secure our communications networks—but having chosen the easy solution to the problem of conducting cellular wiretaps, their institutional incentives are to do just the opposite.

Finally, this is one more demonstration that proposals to require telecommunications providers and device manufacturers to build law enforcement backdoors in their products are a terrible, terrible idea. As security experts have rightly insisted all along, requiring companies to keep a repository of keys to unlock those backdoors makes the key repository itself a prime target for the most sophisticated attackers—like NSA and GCHQ. It would be both arrogant and foolhardy in the extreme to suppose that only “good” attackers will be successful in these efforts. 
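The “prime target” problem has a simple structure: a backdoor mandate concentrates every user’s key behind one master secret. A toy sketch (hypothetical names, a hash-XOR stand-in for a real cipher) makes the single point of failure concrete:

```python
import hashlib
import os

def toy_wrap(key: bytes, data: bytes) -> bytes:
    # XOR with a hash-derived pad -- a stand-in for a real cipher,
    # good enough to show the structural problem.
    pad = hashlib.sha256(key).digest()
    return bytes(a ^ b for a, b in zip(data, pad))

toy_unwrap = toy_wrap  # XOR is its own inverse

# Each user's device generates its own independent key...
device_keys = {user: os.urandom(16) for user in ["alice", "bob", "carol"]}

# ...but a backdoor mandate requires a copy of every key to be
# deposited in one escrow, wrapped under a single master key.
master_key = os.urandom(16)
escrow = {u: toy_wrap(master_key, k) for u, k in device_keys.items()}

# An attacker who compromises the repository and its master key
# recovers every user's key in one stroke.
recovered = {u: toy_unwrap(master_key, w) for u, w in escrow.items()}
assert recovered == device_keys
```

Without the escrow, an attacker would have to compromise each device separately; with it, one breach exposes everyone.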
