Topic: Telecom, Internet & Information Policy

Congress Moves against NSA Spying

Ars Technica reports that an amendment to the FY 2008 Intelligence Authorization Act “upholds the 1978 Foreign Intelligence Surveillance Act (FISA) as the only means by which to do electronic surveillance—and … requires continuous judicial oversight of requests.”

Divided government is a real boon.

Google on Anonymizing Server Logs

Here’s Google’s Global Privacy Counsel Peter Fleischer discussing in more detail Google’s recent laudable decision to anonymize its server logs after 18-24 months. The discussion helps illustrate the diverse interests that must be balanced in choosing how long to maintain information.

It’s often easy to disregard the value that deep wells of raw information have for information-based businesses. Fleischer explains how Google uses this data to improve its services and protect users. These consumer-beneficial activities must be balanced against the background demand for privacy protection.
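For the curious, here is what “anonymizing server logs” can look like in practice. This is a minimal sketch in Python, assuming a made-up log format with an IP address and a cookie ID per line; it drops the last octet of the IP and replaces the cookie with a truncated one-way hash. Google has not published its exact procedure, so the field layout and choices here are illustrative only.

    import hashlib

    def anonymize_log_line(line: str) -> str:
        """Anonymize one hypothetical log line: '<ip> <cookie_id> <rest>'."""
        ip, cookie_id, rest = line.split(" ", 2)
        octets = ip.split(".")
        octets[-1] = "0"  # drop host-level precision from the IP address
        hashed = hashlib.sha256(cookie_id.encode()).hexdigest()[:16]  # one-way, truncated
        return f"{'.'.join(octets)} {hashed} {rest}"

    print(anonymize_log_line("203.0.113.42 PREF=abc123 2007-03-14 [search terms]"))
    # 203.0.113.0 <16-hex-digit hash> 2007-03-14 [search terms]

Even a simple transformation like this illustrates the tension Fleischer describes: the truncated data is less useful for spam and fraud detection than the raw logs, but it is also far less revealing about any individual user.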

Of particular note, of course, is his discussion of the emerging government demands for data retention (some of which conflict with government demands for data destruction). Data retention mandates are outsourced government surveillance, neatly shifting the cost of surveillance to the private sector while avoiding limits on government action like the Fourth Amendment and the Privacy Act (in the case of the U.S.). To put a fine point on it, data retention is bad.

This explication of Google’s thinking is a welcome contribution to public understanding. I did get a little chirping on my B.S. detector where Fleischer says he talked to privacy activists in developing Google’s plans. I’d like to know which ones. It’s a small enough community that I figure I would have known about it (I say this at the risk of sounding self-important).

I’ve been aware in the past of government agencies deluding themselves about taking privacy into consideration because they’ve heard from government contractors selling “privacy enhancing technologies” like immutable audit logs and such. As often as not, this stuff is lipstick on a pig: an attempt to make bad surveillance programs acceptable by tacking on complex, fallible privacy protections.

I’m sure Google has done better than that in its consultations with privacy experts. At least, I hope I’m sure.

DHS Privacy Committee Declines to Endorse REAL ID

The Department of Homeland Security’s Data Privacy and Integrity Advisory Committee is filing comments on the REAL ID regulations. Comments close today (Tuesday). Instructions for commenting can be found here, and apparently, due to difficulties with the automatic comment system and with receiving faxes, DHS has opened an email address for receiving comments: oscomments [at] dhs [dot] gov. Emails must have “DHS-2006-0030” in the subject line.

The Committee took care to offer constructive ideas, but the most important takeaway is summarized by Ryan Singel at Threat Level:

The Department of Homeland Security’s outside privacy advisors explicitly refused to bless proposed federal rules to standardize states’ driver’s licenses Monday, saying the Department’s proposed rules for standardized driver’s licenses – known as Real IDs – do not adequately address concerns about privacy, price, information security, redress, “mission creep”, and national security protections.

“Given that these issues have not received adequate consideration, the Committee feels it is important that the following comments do not constitute an endorsement of REAL ID or the regulations as workable or appropriate,” the committee wrote in the introduction to their comments for the rulemaking record.

I’ll be testifying on REAL ID today before the Senate Judiciary Committee.

Congress Backs Official Idiocy

Here’s Congress siding with Boston’s idiotic public officials. The Terrorist Hoax Improvements Act of 2007 would allow government officials to sue people who fail to promptly clear things up when those officials mistakenly think that they have stumbled over a terrorist plot.

There’s nothing in the bill allowing individuals or corporations to sue government officials when hare-brained overreactions interfere with their lives and businesses or destroy their property.

Digg, Hacking, and Civil Disobedience

Randy Picker asks when civil disobedience is acceptable, and concludes that posting HD-DVD encryption keys doesn’t cut it:

I wouldn’t think that not being able to play an encrypted high-definition DVD on your platform of choice would fall into that category. I understand fully that people disagree about whether digital rights management and the Digital Millennium Copyright Act are good copyright policy. I also understand that users can be frustrated by limitations imposed by DRM (I’ve run into those myself). But I think the DMCA (and the DRM that it makes possible) is a long, long way from the sorts of laws for which civil disobedience is an appropriate response. Simply not liking the law is not enough. There must be more, something that recognizes the nature of reasonable disagreement over law, and the range of possible legitimate variations about those laws.

Ed Felten points out some of the reasons that geeks felt so strongly about this case. Partly it was geeks’ knee-jerk opposition to censorship. Partly it was a protest against the DMCA.

There are a variety of reasons that the DMCA is bad public policy. I presented some of them in a paper I did for Cato last year. But instead of rehashing those arguments, let me quote an excellent essay by Paul Graham about America’s heritage of hacking. Prof. Picker dismissively characterizes this week’s incident as a dispute over “being able to play an encrypted high-definition DVD on your platform of choice,” but from the perspective of computer programmers it’s about something more fundamental than that:

Hacking predates computers. When he was working on the Manhattan Project, Richard Feynman used to amuse himself by breaking into safes containing secret documents. This tradition continues today. When we were in grad school, a hacker friend of mine who spent too much time around MIT had his own lock picking kit. (He now runs a hedge fund, a not unrelated enterprise.)

It is sometimes hard to explain to authorities why one would want to do such things. Another friend of mine once got in trouble with the government for breaking into computers. This had only recently been declared a crime, and the FBI found that their usual investigative technique didn’t work. Police investigation apparently begins with a motive. The usual motives are few: drugs, money, sex, revenge. Intellectual curiosity was not one of the motives on the FBI’s list. Indeed, the whole concept seemed foreign to them.

Those in authority tend to be annoyed by hackers’ general attitude of disobedience. But that disobedience is a byproduct of the qualities that make them good programmers. They may laugh at the CEO when he talks in generic corporate newspeech, but they also laugh at someone who tells them a certain problem can’t be solved. Suppress one, and you suppress the other…

It is by poking about inside current technology that hackers get ideas for the next generation. No thanks, intellectual homeowners may say, we don’t need any outside help. But they’re wrong. The next generation of computer technology has often — perhaps more often than not — been developed by outsiders.

In 1977 there was, no doubt, some group within IBM developing what they expected to be the next generation of business computer. They were mistaken. The next generation of business computer was being developed on entirely different lines by two long-haired guys called Steve in a garage in Los Altos. At about the same time, the powers that be were cooperating to develop the official next generation operating system, Multics. But two guys who thought Multics excessively complex went off and wrote their own. They gave it a name that was a joking reference to Multics: Unix.

The latest intellectual property laws impose unprecedented restrictions on the sort of poking around that leads to new ideas. In the past, a competitor might use patents to prevent you from selling a copy of something they made, but they couldn’t prevent you from taking one apart to see how it worked. The latest laws make this a crime. How are we to develop new technology if we can’t study current technology to figure out how to improve it?

Why are programmers so violently opposed to these laws? If I were a legislator, I’d be interested in this mystery — for the same reason that, if I were a farmer and suddenly heard a lot of squawking coming from my hen house one night, I’d want to go out and investigate. Hackers are not stupid, and unanimity is very rare in this world. So if they’re all squawking, perhaps there is something amiss.

Could it be that such laws, though intended to protect America, will actually harm it? Think about it. There is something very American about Feynman breaking into safes during the Manhattan Project. It’s hard to imagine the authorities having a sense of humor about such things over in Germany at that time. Maybe it’s not a coincidence.

Hackers are unruly. That is the essence of hacking. And it is also the essence of Americanness. It is no accident that Silicon Valley is in America, and not France, or Germany, or England, or Japan. In those countries, people color inside the lines.

Digging Piracy

Something rather astonishing happened on the Internet on Tuesday. Let me start with a bit of background: Hollywood has an encryption system called AACS that it uses to scramble the content on high-definition home video discs. As with all copy protection systems, it took hackers only a few months to find security flaws in it. In the process they extracted a 16-byte key (basically, a very long number) that allows programmers to unlock the encrypted content.

This key had been floating around various minor websites over the last couple of months. But last month, the organization that controls the AACS system began sending cease and desist letters to various ISPs demanding that the key be taken down from websites that were displaying it. In response, people all over the web began posting copies of the key, which is just 16 bytes, conventionally written as a 32-character string of hexadecimal digits.
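To make that concrete: AACS is built on AES-128, so those 16 bytes are an AES key. The Python sketch below (using the third-party cryptography package) is a generic AES round trip, not AACS’s actual title-key scheme, and the key in it is made up; the point is simply that anyone holding the right 16 bytes can turn the scrambled content back into the original, which is why a short string of hex digits became the object of cease and desist letters.

    import os
    from cryptography.hazmat.backends import default_backend
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    # A made-up 16-byte (128-bit) key, written as 32 hex digits -- NOT the real AACS key.
    key = bytes.fromhex("00112233445566778899aabbccddeeff")
    assert len(key) == 16

    iv = os.urandom(16)
    plaintext = b"one 16-byte blk!"  # AES operates on 16-byte blocks

    # Scramble the content with the key...
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv), default_backend()).encryptor()
    ciphertext = encryptor.update(plaintext) + encryptor.finalize()

    # ...and anyone who holds the same key (and IV) can unscramble it.
    decryptor = Cipher(algorithms.AES(key), modes.CBC(iv), default_backend()).decryptor()
    assert decryptor.update(ciphertext) + decryptor.finalize() == plaintext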

One of the sites that had the key on it is Digg. Digg is a news website in which all of the news stories are chosen by the collective wisdom of readers. Anyone can submit a story, and then other users can vote for (called “digging”) or against (called “burying”) individual stories. The stories that get the most votes get promoted to the front page where they’re viewed by hundreds of thousands of people.
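Digg never published its ranking algorithm, but the mechanism described above is simple enough to sketch. The few lines of Python below are a hypothetical illustration, not Digg’s actual code: every “digg” adds a vote, every “bury” subtracts one, and the front page is simply the highest-scoring submissions, with no human editor in the loop.

    from collections import Counter

    votes = Counter()  # story_id -> net score (diggs minus buries)

    def digg(story_id: str) -> None:
        votes[story_id] += 1   # a reader votes the story up

    def bury(story_id: str) -> None:
        votes[story_id] -= 1   # a reader votes the story down

    def front_page(top_n: int = 10) -> list[str]:
        """Promote the highest-scoring submissions automatically."""
        return [story for story, _ in votes.most_common(top_n)]

    digg("story-about-aacs-key"); digg("story-about-aacs-key"); bury("press-release")
    print(front_page())  # ['story-about-aacs-key', 'press-release']

Once ranking is delegated to a tally like this, as the rest of the post discusses, there is no editor sitting between readers and the front page.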

Somebody posted a story containing the AACS key, and Digg got a letter demanding that the story be removed. Digg complied. Princeton computer science professor Ed Felten describes what happened next:

Then Digg’s users revolted. As word got around about what Digg was doing, users launched a deluge of submissions to Digg, all mentioning or linking to the key. Digg’s administrators tried to keep up, but submissions showed up faster than the administrators could cancel them. For a while yesterday, the entire front page of Digg — the “hottest” pages according to Digg’s algorithms — consisted of links to the AACS key.

Last night, Digg capitulated to its users. Digg promised to stop removing links to the key, and Digg founder Kevin Rose even posted the key to the site himself.

Fred von Lohmann has a good rundown on the legal liability Digg could face from allowing the key to be posted on their site. But more interesting, I think, is the light the incident sheds on the broader debate over Internet piracy.

In a sense, Digg is a microcosm of the Internet at large. What makes the Internet so powerful is that we’re finding more and more clever ways to turn tasks that once required a human being over to machines. In the case of Digg, Kevin Rose found a way to automate the editing process. Instead of having a single human being read through all the stories and select the best ones, he created a system in which readers—who are on the site reading stories anyway—could quickly and easily choose stories for him. This has made the site extraordinarily successful.

But technology is amoral. A system that transmits news and information can just as easily be used to transmit pirated music, encryption keys, or even child pornography. Moreover, you can’t fine a computer algorithm or throw it in jail. Which means that as we automate more and more of our information-distribution systems, there are fewer and fewer ways for the legal authorities to exert control over what kinds of information are transmitted.

In the early days of the Internet, people created special-purpose tools like Napster whose primary use was to swap illicit materials. Copyright holders got those tools shut down. But increasingly, illicit information sharing is being done using the same tools we use to share legal content. Napster’s primary use was to share copyrighted music. But one of its successors, BitTorrent, is widely used to exchange legitimate content, including open source software, computer game updates, and even legitimate movie downloads. It would be unreasonable to outlaw BitTorrent, given how many legitimate uses it has.

On Tuesday, Kevin Rose had only two options: He could allow the encryption key to appear on his site, or he could shut his site down. Shutting his site down wasn’t really an option (it’s a multi-million dollar business), so his only real choice was to allow the content to be transmitted.

As a society, we face precisely the same dilemma with regard to the Internet as a whole. People use the Internet to transmit information most of us think they shouldn’t be transmitting. But our only alternatives are to cripple the Internet or turn the country into a police state. Nobody wants to do either of those things, so we’re going to have to live with the fact that any information a significant number of people want to share is going to be shared. We’re going to have to find ways to adjust our copyright system to a world in which anyone who’s willing to break the law can get most copyrighted content for free. As a supporter of copyright, I don’t find that a happy conclusion. But there doesn’t seem to be anything we can do about it.

REAL ID Comment Campaign

The comment period on Department of Homeland Security regulations implementing the REAL ID Act ends early next week. A broad coalition of groups has put together a Web page urging people to submit their comments. The page has instructions for commenting, a quite helpful thing given how arcane the regulatory process is.

Feel free to comment – good, bad, or indifferent – on the regs. My views are known, but the Department of Homeland Security doesn’t know yours.