Tag: privacy

It’s Illegal to Say ‘None of Your Damn Business’

The government’s troops are rallying behind the Census Bureau’s American Community Survey. “After the House voted this month to defund a major part of the U.S. Census Bureau, the agency is taking the threat very seriously,” reports the Washington Times, “with its supporters in both business and government rallying to preserve the annual questionnaire.”

Wait. Who could be against the Census Bureau? Its constitutional charter is to enumerate the population every ten years for the purpose of apportioning representation in Congress. This is a necessary and unremarkable administrative function.

Oh, wait—again. Government bloat is a law of gravity, and the Census Bureau does far, far more than count noses. Its American Community Survey has made the Census Bureau the research arm for the welfare/redistribution state and a source of corporate welfare in the form of demographic data about Americans.

So Census goes around asking people dozens of questions that have nothing to do with the agency’s constitutional purpose.

The ACS is controversial enough among the strongly principled that Census has a Web page entitled: “Is the American Community Survey legitimate?” Their answer: “Yes. The American Community Survey is legitimate. It is a survey conducted by the U.S. Census Bureau.” (Did you know there’s a whole class on the “appeal to authority” at Fallacy University?…)

The real authority they cite is Title 13 of the U.S. Code, which, in section 221, allows the government to fine people who refuse to answer the Census Bureau’s questions. It’s illegal to say “none of your damn business” when a government official comes around asking about your toilet. I’ve written many times, in long form and short, that the helping hand of government strips away privacy before it goes to work.

So it’s nice to see that Rand Paul (R-KY) in the Senate and Ted Poe (R-TX) in the House have introduced a bill to make the American Community Survey voluntary, unless it’s a question that the Census actually needs for its constitutional purposes. Reading public comments on the House bill is particularly interesting. There are a good number of people who want to be left well enough alone. They shouldn’t be subject to penalties for saying so. It’s a matter of principle and privacy.

I Second That Skepticism

The ACLU’s Chris Calabrese notes that nominations to the Privacy and Civil Liberties Oversight Board were forwarded from the Senate Judiciary Committee to the full Senate this morning. Congress created the Board in August 2007, and we have waited, and waited, and waited while the Bush and Obama administrations neglected to appoint anyone to it.

Calabrese is rightly skeptical that the “PCLOB” can make a difference:

[T]he national security establishment is huge, with tens of thousands of employees and a budget of more than $60 billion. The NSA alone has more than 30,000 employees. Contrast that with the PCLOB. It’s currently authorized (if it finally gets filled) to spend a whopping $900,000 and hire ten full-time employees for the 2012 fiscal year. With this level of staffing, it’s hard to imagine that the Board and its investigators can even begin to understand this vast national security infrastructure, never mind properly oversee it.

I have a fair amount of experience with privacy oversight in the U.S. government, having served on the Department of Homeland Security’s Data Privacy and Integrity Advisory Committee. That experience has fairly well validated my thinking in 2001, before there were “privacy officers”:

The appointment of a privacy czar or creation of a privacy office is a poor substitute for directly addressing the voraciousness of many government programs for citizens’ personal information. Political leaders themselves should incorporate privacy into their daily consideration of policy options, rather than farming out that responsibility to officials who may or may not have a say in government policy.

To see how the PCLOB fits into government thinking, we can look at a 2007 speech given by Donald Kerr, principal deputy director of National Intelligence. To him, “privacy” is giving the government access to all the data it wants, subject to oversight.

[P]rivacy, I would offer, is a system of laws, rules, and customs with an infrastructure of Inspectors General, oversight committees, and privacy boards on which our intelligence community commitment is based and measured. And it is that framework that we need to grow and nourish and adjust as our cultures change.

That’s not privacy.

So don’t think for a minute that privacy will be better protected with a PCLOB in place, except perhaps marginally in the few programs that the Board dips into.

The membership of the board is slated to be: Jim Dempsey of the Center for Democracy and Technology, a sincere and knowledgeable privacy player, whose “player” role I find incompatible with producing good privacy outcomes; Elisebeth Collins Cook, a former Department of Justice lawyer whom I had never heard of before her nomination; Rachel Brand, an attorney for the U.S. Chamber of Commerce also unknown to me; Patricia Wald, a former federal judge for the D.C. Circuit whose privacy work is unknown to me; and David Medine, currently a WilmerHale partner who will chair the board. Medine is unquestionably government-friendly. He was a Federal Trade Commission bureaucrat who helped draft the Gramm-Leach-Bliley financial privacy regulations and the Children’s Online Privacy Protection Act (COPPA) regulations.

On Breach of Decorum and Government Growth

Last week, the Center for Democracy and Technology changed its position on CISPA, the Cyber Intelligence Sharing and Protection Act, two times in short succession, easing the way for House passage of a bill profoundly threatening to privacy.

Declan McCullagh of C|Net wrote a story about it called “Advocacy Group Flip-Flops Twice Over CISPA Surveillance Bill.” In it, he quoted me saying: “A lot of people in Washington, D.C. think that working with CDT means working for good values like privacy. But CDT’s number one goal is having a seat at the table. And CDT will negotiate away privacy toward that end.”

That comment netted some interesting reactions. Some were gleeful about this “emperor-has-no-clothes” moment for CDT. To others, I was inappropriately “insulting” to the good people at CDT. This makes the whole thing worthy of further exploration. How could I say something mean like that about an organization whose staff spend so much time working in good faith on improving privacy protections? Some folks there absolutely do. This does not overcome the institutional role CDT often plays, which I have not found so creditable. (More on that below. Far below…)

First, though, let me illustrate how CDT helped smooth the way for passage of the bill:

Congress is nothing if not ignorant about cybersecurity. It has no idea what to do about the myriad problems that exist in securing computers, networks, and data. So its leaders have fixed on “information sharing” as a panacea.

Because the nature and scope of the problems are unknown, the laws that stand in the way of relevant information sharing are unknown. The solution? Scythe down as much law as possible. (What’s actually needed, most likely, is a narrow amendment to the Electronic Communications Privacy Act (ECPA). Nothing of the sort is yet in the offing.) But this creates a privacy problem: an “information sharing” bill could facilitate promiscuous sharing of personal information with government agencies, including the NSA.

On the House floor last week, the leading Republican sponsor of CISPA, Mike Rogers (R-MI), spoke endlessly about privacy and civil liberties, the negotiations, and the process he had undertaken to try to resolve problems in the privacy area. At the close of debate on the rule that would govern debate on the bill, he said:

The amendments that are following here are months of negotiation and work with many organizations—privacy groups. We have worked language with the Center for Democracy and Technology, and they just the other day said they applauded our progress on where we’re going with privacy and civil liberties. So we have included a lot of folks.

You see, just days before, CDT had issued a blog post saying that it would “not oppose the process moving forward in the House.” The full text of that sentence is actually quite precious because it shows how little CDT got in exchange for publicly withdrawing opposition to the bill. Along with citing “good progress,” CDT president and CEO Leslie Harris wrote:

Recognizing the importance of the cybersecurity issue, in deference to the good faith efforts made by Chairman Rogers and Ranking Member Ruppersberger, and on the understanding that amendments will be considered by the House to address our concerns, we will not oppose the process moving forward in the House.

Cybersecurity is an important issue—never mind whether the bill would actually help with it. The leadership of the House Intelligence Committee has acted in good faith. And amendments will evidently be forthcoming in the House. So go ahead and pass a bill not ready to become law, in light of “good progress.”

Then CDT got spun.

As McCullagh tells it:

The bill’s authors seized on CDT’s statement to argue that the anti-CISPA coalition was fragmenting, with an aide to House Intelligence Committee Chairman Mike Rogers (R-Mich.) sending reporters e-mail this morning, recalled a few minutes later, proclaiming: “CDT Drops Opposition to CISPA as Bill Moves to House Floor.” And the Information Technology Industry Council, which is unabashedly pro-CISPA, said it “applauds” the “agreement between CISPA sponsors and CDT.”

CDT quickly reversed itself, but the damage was done. Chairman Rogers could make an accurate but misleading floor statement omitting the fact that CDT had again reversed itself. This signaled to members of Congress and their staffs—who don’t pay close attention to subtle shifts in the views of organizations like CDT—that the privacy issues were under control. They could vote for CISPA without getting privacy blow-back. Despite furious efforts by groups like the Electronic Frontier Foundation and the ACLU, the bill passed 248 to 168.

Defenders of CDT will point out—accurately—that it argued laboriously for improvements to the bill. And with the bill’s passage inevitable, that was an essential benefit to the privacy side.

Well, yes and no. To get at that question, let’s talk about how groups represent the public’s interests in Washington, D.C. We’ll design a simplified representation game with the following cast of characters:

  • one powerful legislator, antagonistic to privacy, whose name is “S.R. Veillance”;
  • twenty privacy advocacy groups (Groups A through T); and
  • 20,000 people who rely on these advocacy groups to protect their privacy interests.

At the outset, the 20,000 people divide their privacy “chits”—that is, their donations and their willingness to act politically—equally among the groups. Based on their perceptions of the groups’ actions and relevance, the people re-assign their chits each legislative session.

Mr. Veillance has an anti-privacy bill he would like to get passed, but he knows it will meet resistance if he doesn’t get 2,500 privacy chits to signal that his bill isn’t that bad. If none of the groups give him any privacy chits, his legislation will not pass, so Mr. Veillance goes from group to group bargaining in good faith and signaling that he intends to do all he can to pass his bill. He will reward the groups that work with him by including such groups in future negotiations on future bills. He will penalize the groups that do not by excluding them from future negotiations.

What we have is a game somewhat like the prisoner’s dilemma in game theory. Though it is in the best interest of the society overall for the groups to cooperate and hold the line against a bill, individual groups can advantage themselves by “defecting” from the interests of all. These defectors will be at the table the next time an anti-privacy bill is negotiated.

Three groups—let’s say Group C, Group D, and Group T—defect from the pack. They make deals with Mr. Veillance to improve his bill, and in exchange they give him their privacy chits. He uses their 3,000 chits to signal to his colleagues that they can vote for the bill without fear of privacy-based repercussions.

At the end of the first round, Mr. Veillance has passed his anti-privacy legislation (though weakened, from his perspective). Groups C, D, and T did improve the bill, making it less privacy-invasive than it otherwise would have been, and they have also positioned themselves to be more relevant to future privacy debates because they will have a seat at the table. Hindsight makes the passage of the bill look inevitable, and CDT looks all the wiser for working with Sir Veillance while others futilely opposed the bill.

Thus, having defected, CDT is now able to get more of people’s privacy chits during the next legislative session, so it has more bargaining power and money than other privacy groups. That bargaining power is relevant, though, only if Mr. Veillance moves more bills in the future. To maintain its bargaining power and income, it is in the interest of CDT to see that legislation passes regularly. If anti-privacy legislation never passes, CDT’s unique role as a negotiator will not be valued and its ability to gather chits will diminish over time.
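To see how those incentives compound over repeated sessions, here is a minimal sketch in Python. The group count, chit total, and 2,500-chit threshold come from the game as described above; the rule by which people shift a fraction of their chits toward groups seen as having “a seat at the table” (the SHIFT constant) is an assumption added purely for illustration.

# A minimal sketch of the "representation game" described above. The group
# count, chit total, and 2,500-chit threshold come from the text; the rule
# for how people re-assign chits toward "relevant" groups each session
# (SHIFT) is an assumption made only for illustration.

N_GROUPS = 20          # Groups A through T
TOTAL_CHITS = 20_000   # one chit per person
THRESHOLD = 2_500      # chits Mr. Veillance needs to pass a bill
SHIFT = 0.10           # assumed share of chits that migrates toward
                       # groups seen as having "a seat at the table"

def play_sessions(n_sessions=5, defectors=(2, 3, 19)):
    """Run repeated sessions; groups 2, 3, and 19 stand in for C, D, and T."""
    chits = [TOTAL_CHITS / N_GROUPS] * N_GROUPS   # equal split at the outset
    for session in range(1, n_sessions + 1):
        pledged = sum(chits[g] for g in defectors)
        bill_passes = pledged >= THRESHOLD
        if bill_passes:
            # Defectors look relevant, so donors move some chits their way.
            moved = sum(chits[g] * SHIFT for g in range(N_GROUPS)
                        if g not in defectors)
            for g in range(N_GROUPS):
                if g in defectors:
                    chits[g] += moved / len(defectors)
                else:
                    chits[g] *= 1 - SHIFT
        print(f"session {session}: bill {'passes' if bill_passes else 'fails'}; "
              f"defectors hold {pledged:,.0f} chits")
    return chits

play_sessions()

Run for a few sessions, the defecting groups’ share of chits grows each time a bill passes—which is the dynamic the next paragraph describes in CDT’s case.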

CDT plays a role in “improving” individual pieces of legislation to make them less privacy-invasive and it helps to ensure that improved—yet still privacy-invasive—legislation passes. Over the long run, to keep its seat at the table, CDT bargains away privacy.

This highly simplified representation game repeats itself across many issue-dimensions in every bill, and it involves many more, highly varied actors using widely differing influence “chits.” The power exchanges and signaling among parties end up looking like a kaleidoscope rather than the linear story of an organization subtly putting its own goals ahead of the public interest.

Most people working in Washington, D.C., and almost assuredly everyone at CDT, have no awareness that they live under the collective action problem illustrated by this game. This is why government grows and privacy recedes.

In his article, McCullagh cites CDT founder Jerry Berman’s role in the 1994 passage of CALEA, the Communications Assistance for Law Enforcement Act. I took particular interest in CDT’s 2009 backing of the REAL ID revival bill, PASS ID. In 2006, CDT’s Jim Dempsey helped give privacy cover to the use of RFID in identification documents, contrary to the principle that RFID is for products, not people. A comprehensive study of CDT’s institutional behavior to confirm or deny my theory of its behavior would be very complex and time-consuming.

But divide and conquer works well. My experience is that CDT is routinely the first defector from the privacy coalition despite the earnest good intentions of many individual CDTers. And it’s why I say, perhaps in breach of decorum, things like: “A lot of people in Washington, D.C. think that working with CDT means working for good values like privacy. But CDT’s number one goal is having a seat at the table. And CDT will negotiate away privacy toward that end.”

Cybersecurity Bills? No, Thanks

Prominent academics, experienced engineers, and professionals published an open letter to Congress yesterday, stating their opposition to CISPA and other overly broad cybersecurity bills. Highlight:

We take security very seriously, but we fervently believe that strong computer and network security does not require Internet users to sacrifice their privacy and civil liberties. The bills currently under consideration, including Rep. Rogers’ Cyber Intelligence Sharing and Protection Act of 2011 (H.R. 3523) and Sen. McCain’s SECURE IT Act (S. 2151), are drafted to allow entities who participate in relaying or receiving Internet traffic to freely monitor and redistribute those network communications. The bills nullify current legal protections against wiretapping and similar civil liberties violations for that kind of broad data sharing. By encouraging the transfer of users’ private communications to US Federal agencies, and lacking good public accountability or transparency, these “cybersecurity” bills unnecessarily trade our civil liberties for the promise of improved network security.

Cato’s recent Capitol Hill briefing on cybersecurity covered many similar points, and additional ones, too. CISPA and three other bills are scheduled for consideration on the House floor this week.

Cybersecurity: Talking Points vs. Substance

In the late stages of a legislative battle, it often comes down to “talking points.” Whoever puts out the message that sticks wins the debate—damn the substance.

Rep. Mike Rogers (R-MI) is prioritizing talking points over substance if a CQ report about a speech he gave to the Ripon Society is accurate. (He put it up on his Web site, from which one could infer endorsement. Rogers is not a cosponsor of SOPA, the Stop Online Piracy Act, so let’s not have the government taking down the house.gov domain just now, mkay?)

From the report:

“We’re finding language we can agree on,” he said in a speech to the Ripon Society, a moderate Republican group. “Are we going to agree on everything? Probably not. They don’t want anything, anytime, ever.” But, Rogers said, he hopes to give the groups “language that at least allows them to sleep at night, because I can’t sleep at night over these threats.”

This seems to suggest that a few tweaks to language, well in the works with the privacy community, will make his version of cybersecurity legislation a fait accompli. I’m a keen observer of the privacy groups, and I see no evidence that this is so. The bill is so broadly written that it is probably irreparable.

And that is a product of Congress’s approach to this problem: Congress does not know how to address the thousands of different problems that fall under the umbrella term “cybersecurity,” so it has fixed on promiscuous (and legally immunized) “information sharing” with government security agencies as the “solution.” Privacy can rightly be traded for other goods such as security, but with no benefits discernible from wanton information sharing, one shouldn’t expect sign-off from the privacy community.

That, though, is not actually the message of the privacy community, whose members, on average, trust the government more than most conservatives and libertarians do. The mainstream privacy community probably would accept highly regulatory and poorly formed cybersecurity legislation if it had enough privacy protections. But Rogers’ talking points try to push privacy folk onto the “unreasonable” part of the chess board, saying, “They don’t want anything, anytime, ever.”

That’s closer to my view than anything the orthodox privacy advocates are saying. Cybersecurity is not an area where the federal government can do much to help. But even I said in my 2009 testimony to the House Science Committee that the federal government has a role in improving cybersecurity: being a smart consumer that influences technology markets for the better.

What Representative Rogers—and all advocates for cybersecurity legislation—have failed to do is to make the affirmative case for their bills. “I can’t sleep at night” is not an answer to the case, carefully made by Jerry Brito of the Mercatus Center at Cato’s recent Hill briefing, that the threat from cyberattacks is overblown.

The briefing was called “Cybersecurity: Will Federal Regulation Help?” That’s a place one can go for substance.

From Cybercrime Statistics to Cyberspying

Someone finally decided to examine “cybercrime” statistics, and here’s what they found:

The cybercrime surveys we have examined exhibit [a] pattern of enormous, unverified outliers dominating the data. In some, 90 percent of the estimate appears to come from the answers of one or two individuals. In a 2006 survey of identity theft by the Federal Trade Commission, two respondents gave answers that would have added $37 billion to the estimate, dwarfing that of all other respondents combined. This is not simply a failure to achieve perfection or a matter of a few percentage points; it is the rule, rather than the exception. Among dozens of surveys, from security vendors, industry analysts and government agencies, we have not found one that appears free of this upward bias.

That’s Dinei Florêncio and Cormac Herley of Microsoft Research in a New York Times piece entitled: “The Cybercrime Wave That Wasn’t.”

You see, cybercrime statistics have been generated using surveys of individuals and businesses, but you can’t generate valid numerical results that way. An opinion poll’s errors will naturally cancel out—there are a roughly equal number of wrongly stated “thumbs-up”s and “thumbs-down”s.

When you ask people to estimate losses, though, they can never estimate less than zero, so errors will always push results to the high side. High-side errors extrapolated society-wide drive the perception that cybercrime is out of control.
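A toy calculation makes the asymmetry concrete. All of the numbers below are invented for illustration; the point is only that one exaggerated answer, extrapolated to the whole population, swamps everything else, while underestimates can never be negative enough to offset it.

# Toy survey extrapolation; every number here is invented for illustration.
import random

random.seed(1)

POPULATION = 100_000_000   # hypothetical number of households
SAMPLE_SIZE = 1_000
TRUE_MEAN_LOSS = 20        # assumed true average loss per household, in dollars

# Most respondents report no loss or a small one...
responses = [random.choice([0, 0, 0, 0, 40, 80]) for _ in range(SAMPLE_SIZE - 1)]
# ...but a single respondent overstates a loss by several orders of magnitude.
responses.append(5_000_000)

sample_mean = sum(responses) / len(responses)
estimate = sample_mean * POPULATION

print(f"extrapolated national loss: ${estimate / 1e9:.1f} billion")
print(f"actual national loss:       ${TRUE_MEAN_LOSS * POPULATION / 1e9:.1f} billion")

One respondent’s error turns a $2 billion problem into something on the order of half a trillion dollars—roughly the pattern Florêncio and Herley describe in the surveys they examined.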

There are more drivers of excess insecurity than just bad loss estimates. There are also data breach notification laws, which require data holders to report various kinds of personal data spillage. These reports are the high-tech, grown-up version of a favorite schoolyard taunt: “Your epidermis is showing!” Epidermis is, of course, a scientific name for skin. It often doesn’t matter that one’s epidermis is showing. The questions are: What part of the epidermis? And what social or economic consequences does it have?

Most breached data is put to no use whatsoever. A 2005 study of data breaches found the highest fraudulent misuse rate for all breaches under examination to be 0.098 percent—less than one in 1,000 identities. (The Government Accountability Office concurs that misuse of breached data is rare.) Larger breaches tend to have lower misuse rates, which makes popular reporting on gross numbers of personal data breaches misleading. Identity frauds are limited by the time and difficulty of executing them, not by access to data.
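A back-of-the-envelope sketch shows why gross record counts mislead. The 0.098 percent figure is from the study cited above; the cap on how many frauds thieves can actually execute in a given period is an invented stand-in for the “time and difficulty” constraint.

# Sketch: expected frauds from breaches of different sizes. MISUSE_RATE is the
# 0.098 percent figure cited above; FRAUD_CAPACITY is an invented assumption
# standing in for the limited time and difficulty of executing frauds.
MISUSE_RATE = 0.00098
FRAUD_CAPACITY = 1_500   # assumed maximum frauds the thieves can actually work

for records in (10_000, 100_000, 1_000_000, 10_000_000):
    expected_frauds = min(records * MISUSE_RATE, FRAUD_CAPACITY)
    effective_rate = expected_frauds / records
    print(f"{records:>12,} records breached -> ~{expected_frauds:,.0f} frauds "
          f"({effective_rate:.4%} misuse rate)")

Under that assumption, a breach 100 times larger produces nowhere near 100 times the fraud, so the headline record count says little about actual harm.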

Why does excess cyber-insecurity matter? Doesn’t it beneficially drive companies to adopt better security practices for personal data?

It undoubtedly does, but security is not costless, and money driven to data security measures comes from other uses that might do more to make consumers better off. More importantly, though, data breach agitation and distended crime statistics have joined with other cybersecurity hype to generate a commitment in Congress to pass cybersecurity legislation.

Cybersecurity bills pending in both the House and Senate could have gruesome consequences for privacy because of “information sharing” provisions that immunize companies sharing data with the government for cybersecurity purposes. The potential for a huge, lawless cyberspying operation is significant if anyone can feed data to the government free of liability, including the privacy protections in property law, torts, and contract. Congress would not improve things by regulating in the name of cybersecurity, and it just might make things a lot worse.

It is ironic that overwrought claims about cybercrime and data breaches could be privacy’s undoing, but they just might be.

The Census’ Broken Privacy Promise

When the 1940 census was collected, the public was reassured that the information it gathered would be kept private. “No one has access to your census record except you,” the public was told. President Franklin Roosevelt said: “There need be no fear that any disclosure will be made regarding any individual or his affairs.”

Apparently the promises about what the government would do with census information have their limits. Today the 1940 census goes online.

When the Census Bureau transferred the data to the National Archives, it agreed to the release of the data 72 years after its collection. So much for those privacy promises.

Adam Marcus of TechFreedom writes on C|Net:

Eighty-seven percent of Americans can find a direct family link to one or more of the 132+ million people listed on those rolls. The 1940 census included 65 questions, with an additional 16 questions asked of a random 5 percent sample of people. You can find out what your father did, how much he made, or if he was on the dole. You may be able to find out if your mother had an illegitimate child before she married your father.

To be sure, this data will open a fascinating trove for researchers into life 70 years ago. But the Federal Trade Commission would not recognize a “fascinating trove” exception if a private company were to release data it had collected under promises of confidentiality.

Government officials endlessly point the finger at the private sector for being a privacy scourge. Senator Al Franken did so last week in a speech to the American Bar Association (text; Fisking). He’s the chairman of a Senate subcommittee dedicated to examining the defects in private sector information practices. Meanwhile, the federal government is building a massive data and analysis center to warehouse information hoovered from our private communications, and the Obama Administration recently extended to five years the amount of time it can retain private information about Americans under no suspicion of ties to terrorism.

Marcus has the bare minimum lesson to take from this episode: “Remember this in 2020.”