The White House has issued a threat to veto the Cyber Intelligence Sharing and Protection Act (CISPA) in its current form, despite recent amendments aimed at assuaging the concerns of privacy and civil liberties advocates:
H.R. 3523 fails to provide authorities to ensure that the Nation’s core critical infrastructure is protected while repealing important provisions of electronic surveillance law without instituting corresponding privacy, confidentiality, and civil liberties safeguards. For example, the bill would allow broad sharing of information with governmental entities without establishing requirements for both industry and the Government to minimize and protect personally identifiable information. Moreover, such sharing should be accomplished in a way that permits appropriate sharing within the Government without undue restrictions imposed by private sector companies that share information.
The bill also lacks sufficient limitations on the sharing of personally identifiable information between private entities and does not contain adequate oversight or accountability measures necessary to ensure that the data is used only for appropriate purposes. Citizens have a right to know that corporations will be held legally accountable for failing to safeguard personal information adequately. The Government, rather than establishing a new antitrust exemption under this bill, should ensure that information is not shared for anti-competitive purposes.
Unfortunately, as Paul Rosenzweig notes, the other main reason for the administration’s opposition is that the bill doesn’t grant the government enough regulatory power over “critical infrastructure” computer networks. Still, this seems like an opportunity to pause and consider what an acceptable cybersecurity information sharing bill might look like. Because notwithstanding all the hype, there are genuine risks and vulnerabilities that might be mitigated by better information sharing—and that may indeed require Congressional action. But a narrowly tailored approach that respects privacy and civil liberties will look very different from CISPA.
As I explained in a post last year, CISPA worked by creating a sweeping exception to all other privacy and surveillance laws, granting blanket immunity to any “entity” that chose to share vaguely defined “cyber threat information”—potentially including the contents of e-mails or other online communications—with both private actors and the government. When civil liberties advocates cried foul at the prospect of such vast quantities of private data being handed over to government on a silver platter, the bill’s supporters tried to placate them by tacking on an array of after-the-fact anonymization requirements and use restrictions—forbidding the use of the data except for a “cybersecurity purpose” or for “the protection of the national security of the United States.”
That wasn’t much consolation to anyone who’s watched how the government has tried to interpret similar “purpose” restrictions in the past. In 2002, for example, then–Solicitor General Ted Olson argued for a highly expansive view of the “foreign intelligence purposes” for which information obtained through national security wiretaps could be used, including using evidence of misconduct unrelated to terrorism or espionage to force people to become informants. If a wiretap turned up evidence of tax evasion or rape, for instance, Olson suggested the government “could go to that individual and say we’ve got this information and we’re prosecuting and you might be able to help us. I don’t want to foreclose that.” It’s no great leap to imagine a future solicitor general arguing that extorting the cooperation of hackers, penetration testers, or other tech professionals would similarly serve a “cybersecurity purpose.”
Yet it shouldn’t be that hard to craft legislation that would allow sharing of the broad categories of information that are most useful for improving security but don’t raise privacy or civil liberties concerns. Here’s a crazy idea: Instead of indiscriminately adding a cybersecurity loophole to every statute on the books, why not figure out which specific kinds of information are useful to security professionals without compromising privacy, figure out which laws raise obstacles to that sharing, and then craft appropriately narrow exemptions? (One assumes the intelligence agencies can be afforded more discretion about when to share the information already in their own possession—whatever else one might say about it, “oversharing” is not among the NSA’s problems.)
The exceptions could be appropriately narrowly tailored depending on the sensitivity of the information involved. For instance, different sections of the Electronic Communications Privacy Act deal with different kinds of data. Subsections (a)(1) and (a)(2) of 18 USC §2702 deal with the contents of communications in transit through or stored by a communications provider, generally prohibiting use or disclosure of that information without specific consent. Subsection (a)(3) covers subscriber information and transactional data about those communications, and generally permits voluntary sharing, but specifically prohibits sharing with governmental entities. Since that transactional information is typically less sensitive than the communications themselves, an exemption there might allow providers a fair amount of discretion to determine what constitutes "cyber threat information" and permit sharing with the government as well, subject to appropriate anonymization and use requirements. For the more sensitive contents, the exception might be limited to a relatively specific laundry list of kinds of data that are both unquestionably security-related and limited in their implications for privacy, such as malware signatures and attack payloads.

Those who worship at the altar of "tech neutrality" complain that this would limit the flexibility of the law over time, requiring Congress to revisit and revise the list as technology and the nature of the threat evolve. But if the alternative is barely constrained permission to start shoveling sensitive private information into the government's maw—precisely the kind of large-scale "data breach" that "cybersecurity" is supposed to prevent—having to tweak the language once or twice a decade seems like a reasonable price to pay.
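To make the "laundry list" idea concrete, here is a minimal sketch of what category-based sharing might look like in practice: only explicitly enumerated, low-sensitivity kinds of threat data are eligible for disclosure, and everything else is withheld by default. The field names and categories below are purely illustrative assumptions, not anything drawn from the bill or from any real standard.

```python
# Illustrative sketch of an enumerated "laundry list" filter: only
# explicitly whitelisted categories of threat data may be shared;
# message contents and subscriber identity are withheld by default.
# All field names here are hypothetical.

SHAREABLE_CATEGORIES = {
    "malware_signature",    # hash or byte pattern of known malware
    "attack_payload",       # captured exploit code, stripped of user data
    "source_ip_reputation", # aggregate reputation data, not per-user logs
}

def filter_threat_report(report: dict) -> dict:
    """Return only the fields that fall into an enumerated shareable
    category; all other fields are dropped before disclosure."""
    return {k: v for k, v in report.items() if k in SHAREABLE_CATEGORIES}

incident = {
    "malware_signature": "sha256:9f2a...",
    "attack_payload": b"\x90\x90\xcc",
    "email_contents": "Dear Bob, ...",   # contents: never shared
    "subscriber_name": "Alice Example",  # PII: never shared
}

shared = filter_threat_report(incident)
```

The design choice mirrors the argument in the text: a default-deny whitelist has to be updated as threats evolve, but it fails safe, whereas a vague "cyber threat information" standard fails open.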