Tag: electronic surveillance

Wyden Pressing Intel Officials on Domestic Location Tracking

Back in May, during the debates over reauthorization of the Patriot Act, Sens. Ron Wyden (D-OR) and Mark Udall (D-CO) began raising a fuss about a secret interpretation of the law’s so-called “business records” authority, known to wonks as Section 215, arguing that intelligence agencies had twisted the statute to give themselves domestic surveillance powers Congress had not anticipated or intended. At the time, I marshaled a fair amount of circumstantial evidence suggesting that the “secret authority” involved location tracking of cell phones. Wyden backed off after being promised a secret hearing to address his concerns—but indicated he’d return to the issue if he remained unsatisfied. The hearing occurred early last month. Now, I suspect, we’re seeing the other shoe drop.

At a confirmation hearing this morning for Matthew Olsen, who’s been tapped to head the National Counterterrorism Center, Wyden repeatedly asked the nominee whether the intelligence community “use[s] cell site data to track the location of Americans inside the country.” This comes on the heels of a letter Wyden and Udall sent to Director of National Intelligence James Clapper demanding an answer to the same question. Olsen was unsurprisingly vague, calling it a “complicated question” but allowing that there were “certain circumstances where that authority may exist.” The committee was promised a memo explaining those “circumstances” by September. That means that just about ten years after Congress approved the Patriot Act, a handful of legislators may get the privilege of learning what it does. Ah, democracy.

On a related note, one of the data points I cited in my previous post was that Wyden’s Geolocation Privacy and Surveillance Act had, somewhat unusually, been structured primarily as a reform to the Foreign Intelligence Surveillance Act (FISA), which governs intelligence spying, only later incorporating the same protections into the statutes governing ordinary criminal investigations. Especially striking was the inclusion of a specific prohibition on the use of Section 215 for location tracking, above and beyond the general warrant requirement. Since then, however, the bill has gained Republican co-sponsorship and dropped the changes to FISA that had previously been its centerpiece. Instead, the bill now contains an explicit exception for FISA “electronic surveillance,” in addition to the section providing for location tracking authorized by either a criminal or a FISA warrant. I’m not privy to whatever negotiations necessitated that change, but it’s hard to imagine anyone would have insisted on such a substantial restructuring if the intelligence community weren’t doing at least some location tracking pursuant to a lower standard than probable cause.

It’s not entirely clear exactly what the current version of the bill would permit, however. FISA is mentioned twice in the draft: once as part of a vague general exemption for “electronic surveillance,” and then again as one of the sources of authority for a “warrant” to do geolocation tracking. On a first pass, though, those two provisions ought to overlap, because FISA requires a secret intelligence court to issue a warrant based on probable cause (to believe the target is an “agent of a foreign power”) for government monitoring that falls within FISA’s definition of “electronic surveillance,” in contrast with the far laxer standards that apply to the use of Section 215. It’s therefore an interesting puzzle what, exactly, that exception is meant to permit. Possibly the idea is to permit the (otherwise prohibited) “use” and “disclosure” of geolocation information already obtained without a warrant in order to target future judicially authorized “electronic surveillance,” but it’s hard to be sure. What does seem increasingly sure, however, is that location tracking is connected to the controversy over Section 215—and that Congress owes the American people a debate over the proper use and scope of that power, which it has thus far refused to have.

Top NSA Mathematician: ‘I should apologize to the American people. It’s violated everyone’s rights.’

If you’re a telecommunications firm that helped the National Security Agency illegally spy on your customers without a court order, Sen. Barack Obama will happily vote for legislation he once promised to filibuster in order to secure retroactive immunity. If you’re implicated in the use of torture as an interrogation tactic, you can breathe easy knowing President Barack Obama thinks it’s in the country’s best interests to “look forward, not back.” But if you were a government official spurred by conscience to blow the whistle on government malfeasance or ineptitude in the war on terror? As Jane Mayer details in a must-read New Yorker article, you’d better watch out! This administration is shattering records for highly selective prosecutions under the Espionage Act—and the primary criterion seems to be not whether national security was harmed in any discernible way by your disclosures, but the degree of embarrassment they caused the government.

The whole thing is fascinating, but I’m especially interested in the discussion of how electronic surveillance tools that came with built-in privacy controls were tossed in favor of more indiscriminate programs that, by the way, didn’t work and generated huge cost overruns. The most striking quotations come from disillusioned Republican intelligence officials. Here’s Bill Binney, a top NSA mathematician and analyst, on the uses to which his work was put:

Binney expressed terrible remorse over the way some of his algorithms were used after 9/11. ThinThread, the “little program” that he invented to track enemies outside the U.S., “got twisted,” and was used for both foreign and domestic spying: “I should apologize to the American people. It’s violated everyone’s rights. It can be used to eavesdrop on the whole world.”

One GOP staffer on the House Intelligence Committee recounted an exchange with then-NSA head Michael Hayden:

[Diane] Roark, who had substantial influence over N.S.A. budget appropriations, was an early champion of Binney’s ThinThread project. She was dismayed, she says, to hear that it had evolved into a means of domestic surveillance, and felt personally responsible. Her oversight committee had been created after Watergate specifically to curb such abuses. “It was my duty to oppose it,” she told me. “That is why oversight existed, so that these things didn’t happen again. I’m not an attorney, but I thought that there was no way it was constitutional.” […] She asked Hayden why the N.S.A. had chosen not to include privacy protections for Americans. She says that he “kept not answering. Finally, he mumbled, and looked down, and said, ‘We didn’t need them. We had the power.’ He didn’t even look me in the eye. I was flabbergasted.”

Remember, these aren’t hippies from The Nation, or ACLU attorneys, or even (ahem) wild-eyed Cato libertarians. They’re registered Republicans appalled by the corruption of the intelligence mission to which they’d devoted their professional lives.

Designing an Insecure Internet

If there were any doubt that the 90s are back in style, witness the Obama administration’s attempt to reignite the Crypto Wars by seeking legislation that would force Internet services to redesign their networks and products to provide a centralized mechanism for decrypting user communications. It cannot be stressed enough what a radical—and terrible—idea this is.  I’ll be writing on this at greater length this week, but a few quick points.

First, while the Communications Assistance for Law Enforcement Act (CALEA) already requires phone and broadband providers to build in interception capacity at their network hubs, this proposed requirement—at least judging from the press description, since there’s no legislative text yet—is both broader and more drastic. It appears that it would apply to the whole panoply of online firms offering secure communication services, not just big carriers, imposing a greater relative burden on those smaller firms. More importantly, it’s not just mandating that already-centralized systems install a government backdoor. Rather, if I understand it correctly, the proposal would insist on a centralized (and therefore less secure) architecture for secure communications, as opposed to an end-to-end model where encryption is handled client-side. In effect, the government is insisting on the right to make a macro-design choice between competing network models for thousands of companies.
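
To make the architectural distinction concrete, here is a minimal, purely illustrative Python sketch (using the third-party cryptography package; the names and key handling are my own assumptions, not any actual provider’s design). In the end-to-end model the provider only relays ciphertext it cannot read; in the centralized-decryption model the provider holds key material that can recover every user’s messages, which is precisely the single point of failure at issue.

```python
# Illustrative sketch only -- not any real provider's design.
# Requires the third-party "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet

# --- End-to-end model: keys exist only on the clients ---------------------
client_key = Fernet.generate_key()          # exchanged between the two endpoints
ciphertext = Fernet(client_key).encrypt(b"meet at noon")

relay_storage = []                          # all the provider ever sees
relay_storage.append(ciphertext)            # provider stores/forwards opaque bytes
print(Fernet(client_key).decrypt(relay_storage[0]))   # only the endpoints can read it

# --- Centralized-decryption model: provider (or escrow) holds the keys ----
escrow_key = Fernet.generate_key()          # held server-side to satisfy the mandate
stored = Fernet(escrow_key).encrypt(b"meet at noon")
# Anyone who obtains escrow_key -- by court order, insider abuse, or a breach
# of the provider -- can recover every user's plaintext:
print(Fernet(escrow_key).decrypt(stored))
```

In the first model there is simply nothing useful for an attacker (or a subpoena served on the provider) to seize; the proposal would effectively forbid firms from offering that design.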

Second, they are basically demanding that providers design their systems for breach. This is massively stupid from a security perspective. In the summer of 2004, still-unknown hackers exploited surveillance software built into one of Greece’s major cell networks to eavesdrop on high government officials, including the prime minister. The recent hack of Google, believed to have originated in China, may have used a law-enforcement portal to acquire information about dissidents. More recently, we learned of a Google engineer abusing his access to the system to spy on minors.

Third, this demand has implications beyond the United States. Networks designed for interception by U.S. authorities will also be more easily tapped by authoritarian governments looking to keep tabs on dissidents. And indeed, this proposal echoes demands from the likes of Saudi Arabia and the United Arab Emirates that their BlackBerry system be redesigned for easier interception. By joining that chorus, the U.S. makes it more difficult for firms to resist similar demands from unlovely regimes.

Finally, this demand highlights how American law enforcement and intel agencies have been circumventing reporting requirements designed to provide information on this very problem. As the Crypto Wars of the 90s drew to a close, Congress amended the Wiretap Act, which creates strong procedural protections when the government wants to use intrusive electronic surveillance, to add a requirement that agencies report each instance in which they’d encountered encryption.  The idea was to get an objective measure of how serious a problem this posed. The most recent report, however, cited only one instance in which encryption was encountered, out of 2,376 wiretap orders. Why, then, are we now being told encryption is a huge problem? Almost certainly because law enforcement and intelligence agencies aren’t using the Wiretap Act to intercept electronic communications—preferring, instead, to avail themselves of the far more lax standards—and spare reporting requirements—provided by the Stored Communications Act.  It’s always easier to claim you need sweeping new powers from Congress when you’ve managed to do an end-run around the provisions Congress put in place to keep itself informed about how you’re using your existing powers, after all.

State Secrets, Courts, and NSA’s Illegal Wiretapping

As Tim Lynch notes, Judge Vaughn Walker has ruled in favor of the now-defunct Al-Haramain Islamic Foundation—unique among the many litigants who have tried to challenge the Bush-era program of warrantless wiretapping by the National Security Agency because they actually had evidence, in the form of a document accidentally delivered to foundation lawyers by the government itself, that their personnel had been targeted for eavesdropping.

Other efforts to get a court to review the program’s legality had been caught in a kind of catch-22: Plaintiffs who merely feared that their calls might be subject to NSA filtering and interception lacked standing to sue, because they couldn’t show a specific, concrete injury resulting from the program.

But, of course, information about exactly who has been wiretapped is a closely guarded state secret. So closely guarded, in fact, that the Justice Department was able to force the return of the document that exposed the wiretapping of Al-Haramain, and then get it barred from the court’s consideration as a “secret” even after it had been disclosed. (Contrast, incidentally, the Supreme Court’s jurisprudence on individual privacy rights, which often denies any legitimate expectation of privacy in information once revealed to a third party.) Al-Haramain finally prevailed because they were ultimately able to assemble evidence from the public record showing they’d been wiretapped, and the government declined to produce anything resembling a warrant for that surveillance.

If you read over the actual opinion, however, it may seem a little anticlimactic—as though something is missing. The ruling concludes that there’s prima facie evidence that Al-Haramain and their lawyers were wiretapped, that the government has failed to produce a warrant, and that this violates the Foreign Intelligence Surveillance Act. But of course, there was never any question about that. Not even the most strident apologists for the NSA program denied that it contravened FISA; rather, they offered a series of rationalizations for why the president was entitled to disregard a federal statute.

There was the John Yoo argument that the president essentially becomes omnipotent during wartime, and that if we can shoot Taliban on a foreign battlefield, surely we can wiretap Americans at home if they seem vaguely Taliban-ish. Even under Bush, the Office of Legal Counsel soon backed away from such… creative… lines of argument. Instead, they relied on the post-9/11 Authorization for the Use of Military Force (AUMF) against al-Qaeda, claiming it had implicitly created a loophole in the FISA law. It was David Kris, now head of DOJ’s National Security Division, who most decisively blew that one out of the water, concluding that it was “essentially impossible” to sustain the government’s reading of the AUMF.

Yet you’ll note that none of these issues arise in Walker’s opinion, because the DOJ, in effect, refused to play. They resisted the court at every step, insisting that a program discussed at length on the front pages of newspapers for years now was so very secret that no aspect of it could be discussed even in a closed setting. They continued to insist on this in the face of repeated court rulings to the contrary. So while Al-Haramain has prevailed, there’s no ruling on the validity of any of those arguments. That’s why I think Marcy Wheeler is probably correct when she predicts that the government will simply take its lumps and pay damages rather than risk an appeal. For one, while the Obama administration has been happy to invoke state secrecy as vigorously as its predecessor, it would obviously be somewhat embarrassing for Obama’s DOJ to parrot Bush’s substantive claims of near-limitless executive power. Perhaps more to the point, though, some of those legal arguments may still be operative in secret OLC memos. The FISA Amendments Act aimed to put the unlawful Bush program under court supervision, and even reasserted FISA’s language establishing it as the “exclusive means” for electronic surveillance, which would seem to drive a final stake in the heart of any argument based on the AUMF. But we ultimately don’t know what legal rationales they still consider operative, and it would surely be awkward to have an appellate court knock the legs out from under some of these secret memoranda.

None of this is to deny that the ruling is a big deal—if nothing else because it suggests that the government does not enjoy total carte blanche to shield lawbreaking from review with broad, bald assertions of privilege. But I also know that civil libertarians had hoped the courts might offer the only remaining path to a fuller accounting of—and accountability for—the domestic spying program. If the upshot of this is simply that the government must pay a few tens of thousands, or even hundreds of thousands, of dollars in damages, it’s hard not to see the victory as something of a disappointment.

The Government Can Monitor Your Location All Day Every Day Without Implicating Your Fourth Amendment Rights

If you have a mobile phone, that’s the upshot of an argument being put forward by the government in a case being argued before the Third Circuit Court of Appeals tomorrow. The case is called In the Matter of the Application of the United States of America For An Order Directing A Provider of Electronic Communication Service To Disclose Records to the Government.

Declan McCullagh reports:

In that case, the Obama administration has argued that Americans enjoy no “reasonable expectation of privacy” in their—or at least their cell phones’—whereabouts. U.S. Department of Justice lawyers say that “a customer’s Fourth Amendment rights are not violated when the phone company reveals to the government its own records” that show where a mobile device placed and received calls.

The government can maintain this position because of the retrograde “third party doctrine.” That doctrine arose from a pair of cases in the 1970s in which the Supreme Court found no Fourth Amendment problem when the government required service providers to maintain records about their customers, and later required those service providers to hand the records over to the government.

I wrote about these cases, and the courts’ misunderstanding of privacy since 1967’s Katz decision, in an American University Law Review article titled “Reforming Fourth Amendment Privacy Doctrine”:

These holdings were never right, but they grow more wrong with each step forward in modern, connected living. Incredibly deep reservoirs of information are constantly collected by third-party service providers today. Cellular telephone networks pinpoint customers’ locations throughout the day through the movement of their phones. Internet service providers maintain copies of huge swaths of the information that crosses their networks, tied to customer identifiers. Search engines maintain logs of searches that can be correlated to specific computers and usually the individuals that use them. Payment systems record each instance of commerce, and the time and place it occurred. The totality of these records are very, very revealing of people’s lives. They are a window onto each individual’s spiritual nature, feelings, and intellect. They reflect each American’s beliefs, thoughts, emotions, and sensations. They ought to be protected, as they are the modern iteration of our “papers and effects.”

This is a case to watch, as it will help determine whether or not your digital life is an open book to government investigators.

Three Keys to Surveillance Success: Location, Location, Location

The invaluable Chris Soghoian has posted some illuminating—and sobering—information on the scope of surveillance being carried out with the assistance of telecommunications providers.  The entire panel discussion from this year’s ISS World surveillance conference is well worth listening to in full, but surely the most striking item is a direct quotation from Sprint’s head of electronic surveillance:

[M]y major concern is the volume of requests. We have a lot of things that are automated but that’s just scratching the surface. One of the things, like with our GPS tool. We turned it on the web interface for law enforcement about one year ago last month, and we just passed 8 million requests. So there is no way on earth my team could have handled 8 million requests from law enforcement, just for GPS alone. So the tool has just really caught on fire with law enforcement. They also love that it is extremely inexpensive to operate and easy, so, just the sheer volume of requests they anticipate us automating other features, and I just don’t know how we’ll handle the millions and millions of requests that are going to come in.

To be clear, that doesn’t mean they are giving law enforcement geolocation data on 8 million people. He’s talking about the wonderful automated backend Sprint runs for law enforcement, LSite, which allows investigators to rapidly retrieve information directly, without the burden of having to get a human being to respond to every specific request for data. Rather, Sprint says, each of those 8 million requests represents a time when an FBI computer or agent pulled up a target’s location data using their portal or API. (I don’t think you can tweet subpoenas yet.) For an investigation whose targets are under ongoing real-time surveillance over a period of weeks or months, that could very well add up to hundreds or thousands of requests for a few individuals. So those 8 million data requests, according to a Sprint representative in the comments, actually “only” represent “several thousand” discrete cases.
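
A quick back-of-the-envelope check shows how easily automated real-time tracking turns a few thousand targets into millions of queries. The polling interval and investigation length below are my own illustrative assumptions, not figures from Sprint or the FBI:

```python
# Back-of-the-envelope arithmetic with assumed (not reported) parameters.
total_requests = 8_000_000          # figure cited by Sprint's surveillance manager

polls_per_day = 24 * 60 // 15       # poll a target's location every 15 minutes -> 96/day
days_tracked = 60                   # a two-month investigation
polls_per_target = polls_per_day * days_tracked   # 5,760 queries per target

targets = total_requests / polls_per_target
print(f"{polls_per_target} queries per target implies roughly {targets:,.0f} targets")
# ~1,400 intensively tracked targets; mix in shorter or less frequent
# monitoring and "several thousand" discrete cases is entirely plausible.
```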

As Kevin Bankston argues, that’s not entirely comforting. The Justice Department, Soghoian points out, is badly delinquent in reporting on its use of pen/trap orders, which are generally used to track communications routing information like phone numbers and IP addresses, but are likely to be increasingly used for location tracking. And recent changes in the law may have made it easier for intelligence agencies to turn cell phones into tracking devices. In the criminal context, the legal process for getting geolocation information depends on a variety of things—different districts have come up with different standards, and it matters whether investigators want historical records about a subject or ongoing access to location info in real time. Some courts have ruled that a full-blown warrant is required in some circumstances; in others, a “hybrid” order consisting of a pen/trap order and a 2703(d) order has sufficed. But a passage from an Inspector General’s report suggests that the 2005 PATRIOT reauthorization may have made it easier to obtain location data:

After passage of the Reauthorization Act on March 9, 2006, combination orders became unnecessary for subscriber information and [REDACTED PHRASE]. Section 128 of the Reauthorization Act amended the FISA statute to authorize subscriber information to be provided in response to a pen register/trap and trace order. Therefore, combination orders for subscriber information were no longer necessary. In addition, OIPR determined that substantive amendments to the statute undermined the legal basis for which OIPR had received authorization [REDACTED PHRASE] from the FISA Court. Therefore, OIPR decided not to request [REDACTED PHRASE] pursuant to Section 215 until it re-briefed the issue for the FISA Court. As a result, in 2006 combination orders were submitted to the FISA Court only from January 1, 2006, through March 8, 2006.

The new statutory language permits FISA pen/traps to get more information than is allowed under a traditional criminal pen/trap, with a lower standard of review, including “any temporarily assigned network address or associated routing or transmission information.” Bear in mind that it would have made sense to rely on a 215 order only if the information sought was more extensive than what could be obtained using a National Security Letter, which requires no judicial approval. That makes it quite likely that it’s become legally easier to transform a cell phone into a tracking device even as providers are making it point-and-click simple to log into their servers and submit automated location queries. So it’s become much more urgent that the Justice Department start living up to its obligation to tell us how often it’s using these souped-up pen/traps, and how many people are affected. In congressional debates, pen/trap orders are invariably mischaracterized as minimally intrusive, providing little more than the list of times and phone numbers they produced 30 years ago. If they’re turning into a plug-and-play solution for lojacking the population, Americans ought to know about it.

If you’re interested enough in this stuff to have made it through that discussion, incidentally, come check out our debate at Cato this afternoon, either in the flesh or via webcast. There will be a simultaneous “tweetchat” hosted by the folks at Get FISA Right.

Fusion Centers

Most people don’t care about government surveillance – just so long as they are not affected by it. We want the police to be on the lookout for trouble – so some surveillance is necessary for the work they do. But how much?

After 9/11, state officials said they had difficulty “connecting all the dots.” Fusion centers are supposed to remedy that problem. Police departments around the country are creating databases (“fusion centers”), and the objective is to link them together so that police can spot patterns of behavior and thwart crimes or terrorist attacks before they happen.
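
As a purely hypothetical illustration of what “linking databases” means in practice (the data, identifiers, and matching rule below are invented for the sketch; no actual fusion-center system is being described), the core idea is just joining records from separate sources on a shared identifier and flagging the matches:

```python
# Toy sketch with invented data -- not a description of any real system.
dmv_records = {"ABC123": "Jane Doe"}                 # license plate -> registered owner

suspicious_activity_reports = [                      # filed by local police
    {"plate": "ABC123", "activity": "taking photos of a bridge"},
    {"plate": "XYZ789", "activity": "using binoculars"},
]

tips = {"Jane Doe": "anonymous tip, never verified"} # possibly false information

# "Fusion": join the sources on shared identifiers and flag co-occurrences.
for report in suspicious_activity_reports:
    owner = dmv_records.get(report["plate"])
    if owner and owner in tips:
        print(f"FLAG: {owner}: {report['activity']}; {tips[owner]}")

# Once the unverified tip is in a linked database, every later record about
# Jane Doe inherits its taint -- which is why the questions below about
# correction and deletion matter.
```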

The goal seems sensible and worthwhile, but as the details emerge on how fusion centers operate, the concept gets controversial fast. Who will be monitored? What kind of information will be collected? And who decides when pieces of information should be discarded or entered into a massive database? If false information about, say, YOU, goes into the database, will you ever learn about it? Have an opportunity to erase it or correct it?

Fusion centers are springing up all over the country, and they are coordinating the efforts of some 800,000 American law enforcement officers to collect information about anyone deemed suspicious. One problem is that terrorists are not of a monolithic character. Terrorists can be extremely religious or secular; they may be Arab, white, black or any other race; terrorists come from both rich and poor backgrounds; they come from the far right, the far left – and some are simply against society generally. And when ordinary criminals are added to the mix, the potential dragnet for this kind of government surveillance sweeps in an enormous number of people.

Behaviors that make someone eligible for government monitoring are quite broad. As noted by Bruce Fein in his testimony before Congress in April, citing a July 2008 ACLU report on fusion centers, such suspicious behaviors in one LAPD directive include “using binoculars,” “taking pictures or video footage with no apparent aesthetic value,” “drawing diagrams,” and “taking notes,” among others.

Former Vice President Dick Cheney might argue that the monitoring is not extensive enough. He recently said (pdf): “When just a single clue goes unlearned … can bring on a catastrophe – it’s no time for splitting differences. There is never a good time to compromise when the lives and safety of the American people are in the balance.” National security, it seems, requires that we get everyone into the central database for scrutiny. We can’t afford any “gaps” in the surveillance matrix.

I will be moderating a Cato event about fusion centers on Thursday, June 11, at noon.  The panel will include attorney Bruce Fein, the ACLU’s Mike German (who co-authored the report linked above), and Harvey Eisenberg, Chief of the National Security Section in the Maryland Division of the U.S. Attorney’s office.