Tag: privacy

How Not to Think About Drone Policy

Today, The Oklahoman published an editorial that serves as a good example of how not to think about drone policy. According to The Oklahoman editorial board, a proposed drone weaponization ban is a solution in search of a problem, and concerns about privacy rest on unjustifiable fears. This attitude ignores the current state of drone technology and the ways drones should prompt us to rethink privacy protections.

Weaponized drones are often thought of as tools of foreign policy, but technological advances mean that Americans should be keeping an eye out for armed drones on the home front. Yet in the pages of The Oklahoman, readers will find the following:

we know of no instance where Oklahoma law enforcement officers have used drones to shoot someone without justification. To ban the police from using weaponized drones appears a solution in search of a problem.

I’m not aware of police in Oklahoma using drones to shoot someone with justification either, but that’s beside my main point. Oklahoma lawmakers shouldn’t have to wait for a citizen to be shot by a weaponized drone before considering regulations. It would be premature for legislators to take up teleportation regulations or artificial intelligence citizenship bills. But weaponized drones are no longer confined to the imagination of science fiction writers. They’re here.

Continuing Resolution to Fund the National ID

If, as expected, Congress passes a continuing resolution in the coming weeks to fund the government into December, take note of how neatly our elected officials are sidestepping responsibility for government spending. The votes that should have come in the summer, ahead of the election, giving them some electoral salience, will happen in December, after you’ve made your “choice.”

But let’s home in on another way that the failed appropriations process undercuts fiscal rectitude and freedom. A “CR” will almost certainly continue funding for implementation of the REAL ID Act, the federal national ID program.

From 2008 to 2011, direct funding for REAL ID was included in the DHS appropriations bills, typically at the level of $50 million per fiscal year. That process was evidently too transparent, so since 2011 appropriators have folded REAL ID funding into the “State Homeland Security Grant Program” (SHSGP), a $400 million discretionary fund. Combined with other funds, the SHSGP gives DHS a nearly $700 million pool of money to tap in order to build a national ID.

Cities Seek Police Surveillance Transparency and Oversight

Today, legislative efforts began in eleven cities aimed at requiring police departments to be more transparent about the surveillance technology they use. The bills will also reportedly propose increased community control over the use of surveillance tools. These efforts, spearheaded by the ACLU and other civil liberties organizations, are important at a time when surveillance technology is improving and is sometimes used without the knowledge or approval of local officials or the public.

Many readers will be familiar with CCTV cameras and wiretap technology, which police use to investigate crimes and gather evidence. Yet there is a wide range of surveillance tools that are less well-known and will become more intrusive as technology advances.

Facial recognition software is already used by some police departments. As this technology improves, it will be easier for police to identify citizens, especially if it is used in conjunction with body cameras. But our faces are not our only biometric identifiers. Technology in the near future will make it easier to identify us by analyzing our gait, voice, irises, and ears.

Using the Income Tax to Map Our Lives

And it came to pass in those days, that there went out a decree that all the world should be taxed.

And lo, the ubiquity of taxation made it possible for the Treasury Department to identify all the same-sex marriages in the land by zip code and present the data in tables and a map.

And in all the land only a few paranoids worried about the implications for privacy and freedom, of gay people and others, of a government that knows everything about you.

Feinstein-Burr 2.0: The Crypto Backdoor Bill Is Still Alive

When it was first released back in April, a “discussion draft” of the Compliance With Court Orders Act sponsored by Sens. Dianne Feinstein (D-CA) and Richard Burr (R-NC) met with near-universal derision from privacy advocates and security experts. (Your humble author was among the critics.) In the wake of that chilly reception, press reports were declaring the bill effectively dead just weeks later, even as law enforcement and intelligence officials insisted they would continue pressing for a solution to the putative “going dark” problem that encryption creates for government eavesdroppers. Feinstein and Burr, however, appear not to have given up on their baby: their offices have been circulating a revised draft, which I’ve recently gotten hold of.

To protect my source’s anonymity, I won’t post the document itself, but it’s easy enough to summarize. The 2.0 version is mostly identical to the original version, with four main changes:

(1) Narrower scope

The original draft required a “covered entity” to render encrypted data “intelligible” to government agents bearing a court order if the data had been rendered unintelligible “by a feature, product, or service owned, controlled, created, or provided, by the covered entity or by a third party on behalf of the covered entity.” The new version deletes “owned,” “created,” and “provided”—so the primary mandate now applies only to a person or company that “controls” the encryption process.

(2) Limitation to law enforcement

The revised version eliminates section (B) under the bill’s definition of “court order,” which obligated recipients to comply with decryption orders issued for investigations related to “foreign intelligence, espionage, and terrorism.” The bill is now strictly about law enforcement investigations into a variety of serious crimes, including federal drug crimes and their state equivalents.

(3) Exclusion of critical infrastructure

The Weird World of Data (and Your Privacy)

In 2007, Judge Richard Posner found it “untenable” that attaching a tracking device to a car is a seizure. But in 2012 the Supreme Court held that warrantlessly attaching a GPS device to a car violates the Fourth Amendment. Putting a tracking device on a car makes use of it without the owner’s permission, and it deprives the owner of the right to exclude others from the car.

The weird world of data requires us to recognize seizures when government agents take any of our property rights, including the right to use and the right to exclude others. There’s more to property than the right to possession.

In an amicus brief filed with the U.S. Court of Appeals for the D.C. Circuit last week, we argued for Fourth Amendment protection of property rights in data. Recognition of such rights is essential if the protections of the Fourth Amendment are going to make it into the Information Age.

The case arises because the government seized data about the movements of a criminal suspect from his cell phone provider. The government argues that it can do so under the Stored Communications Act, which requires the government to provide “specific and articulable facts showing that there are reasonable grounds to believe that [data] are relevant and material to an ongoing criminal investigation.” That’s a lower standard than the probable cause standard of the Fourth Amendment.

As we all do, the defendant had a contract with his cell phone provider that required it to share data with others only based on “lawful” or “valid” legal processes. The better reading of that industry-standard contract language is that it gives telecom customers their full right to exclude others from data about them. If you want to take data about us that telecom companies hold for us under contract, you have to get a warrant.

Understanding U.S. v. Ackerman

The Supreme Court has eschewed the “reasonable expectation of privacy” test in its most important recent Fourth Amendment cases. It’s not certain that the trend away from the so-called “Katz test,” largely driven by Justice Scalia, will continue, and nobody knows what will replace it. But doctrinal shift is in the air. Courts are searching for new and better ways to administer the Fourth Amendment.

A good example is the Tenth Circuit’s decision last week in U.S. v. Ackerman. That court found that opening an email file was a Fourth Amendment “search,” both as a matter of reasonable-expectations doctrine and under the “distinct line of authority” that is emerging from the Supreme Court’s 2012 decision in U.S. v. Jones.

Here are the facts: AOL scans outgoing emails for child pornography by comparing hashes of files sent through its network to hashes of known child pornography images. When it becomes aware of such an image, it is required by law to report it to the National Center for Missing and Exploited Children. NCMEC is a governmental entity and agent. (That point takes up the bulk of the decision; Congress has made huge grants of governmental power to the organization.) NCMEC opened the file without a warrant.
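That hash-matching step is simple to picture. The sketch below, which assumes nothing about AOL’s actual systems, shows the general idea in Python: compute a fixed-length fingerprint of a file and check it against a list of fingerprints of known images. In practice, providers reportedly use perceptual hashes such as PhotoDNA rather than the exact-match SHA-256 used here, so that minor alterations to an image still match; the hash value and file name below are purely hypothetical.

import hashlib

# Hypothetical set of SHA-256 digests of known images (real hash lists
# come from NCMEC and typically use perceptual hashing, not SHA-256).
KNOWN_HASHES = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00f1e2d3c4b5a69788796a5b4c3d2e1f0",
}

def file_sha256(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_image(path: str) -> bool:
    """True if the file's hash appears in the known-hash list."""
    return file_sha256(path) in KNOWN_HASHES

# Example: flag a (hypothetical) attachment before it leaves the network.
if matches_known_image("attachment.jpg"):
    print("match found -- would be reported")

The appeal of this technique for a provider is that a file can be flagged without any person viewing its contents; the question of who actually opens the flagged file, and on what legal authority, is what the Ackerman decision turns on.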