Tag: Gmail

E-Mail Privacy Laws Don’t Actually Protect Modern E-mail, Court Rules

In case further proof were needed that we’re long overdue for an update of our digital privacy laws, the South Carolina Supreme Court has just ruled that e-mails stored remotely by a provider like Yahoo! or Gmail are not communications in “electronic storage” for the purposes of the Stored Communications Act, and therefore not entitled to the heightened protections of that statute.

There are, fortunately, other statutes barring unauthorized access to people’s accounts, and one appellate court has ruled that e-mail is at least sometimes protected from government intrusion by the Fourth Amendment, independently of what any statute says. But given the variety of electronic communication services that exist in 2012, nobody should feel too confident that the courts will be prepared to generalize that logic. It is depressingly easy, for example, to imagine a court ruling that users of a service like Gmail, whose letters will be scanned by Google’s computers to automatically deliver tailored advertisements, have therefore waived the “reasonable expectation of privacy” that confers Fourth Amendment protection. Indeed, the Justice Department has consistently opposed proposals to clearly require a warrant for scrutinizing electronic communications, arguing that it should often be able to snoop through citizens’ digital correspondence based on a mere subpoena or a showing of “relevance” to a court.

The critical passage at issue in this case—which involves private rather than governmental snooping—is the definition of “electronic storage,” which covers “temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof” as well as “any storage of such communication by an electronic communication service for the purposes of backup protection of such communication.” The justices all agreed that the e-mails were not in “temporary, intermediate” storage because the legitimate recipient had already read them. They also agreed—though for a variety of reasons—that the e-mails were not in “backup” storage.

Some took this view on the grounds that storage “by an electronic communication service for the purposes of backup protection” encompasses only separate backups created by the provider for its own purposes, and not copies merely left remotely stored in the user’s inbox. This strikes me as a somewhat artificial distinction: why does the provider create backups? Well, to ensure that it can make the data available to the end user in the event of a crash. The copy is kept for the user’s ultimate benefit either way. One apparent consequence of this view is that it would make a big difference if read e-mails were automatically “deleted” and moved to a “backup” folder, even though this would be an essentially cosmetic alteration to the interface.

Others argued that a “backup” presumed the existence of another, primary copy and noted there was no evidence the user had “downloaded” and retained copies of the e-mails in question. This view rests on a simple technical confusion. If you have read your Gmail on your home computer or mobile device, then of course a copy of that e-mail has been downloaded to your device—otherwise you couldn’t be reading it. This is obscured by the way we usually talk: we say we’re reading something “on Google’s website”—as though we’ve somehow traveled on the Web to visit another location where we’re viewing the information. But this is, of course, just a figure of speech: what you’re actually reading is a copy of the data from the remote server, now residing on your own device. Moreover, it can’t be necessary for the user to retain that copy, since that would rather defeat the purpose of making a “backup,” which is to guarantee that you still have access to your data after it has been deleted from your main device! The only time you actually need a backup is when you don’t still retain a copy of the data elsewhere.

Still, this isn’t really the court’s fault. Whether or not this interpretation makes sense, it at least arguably does reflect what Congress intended when the Stored Communications Act was passed back in 1986, when there was no such thing as Webmail, when storage space was expensive, and when everyone assumed e-mail would generally vanish from the user’s remote inbox upon download. The real problem is that we’ve got electronic privacy laws that date to 1986 and, as a result, make all sorts of distinctions that are nonsensical in the modern context of routine cloud storage. Legislation to drag the statute into the 21st century has been introduced, but alas, there’s little indication Congress is in much of a rush to get it passed.

Want Privacy Choice? Papa’s Gonna Give You One

I was interested by the title of a paper called “Behavioral Advertising: The Offer You Cannot Refuse” by a small coterie of privacy activist/researchers. I love the Godfather movies, in which the statement, “I’m going to make him an offer he can’t refuse,” is a coolly tuxedoed plan to threaten someone with violence or death. I don’t love the paper’s attempt to show that government “interventions” are superior to markets in terms of freedom.

Behavioral advertisers are no mafiosi. They are not in the business of illegal coercion. They’re not in any kind of illegal business, in fact. The choice of title suggests that the authors may be biased toward making targeted advertising illegal. (The lead author argued in 2004 that Gmail should be shut down as a violation of California law.)

What was most interesting, though, was the paper’s unspoken battle with lock-in, or path dependence. That’s the idea in technology development that a given state of affairs perpetuates itself due to the costs of changing course.

The QWERTY keyboard is a famous example of lock-in. The story with QWERTY is that keys on early mechanical typewriters were arranged so that commonly used letters wouldn’t strike one another and jam together. The result was an inefficient arrangement of keys for the fingers, but it’s an arrangement that has stuck.

It has stuck because of switching costs. Everybody who knows how to type knows how to type on a QWERTY keyboard. If you wanted to change to a more efficient keyboard, you’d have to change every keyboard and everyone’s training. That’s a huge cost to pay in exchange for a modest increase in efficiency. So we’ve got QWERTY.

For about as long as I can recall, Web browsers have been designed to store information delivered by Web sites and to return it to those sites. Cookies are the best known form of this: tiny files that allow a Web site to recognize the browser (inferentially, the user) and deliver custom content. There are also “Flash cookies,” more accurately called “local shared objects,” which can store information about users’ preferences, such as volume settings for Internet videos. Flash cookies can also be used to store unique identifiers to use in tracking. These things provide value to Internet users, and most Web sites make use of them to deliver better content.
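The mechanism is simple enough to sketch. Here is a minimal illustration using Python’s standard `http.cookies` module, with the cookie name `uid` and its value invented purely for illustration: the site issues an identifier on the first visit, and the browser echoes it back on every later visit, letting the site recognize the same browser.

```python
from http.cookies import SimpleCookie

# First visit: the server issues an identifier for this browser.
# The "uid" name and its value are hypothetical, for illustration only.
cookie = SimpleCookie()
cookie["uid"] = "abc123"
cookie["uid"]["max-age"] = 86400  # ask the browser to keep it for a day

# This becomes the Set-Cookie header sent with the response.
print(cookie.output())  # Set-Cookie: uid=abc123; Max-Age=86400

# Later visit: the browser sends the cookie back, and the server
# parses it to recognize the same browser (inferentially, the user).
returned = SimpleCookie()
returned.load("uid=abc123")
print(returned["uid"].value)  # abc123
```

Nothing in this exchange identifies the user by name; it only links successive visits together, which is exactly what makes it useful both for remembering preferences and for tracking.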

The authors of the paper don’t like that. The path of Web browsing technology is not privacy protective, and they would call on regulators to fix that with a pair of interventions: preventing Flash cookies from “respawning” cookies (that is, recreating them when they have been deleted) and regulation of consumer-data markets to prevent marketers from learning information about consumers. This would uphold consumer choice, they argue. And they argue dubiously that their work “inverts the assumption that privacy interventions are paternalistic while market approaches promote freedom.”

Now, ask yourself: If the government came in and required everyone to train for and use the more efficient Dvorak keyboard, would that be a paternalistic step? The end result would be more efficient typing.

Of course it would be paternalistic.

So let’s be frank. This is an argument for paternalistic intervention, attempting to allay the authors’ concern about what a favorite technology of Internet users is doing to privacy.

And it is the authors’ privacy concerns, not Internet users’ at large. Opinion surveys in the privacy area are notorious for revealing that consumers will state a preference for privacy no matter what their true interests are.

The good news is that there is far less lock-in in the Internet browser area than in keyboards. Technologists can and do build browser modifications that prevent tracking of the type this article is concerned with.

Their real problem is that few people actually care as much as the authors do about whether or not they receive tailored advertisements. Few people want to use a browser that is essentially crippled to gain a sliver of privacy protection to which they are indifferent.

Paternalist? It sure is. And unlike a paternally driven switch to a better keyboard, this policy wouldn’t obviously make consumers better off.