
What They Know Is Interesting—But What Are You Going to Do About It?

The Wall Street Journal has stirred up a discussion of online privacy with its “What They Know” series of reports. These reports reveal again the existence and some workings of the information economy behind the Internet and World Wide Web. (All that content didn’t put itself there, y’know!)

The discussion centers around “tracking” of web users, particularly through the use of “cookies.” Cookies are little strings of text that web sites offer your browser when you visit. If your browser accepts the cookie, it stores the text and sends it back to that domain the next time you visit.

Often cookies have distinct strings of characters in them, so the site can recognize you. Sites use cookies to customize your experience. If you voted on a poll, for example, a cookie will cause the site to tell you how you voted. Cookies enable the “shopping cart” function in online stores.
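The round trip described above can be sketched with Python's standard `http.cookies` module. The cookie name `visitor_id` and the use of a random hex string as the distinct identifier are illustrative assumptions, not any particular site's practice:

```python
# Minimal sketch of the cookie round trip, using only the standard library.
# The cookie name "visitor_id" and the random ID are invented for illustration.
from http.cookies import SimpleCookie
import uuid

# First visit: the server mints a distinct string of characters and offers
# it to the browser in a Set-Cookie header.
server_cookie = SimpleCookie()
server_cookie["visitor_id"] = uuid.uuid4().hex
set_cookie_header = server_cookie["visitor_id"].OutputString()

# Later visit: the browser sends the stored text back in a Cookie header,
# and the server parses it to recognize the returning user.
browser_sends = f"visitor_id={server_cookie['visitor_id'].value}"
parsed = SimpleCookie()
parsed.load(browser_sends)

print(parsed["visitor_id"].value == server_cookie["visitor_id"].value)  # True
```

That recognition step is all a poll, a shopping cart, or an ad network needs to tell one returning browser from another.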

Advertising networks use cookies to gather information about web surfers. Ads are embedded on the main sites people visit, just like the video above and the Amazon Kindle widget in the column on the right. They’re served by different servers than most of the content on the page. Embedded content acts as a sort of “third party” to the main transaction between web surfers and the sites they visit. Embedded content can offer cookies just like main sites do—they’re known as “third-party cookies.”

A network that has ads on a lot of sites will recognize a browser (and by inference the person using it) when it goes to different web sites, enabling the ad network to get a sense of that person’s interests. Been on a site dealing with SUVs? You just might see an SUV ad as you continue to surf.
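A hypothetical sketch of that correlation, with invented site names and cookie IDs: because the same third-party cookie comes back from every page that embeds the network's ads, the network can build an interest profile without ever learning who you are.

```python
# Hypothetical sketch of cross-site correlation by an ad network.
# All cookie IDs, site names, and the "interest" rule are invented
# for illustration; no real network's logic is shown here.
from collections import defaultdict

profiles = defaultdict(list)  # cookie ID -> pages where the ad was served

def log_ad_request(cookie_id, referring_page):
    """Record which page embedded the network's ad for this browser."""
    profiles[cookie_id].append(referring_page)

# The same browser (same third-party cookie) surfs three different sites.
log_ad_request("abc123", "suv-reviews.example/comparison")
log_ad_request("abc123", "news.example/politics")
log_ad_request("abc123", "suv-reviews.example/towing-guide")

# The network infers an interest and targets an ad accordingly.
suv_pages = [p for p in profiles["abc123"] if "suv" in p]
if len(suv_pages) >= 2:
    print("serve SUV ad")  # prints "serve SUV ad"
```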

This is important to note: Most web sites and ad networks do not “sell” information about their users. In targeted online advertising, the business model is to sell space to advertisers—giving them access to people (“eyeballs”) based on their demographics and interests. It is not to sell individuals’ personal and contact info. Doing the latter would undercut the advertising business model and the profitability of the web sites carrying the advertising.

Some people don’t like this tracking. I think some feel it undignified to be a mere object of impersonal commerce (see Seger, Bob). Some worry that data about their interests will be used to discriminate wrongly against them, or to exclude them from information and opportunities they should enjoy. Excess customization of the web experience may stratify society, some believe. Tied to real identities, this data could fall into the hands of government and be used wrongly. These are all legitimate concerns, and I share some of them more, and some less, than others.

One I understand but dislike is the offense some people take at cookies for their “surreptitious” use. How many decades must cookies be integral to web browsing, and how many waves of public debate must there be about cookies before they lose their surreptitious cast? Cookies are just as surreptitious as photons and sound waves, which silently and invisibly carry data about you to anyone in the vicinity. We’d all be in a pretty tough spot without them.

Though cookies—and debate about their privacy consequences—have been around for a long time, many people don’t know even the basics I laid out above. They also don’t know that cookies are within the control of every web user.

As I testified to the Senate Commerce Committee last week, in the major browsers (Firefox and Internet Explorer), one must simply go to the “Tools” pull-down menu, select “Options,” then click on the “Privacy” tab to customize one’s cookie settings. In Firefox, one can decline to accept all third-party cookies, neutering the cookie-based data collection done by ad networks. In Internet Explorer, one can block all cookies, block all third-party cookies, or even choose to be prompted each time a cookie is offered.

Yes, new technologies make cookie control an imperfect protection against tracking, but that does not excuse consumers from the responsibility to exercise privacy self-help that will get at the bulk of the problem.

Some legislators, privacy advocates, and technologists want very badly to protect consumers, but much of what is called “consumer protection” actually functions as an invitation for consumers to cede personal responsibility. People rise or fall to meet expectations, and consumer advocates who assume incompetence on the part of the public may have a hand in producing it, making consumers worse off.

If a central authority such as Congress or the Federal Trade Commission were to decide for consumers how to deal with cookies, it would generalize wrongly about many, if not most, individuals’ interests, giving them the wrong mix of privacy and interactivity, for example. And it would leave consumers unprotected from threats beyond its jurisdiction (e.g., web tracking by sites outside the United States). Education is the hard way, and it is the only way, to get consumers’ privacy interests balanced with their other interests.

But perhaps this is a government vs. corporate passion play, with government as the privacy defender (… oh, nevermind). One article in the WSJ series has interacted with lasting anti-Microsoft sentiment to produce interpretations that business interests are working to undercut consumer privacy. Engineers working on a new version of Microsoft’s Internet Explorer browser thought they might set certain defaults to protect privacy better, but they were overruled when the business segments at Microsoft learned of the plan. Privacy “sabotage,” the Electronic Frontier Foundation called it. And a Wired news story says Microsoft “crippled” online privacy protections.

But if the engineers’ plan had won the day, an equal and opposite reaction would have resulted when Microsoft “sabotaged” web interactivity and the advertising business model, “crippling” consumer access to free content. The new version of Microsoft’s browser maintained the status quo in cookie functionality, as do Google’s Chrome browser and Firefox, a product of non-profit privacy “saboteur” the Mozilla Foundation. The “business attacks privacy” story doesn’t wash.

This is not to say that businesses don’t want personal information—they do, so they can provide maximal service to their customers. But they are struggling to figure out how to serve all dimensions of consumer interest including the internally inconsistent consumer demand for privacy along with free content, custom web experiences, convenience, and so on.

Only one thing is certain here: Nobody knows how this is supposed to come out. Cookies and other tracking technologies will create legitimate concerns that weigh against the benefits they provide. Browser defaults may converge on something more privacy protective. (Apple’s Safari browser rejects third-party cookies unless users tell it to do otherwise.) Browser plug-ins will augment consumers’ power to control cookies and other tracking technologies. Consumers will grow more accustomed to the information economy, and they will choose more articulately how they fit into it.

What matters is that the conversation should continue. If you’ve read this far, you’re better equipped to participate in it, and to take responsibility for your own privacy.

Do so.

Nor Does Tech Get D.C.

Politico has a pretty thorough article on D.C.’s thorough ignorance of things tech.

Take a 2008 hearing before the Senate Commerce Committee about privacy and online behavior-based advertising. The discussion seemed to fall apart when Sens. Tom Carper (D-Del.), Bill Nelson (D-Fla.) and others seemed not to understand the term “cookies.”

Cookies. That’s the (utterly rudimentary) technology that was an issue a decade ago. Washington, D.C. naturally overreacted, but luckily only harmed itself. The White House recently revamped the cookie policy for federal government web sites.

It’s worth noting Tech’s thorough misapprehension of Washington, D.C. as well. Judging by how they act, most tech executives have all the insight they could pick up from Schoolhouse Rock. It seems cool and helpful to come to Washington and give money, so they do, encouraging the bears to rip open their cars looking for peanut butter.

Picture Don Draper Stamping on a Human Face, Forever

Last week, a coalition of 10 privacy and consumer groups sent letters to Congress advocating legislation to regulate behavioral tracking and advertising, a phrase that actually describes a broad range of practices used by online marketers to monitor and profile Web users for the purpose of delivering targeted ads. While several friends at the Tech Liberation Front have already weighed in on the proposal in broad terms – in a nutshell: they don’t like it – I think it’s worth taking a look at some of the specific concerns raised and remedies proposed. Some of the former strike me as being more serious than the TLF folks allow, but many of the latter seem conspicuously ill-tailored to their ends.

First, while it’s certainly true that there are privacy advocates who seem incapable of grasping that not all rational people place an equally high premium on anonymity, it strikes me as unduly dismissive to suggest, as Berin Szoka does, that it’s inherently elitist or condescending to question whether most users are making informed choices about their privacy. If you’re a reasonably tech-savvy reader, you probably know something about conventional browser cookies, how they can be used by advertisers to create a trail of your travels across the Internet, and how you can limit this.  But how much do you know about Flash cookies? Did you know about the old CSS hack I can use to infer the contents of your browser history even without tracking cookies? And that’s without getting really tricksy. If you knew all those things, congratulations, you’re an enormous geek too – but normal people don’t.  And indeed, polls suggest that people generally hold a variety of false beliefs about common online commercial privacy practices.  Proof, you might say, that people just don’t care that much about privacy or they’d be attending more scrupulously to Web privacy policies – except this turns out to impose a significant economic cost in itself.

The truth is, if we were dealing with a frictionless Coasean market of fully-informed users, regulation would not be necessary, but it would not be especially harmful either, because users who currently allow themselves to be tracked would all gladly opt in. In the real world, though, behavioral economics suggests that defaults matter quite a lot: Making informed privacy choices can be costly, and while an opt-out regime will probably yield tracking of some who would prefer not to be under conditions of full information and frictionless choice, an opt-in regime will likely prevent tracking of folks who don’t object to tracking. And preventing that tracking also has real social costs, as Berin and Adam Thierer have taken pains to point out. In particular, it merits emphasis that behavioral advertising is regarded by many as providing a viable business model for online journalism, where contextual advertising tends not to work very well: There aren’t a lot of obvious products to tie in to an important investigative story about municipal corruption. Either way, though, the outcome is shaped by the default rule about the level of monitoring users are presumed to consent to. So which set of defaults ought we to prefer?

Here’s why I still come down mostly on Adam and Berin’s side, and against many of the regulatory remedies proposed. At the risk of stating the obvious, users start with de facto control of their data. Slightly less obvious: While users will tend to have heterogeneous privacy preferences – that’s why setting defaults either way is tricky – individual users will often have fairly homogeneous preferences across many different sites. Now, it seems to be an implicit premise of the argument for regulation that the friction involved in making lots of individual site-by-site choices about privacy will yield oversharing. But the same logic cuts in both directions: Transactional friction can block efficient departures from a high-privacy default as well. Even a default that optimally reflects the median user’s preferences or reasonable expectations is going to flub it for the outliers. If the variance in preferences is substantial, and if different defaults entail different levels of transactional friction, nailing the default is going to be less important than choosing the rule that keeps friction lowest. Given that most people do most of their Web surfing on a relatively small number of machines, this makes the browser a much more attractive locus of control. In terms of a practical effect on privacy, the coalition members would probably achieve more by persuading Mozilla to ship Firefox with third-party cookies rejected out of the box than from any legislation they’re likely to get – and indeed, it would probably have a more devastating effect on the behavioral ad market. Less bluntly, browsers could include a startup option that asks users whether they want to import an exclusion list maintained by their favorite force for good.

On the model proposed by the coalition, individuals have to make affirmative decisions about what data collection to permit for each Web site or ad network at least once every three months, and maybe each time they clear their cookies. If you think almost everyone would, if fully informed, opt out of such collection, this might make sense. But if you take the social benefits of behavioral targeting seriously, this scheme seems likely to block a lot of efficient sharing. Browser-based controls can still be a bit much for the novice user to grapple with, but programmers seem to be getting better and better at making it easier and more automatic for users to set privacy-protective defaults. If the problem with the unregulated market is supposed to be excessive transaction costs, it seems strange to lock in a model that keeps those costs high even as browser developers are finding ways to streamline that process. It’s also worth considering whether such rules wouldn’t have the perverse consequence of encouraging consolidation across behavioral trackers. The higher the bar is set for consent to monitoring, the more that consent effectively becomes a network good, which may encourage concentration of data in a small number of large trackers – not, presumably, the result privacy advocates are looking for. Finally – and for me this may be the dispositive point – it’s worth remembering that while American law is constrained by national borders, the Internet is not. And it seems to me that there’s a very real danger of giving the least savvy users a false sense of security – the government is on the job guarding my privacy! no need to bother learning about cookies! – when they may routinely and unwittingly be interacting with sites beyond the reach of domestic regulations.

There are similar practical difficulties with the proposal that users be granted a right of access to behavioral tracking data about them.  Here’s the dilemma: Any requirement that trackers make such data available to users is a potential security breach, which increases the chances of sensitive data falling into the wrong hands. I may trust a site or ad network to store this information for the purpose of serving me ads and providing me with free services, but I certainly don’t want anyone who sends them an e-mail with my IP address to have access to it. The obvious solution is for them to have procedures for verifying the identity of each tracked user – but this would appear to require that they store still more information about me in order to render tracking data personally identifiable and verifiable. A few ways of managing the difficulty spring to mind, but most defer rather than resolve the problem, and add further points of potential breach.

That doesn’t mean there’s no place for government or policy change here, but it’s not always the one the coalition endorses. Let’s look  more closely at some of their specific concerns and see which, if any, are well-suited to policy remedies. Only one really has anything to do with behavioral advertising, and it’s easily the weakest of the bunch. The groups worry that targeted ads – for payday loans, sub-prime mortgages, or snake-oil remedies – could be used to “take advantage of vulnerable consumers.” It’s not clear that this is really a special problem with behavioral ads, however: Similar targeting could surely be accomplished by means of contextual ads, which are delivered via relevant sites, pages, or search terms rather than depending on the personal characteristics or browsing history of the viewer – yet the groups explicitly aver that no new regulation is appropriate for contextual advertising. In any event, since whatever problem exists here is a problem with ads, the appropriate remedy is to focus on deceptive or fraudulent ads, not the particular means of delivery. We already, quite properly, have rules covering dishonest advertising practices.

The same sort of reply works for some of the other concerns, which are all linked in some more specific way to the collection, dissemination, and non-advertising use of information about people and their Web browsing habits. The groups worry, for instance, about “redlining” – the restriction or denial of access to goods, services, loans, or jobs on the basis of traits linked to race, gender, sexual orientation, or some other suspect classification. But as Steve Jobs might say, we’ve got an app for that: It’s already illegal to turn down a loan application on the grounds that the applicant is African American. There’s no special exemption for the case where the applicant’s race was inferred from a Doubleclick profile. But this actually appears to be something of a redlining herring, so to speak: When you get down into the weeds, the actual proposal is to bar any use of data collected for “any credit, employment, insurance, or governmental purpose or for redlining.” This seems excessively broad; it should suffice to say that a targeter “cannot use or disclose information about an individual in a manner that is inconsistent with its published notice.”

Particular methods of tracking may also be covered by current law, and I find it unfortunate that the coalition letter lumps together so many different practices under the catch-all heading of “behavioral tracking.” Most behavioral tracking is either done directly by sites users interact with – as when Amazon uses records of my past purchases to recommend new products I might like – or by third party companies whose ads place browser cookies on user computers. Recently, though, some Internet Service Providers have drawn fire for proposals to use Deep Packet Inspection to provide information about their users’ behavior to advertising partners – proposals thus far scuppered by a combination of user backlash and congressional grumbling. There is at least a colorable argument to be made that this practice would already run afoul of the Electronic Communications Privacy Act, which places strict limits on the circumstances under which telecom providers may intercept or share information about the contents of user communications without explicit permission. ECPA is already seriously overdue for an update, and some clarification on this point would be welcome. If users do wish to consent to such monitoring, that should be their right, but it should not be by means of a blanket authorization in eight-point type on page 27 of a terms-of-service agreement.

Similarly welcome would be some clarification on the status of such behavioral profiles when the government comes calling. It’s an unfortunate legacy of some technologically atavistic Supreme Court rulings that we enjoy very little Fourth Amendment protection against government seizure of private records held by third parties – the dubious rationale being that we lose our “reasonable expectation of privacy” in information we’ve already disclosed to others outside a circle of intimates. While ECPA seeks to restore some protection of that data by statute, we’ve made it increasingly easy in recent years for the government to seek “business records” by administrative subpoena rather than court order. It should not be possible to circumvent ECPA’s protections by acquiring, for instance, records of keyword-sensitive ads served on a user’s Web-based e-mail.

All that said, some of the proposals offered up seem, while perhaps not urgent, less problematic. Requiring some prominent link to a plain-English description of how information is collected and used constitutes a minimal burden on trackers – responsible sites already maintain prominent links to privacy policies anyway – and serves the goal of empowering users to make more informed decisions. I’m also warily sympathetic to the idea of giving privacy policies more enforcement teeth – the wariness stemming from a fear of incentivizing frivolous litigation. Still, the status quo is that sites and ad networks profitably elicit information from users on the basis of stated privacy practices, but often aren’t directly liable to consumers if they flout those promises, unless the consumer can show that the breach of trust resulted in some kind of monetary loss.

Finally, a quick note about one element of the coalition recommendations that neither they nor their opponents seem to have discussed much – the insistence that there be no federal preemption of state privacy law. I assume what’s going on here is that the privacy advocates expect some states to be more protective of privacy than Congress or the FTC would be, and want to encourage that, while libertarians are more concerned with keeping the federal government from getting involved at all. But really, if there’s an issue that was made for federal preemption, this is it. A country where vendors, advertisers, and consumers on a borderless Internet have to navigate 50 flavors of privacy rules to sell a banner ad or an iTunes track does not sound particularly conducive to privacy, commerce, or informed consumer choice.

All Hail the Demise of a Bad Policy!

Well, not actually. Instead, the Washington Post’s headline says “U.S. Web-Tracking Plan Stirs Privacy Fears.” The story is about the reversal of an ill-conceived policy adopted nine years ago to limit the use of cookies on federal Web sites.

A cookie is a short string of text that a server sends a browser when the browser accesses a Web page. Cookies allow servers to recognize returning users so they can serve up customized, relevant content, including tailored ads. Think of a cookie as an eyeball - who do you want to be able to see that you visited a Web site?
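At the HTTP level, that exchange is just a pair of headers. The values below are made up for illustration:

```python
# What the exchange looks like on the wire (values invented for illustration).
# The server's first response offers the cookie...
first_response_header = "Set-Cookie: session=k3j9x2; Path=/"

# ...and the browser returns the stored name=value pair on its next request,
# which is how the server recognizes the returning visitor.
next_request_header = "Cookie: session=k3j9x2"

# Strip the header name and the attributes to recover the name=value pair.
name_value = first_response_header.split(": ", 1)[1].split(";")[0]
print(name_value)  # session=k3j9x2
```

Declining the cookie simply means the browser never sends that second header, so the server has no way to connect the two visits.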

Your browser lets you control what happens with the cookies offered by the sites you visit. You can issue a blanket refusal of all cookies, you can accept all cookies, and you can decide which cookies to accept based on who is offering them. Here’s how:

  • Internet Explorer: Tools > Internet Options > “Privacy” tab > “Advanced” button: Select “Override automatic cookie handling” and choose among the options, then hit “OK,” and next “Apply.”

I recommend accepting first-party cookies - offered by the sites you visit - and blocking third-party cookies - offered by the content embedded in those sites, like ad networks. Or ask to be prompted about third-party cookies just to see how many there are on the sites you visit. If you want to block or allow specific sites, select the “Sites” button to do so. If you selected “Prompt” in cookie handling, your choices will populate the “Sites” list.

  • Firefox: Tools > Options > “Privacy” tab: In the “cookies” box, choose among the options, then hit “OK.”

I recommend checking “Accept cookies from sites” and leaving unchecked “Accept third party cookies.” Click the “Exceptions” button to give site-by-site instructions.

Because you can control cookies, a government regulation restricting cookies is needless nannying. It may marginally protect you from government tracking - they have plenty of other methods, both legitimate and illegitimate - but it won’t protect you from tracking by others, including entities who may share data with the government.

The answer to the cookie problem is personal responsibility. Did you skip over the instructions above? The nation’s cookie problem is your fault.

If society lacks awareness of cookies, Microsoft (Internet Explorer), the Mozilla Foundation (Firefox), and producers of other browsers (Apple/Safari, Google/Chrome) might consider building cookie education into new browser downloads and updates. Perhaps they should set privacy-protective defaults. That’s all up to the community of Internet users, publishers, and programmers to decide, using their influence in the marketplace.

Artificially restricting cookies on federal Web sites needlessly hamstrings federal Web sites. When the policy was instituted it threatened to set a precedent for broader regulation of cookie use on the Web. Hopefully, the debate about whether to regulate cookies is over, but further ‘Net nannying is a constant offering of the federal government (and other elitists).

By moving away from the stultifying limitation on federal cookies, the federal government acknowledges that American grown-ups can and should look out for their own privacy.