Beacon Lessons


Last December's controversy over Facebook's Beacon advertising program has triggered discussion about whether online services are going too far in their use of customer information. The introduction of Beacon is now widely agreed to have been a misstep on Facebook's part. But it's important to keep in mind that experimentation is essential to technological progress. Our current intuitions about which types of information sharing respect users' privacy and which invade it are highly sensitive to implementation details and may also be driven by misleading analogies to older technologies. Only by experimenting with new features, informing their customers of what they're doing, and listening to customer feedback will companies learn how to think about privacy in the fast-changing online world.

Indeed, not all online programs that got negative initial customer reactions have turned out to be bad ideas. Cutting-edge online offerings often push users outside of their comfort zones, but users sometimes discover that a service is more useful and less frightening than they expected. This suggests that it would be a mistake to cement our current intuitions about privacy into prophylactic privacy regulations. Policymakers should give online firms like Facebook maximum flexibility to use customer information in innovative ways.

Several examples in recent Internet history illustrate the trade-offs inherent in privacy. Real-world experience can soften user opposition to new online offerings. Consider cookies, the small files a website can deposit on your hard drive. In 1997, one privacy advocate told Time magazine: "Cookies represent a way of watching consumers without their consent, and that is a fairly frightening phenomenon."

Yet in recent years, cookies have become ubiquitous online, and user opposition to them has dissipated. One reason is that as the web has become more familiar, the concept of websites "watching you" no longer seems as sinister. Cookies spare users the hassle of providing their usernames and passwords to a site on every visit. Some of the more outrageous claims, such as the idea that cookies collect information about your browsing habits for third parties, are untrue. And browsers have been updated to give users much more fine-grained control over which websites can set cookies on their computers.
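The mechanics behind that login convenience are straightforward. As a rough illustration only (the endpoint, handler name, and in-memory session store below are hypothetical, not any particular site's implementation), a server can set a session cookie in its response and recognize the browser when the cookie comes back on later visits:

```python
# Minimal sketch, using only Python's standard library, of how a session
# cookie lets a site recognize a returning visitor without another login.
# The SESSIONS dict and the hard-coded "alice" stand in for a real login flow.
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.cookies import SimpleCookie
import uuid

SESSIONS = {}  # session id -> username (illustrative in-memory store)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookie = SimpleCookie(self.headers.get("Cookie", ""))
        session = cookie.get("session_id")
        if session and session.value in SESSIONS:
            # Returning visitor: the browser sent the cookie back, so the
            # site knows who this is without asking for credentials again.
            body = f"Welcome back, {SESSIONS[session.value]}"
            self.send_response(200)
        else:
            # First visit: create a session and ask the browser to store it.
            new_id = uuid.uuid4().hex
            SESSIONS[new_id] = "alice"  # placeholder for an actual login step
            body = "Hello, new visitor"
            self.send_response(200)
            self.send_header("Set-Cookie", f"session_id={new_id}; HttpOnly")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```

The browser-side controls mentioned above simply govern whether that Set-Cookie header is honored at all, and for which sites.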

Another privacy flap that blew over quickly was the 2004 introduction of Gmail. Unlike competing sites that relied on generic banner ads, Google customized the ads it displayed based on the content of users' emails. This generated a lot of controversy. Some people didn't like the idea of Google's servers "scanning" people's email. A California state senator even proposed legislation to block the service, and the World Privacy Forum and more than two dozen other privacy and civil liberties groups urged Google to suspend Gmail.

Yet despite the concerns of privacy advocates, users beat down Google's door for a Gmail account. A lot of users thought Gmail's unique features - a groundbreaking user interface and a ton of storage space - were worth the risks. As I argued at the time, Gmail's critics missed three important points. First, no one other than the recipients ever saw the contents of users' emails. It's true that emails are "scanned" by Google's ad-placement algorithms, but it's equally true that the emails are "scanned" by Google's spam filter. Few object to that. Second, users appreciate that Gmail's targeted ads tend to be less numerous and more relevant than traditional banner ads. Finally, the increased revenues generated by Google's ad-targeting program helped finance the expense of providing each user with a then-astronomical amount of free storage space.

A more recent privacy controversy involved Facebook's news feed. In September 2006, Facebook announced that it would begin offering a feature that helped users track changes to their friends' online profiles. Within days of its introduction, 700,000 Facebook users had joined a group opposing the new feature. Facebook scrambled to add additional privacy controls, giving users a say over which categories of information would appear in the news feed. But the company refused to back down on the basic concept, and the feature remained on by default. As with Gmail, Facebook users' outrage dissipated once they got the opportunity to try the news feed for themselves. They discovered that they liked the ability to efficiently keep up with their friends' activities on the site. Indeed, the feature has proven so popular that MySpace recently unveiled a news feed-like feature of its own, although it has been more cautious about deploying it.

Too often, scholars make the mistake of treating privacy as a one-dimensional attribute, with more privacy always being better than less. But some amount of information sharing is almost always a prerequisite to providing valuable services to users. Total privacy would often mean total isolation from the Internet, and therefore no online services at all.

In reality, privacy involves a series of trade-offs. In any online interaction, users give up a certain amount of personal information in exchange for something of value. The goal, therefore, should not be to maximize "privacy" in the abstract, but to maximize users' knowledge and control over privacy decisions so they can maximize their welfare as they define it for themselves. Legal rules that categorically prohibit online services from retaining particular types of information or from using it in particular ways ultimately may harm users by preventing experimentation that will help both users and online providers learn how privacy ought to work.

Similar experimentation with government data collection and surveillance is not nearly so welcome. Privacy advocates and consumers are right to be worried that data collected in the private sector will be converted to other uses, up to and including massive data-collection programs like the ill-fated Total Information Awareness program. TIA was defunded in 2003, but the dream of using Americans' data for security lives on, even though mass surveillance and predictive data mining are ineffective at discovering terrorism. These concerns, however, are not a reason to regulate innovators in the voluntary sector. Rather, there should be a united push to restore or establish limits on government data collection and use that are appropriate for the online world we live in today.

As to private companies, regulators should remember that different users will have different levels of concern about the ways their information might be used. Some will be very selective about the information they're willing to share. Others may be willing to share more information in exchange for better or more convenient online services. A well-functioning market will allow companies to experiment with a variety of policies and allow consumers to choose the ones they prefer.

Facebook's Beacon program is an example of a privacy tradeoff that most users did not find attractive. Customers spoke, and Facebook responded to their complaints. Whatever problems Beacon caused at the time, they did far less damage than new rules preventing further experimentation would do.