Controversial initiatives have included biometric face cameras, wiretap enhancements, invasive computer‐assisted airline passenger screening, escalated e‐mail monitoring fostered by the USA Patriot Act, and the Pentagon’s Total Information Awareness data‐mining project (since renamed “Terrorism Information Awareness,” or TIA). Even a national ID card was proposed.
In the right circumstances, data‐mining technologies and “biometrics” — such as voice prints, retina, iris, and face scanners, digitized fingerprints, and even implantable chips — can benefit us. That’s because data‐mining and biometrics, at least in principle, are about enhancing convenience, service, authentication, and individual security more than they are about invading privacy. Biometrics, for example, promises increased privacy and security by guarding against identity theft in our myriad marketplace transactions. We’ll see their use in cell phones, laptops, car doors, doorknobs, and office keys — basically everywhere. They can increase security in online commerce, help locate a lost youngster, relay medical information to doctors, and much more.
But inherently “invasive” technologies like these can threaten fundamental values of privacy and liberty if misused. No one wants to be treated like a human bar code by the authorities, or monitored around the clock by the Homeland Security Department. Thus, we need a framework by which to distinguish appropriate from inappropriate uses of surveillance‐enabling technologies.
The most pressing threat to liberty is a compulsory database encompassing everyone. Examples include a mandatory National ID with biometric identifiers, or involuntary data‐mining programs like TIA that would permit real‐time monitoring of our whereabouts, movements, and transactions. This is the Big Brother scenario: constant surveillance or harassment of citizens unrelated to addressing terrorist threats. And no one can opt out.
Compulsory databases would undermine the many potential benefits of authentication technologies. If government is hell‐bent on assembling and mining massive databases of our credit card purchases, car rentals, library books, airline ticket purchases, and so on, then banks, airlines, hotels, Internet service providers, and other private businesses we deal with have no choice but to routinely transfer our private information to the government against our wishes. They cannot promise to safeguard our privacy as they otherwise could.
A second, less sweeping threat is a partial governmental database containing details on criminals and suspects rather than the general population. An example would be biometric face recognition camera systems deployed in public places. Individuals are observed as they pass — which is creepy — but presumably only to see if they match a face already in the underlying database. Allegedly, the substantive information collection — that pertaining to the criminals — has already taken place under appropriate Fourth Amendment procedures, and no data are ever collected on passersby not already in the database. However, many doubt that governments can be trusted to discard incidental data collected on innocents. Indeed, the needed safeguards against abuse of such systems do not yet exist.
To safeguard civil liberties in the new surveillance state enabled by digital technologies, there are three basic requirements: (1) avoid mandatory databases or any form of National ID, both because they violate the Fourth Amendment and because government dominance over the evolution of these technologies would effectively destroy the private sector’s ability to offer us any privacy guarantees at all; (2) ensure Fourth Amendment protections even for surveillance in open, public places; and (3) avoid the mixing of public (compulsory) and private (voluntary) databases as new technologies emerge and proliferate.
While people have alternatives to dealing with private parties that snoop too much, they have little protection against an overly suspicious government. Thus, government must not have access to our private information without clearing the appropriate legal hurdles. On the other side of the coin, instead of piggybacking on government‐mandated information, private industry should be required to generate its own databases, for purposes limited by the market’s twin engines: consumer choice and consumer rejection.
Countless private uses of biometrics offer the opportunity for extraordinary security by preventing others from posing as us. This is where the new contingent of “privacy invading” technologies can shine. But if private applications of biometrics and data‐mining merely piggyback on data gleaned by government coercion, they will give the entire industry a black eye, and make it impossible to defend the industry from regulation.
In the new “surveillance” state, or whatever we call the rise of government‐run biometrics, cameras, compulsory IDs and data‐mining, keeping public and private data separate is critical for the health of our civil liberties, our personal privacy, and even for the health of industries specializing in authentication technologies and techniques. New technologies always bring risks. But even the risks of a “database nation” are controllable if we adhere to constitutional principle. Orwell’s Big Brother need not win.