Cato at Liberty


May 1, 2018 4:23PM

Has Ray Ozzie Solved the “Going Dark” Problem?

By Julian Sanchez


Has the thorny problem of providing law enforcement with access to encrypted data without fatally compromising user security finally been solved?  That's the bold thesis advanced by a piece at Wired that garnered an enormous amount of attention last week by suggesting that renowned computer scientist Ray Ozzie, formerly a top engineer at Microsoft, had developed an "exceptional access" proposal that "satisfies both law enforcement and privacy purists."  Alas, other experts have been conspicuously less enthusiastic, with good reason.  It's worth saying a few words about why.

In one sense, the attention garnered by Ozzie's proposal, which he's dubbed "CLEAR," is somewhat odd: There isn't much here that's fundamentally new.  A few novel wrinkles notwithstanding, Ozzie's proposal is a variant on the very old idea of "key escrow," which involves device manufacturers holding on to either a master key or a database of such keys that can be used to decrypt data at the request of law enforcement.  The proposal is limited to providing "exceptional access" to data "at rest" on a device, such as a smartphone, in the physical custody of law enforcement.  Ozzie suggests that when a user creates a passcode to encrypt the data on a device, the passcode itself should be encrypted using the device manufacturer's public key, which is hardcoded into the cryptographic processor embedded in the device.  Then, when law enforcement wishes to access such a device in its possession, pursuant to a valid court order, investigators activate a special law-enforcement mode that permanently renders the device inoperable (or "bricks" it) and displays the encrypted user passcode.  This can then be sent to the manufacturer, which, upon validating that it has received a legitimate request from a real law enforcement agency with a valid warrant, uses its own private key (corresponding to the public key baked into the phone) to decrypt the original passcode and provide it to the requesting agency.
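
Ozzie's write-up doesn't prescribe an implementation, but the flow described above can be sketched in a few lines of Python.  The RSA parameters here are toy values chosen for readability, wildly insecure and purely illustrative of the escrow idea; the function names are invented for this sketch:

```python
# Toy sketch of the CLEAR escrow flow.  Textbook RSA with tiny primes --
# utterly insecure, for illustration only.  A real device would use a
# hardened cryptoprocessor and production-grade key sizes.

# Manufacturer's keypair (the public half is baked into every device).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, held by the manufacturer

def device_enroll(passcode):
    """On passcode creation, the device escrows it under the public key."""
    return [pow(ord(ch), e, n) for ch in passcode]  # per-character, since n is tiny

def law_enforcement_mode(device):
    """Bricks the device and reveals the escrowed (still encrypted) passcode."""
    device["operable"] = False
    return device["escrowed"]

def manufacturer_decrypt(ciphertext):
    """After validating the warrant, the manufacturer recovers the passcode."""
    return "".join(chr(pow(c, d, n)) for c in ciphertext)

device = {"operable": True, "escrowed": device_enroll("8675")}
blob = law_enforcement_mode(device)   # the device is now permanently bricked
print(manufacturer_decrypt(blob))     # prints the original passcode: 8675
```

Note that nothing in the flow is cryptographically novel; the entire scheme's security reduces to keeping `d` (the manufacturer's private key) out of adversaries' hands.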

In its broad outlines, this isn't fundamentally much different from proposals that crypto experts have considered and rejected for decades.  So why has CLEAR (and Wired's article on it) generated so much interest?  A substantial part of it simply comes down to who's offering it: Ozzie has a stellar reputation, and is offering a solution where most security experts have simply been urging governments to abandon the idea of building a police backdoor into cryptosystems.  This feeds into the seemingly widespread conviction among law enforcement types that computer scientists are really just ideologically opposed to such backdoors, and stubbornly refusing to work on developing technical solutions.  Many, moreover, may not really understand why experts tend to say such backdoors can't be built securely, and therefore believe that Ozzie's proposal does represent something fundamentally new: The "golden key" that all those other experts pretended couldn't exist.  But, of course,  cryptographers have long known a system along these lines could be built: That was never the technical problem with law enforcement backdoors. (There's perhaps some fairness to the complaint that privacy advocates haven't always been sufficiently clear about this in arguments aimed at a mass audience, which may contribute to the impression that Ozzie's proposal represents some significant breakthrough.)  Rather, the deep problem—or rather, one of several deep problems—has always been ensuring the security of that master key, or key database. 

That brings us to the second reason for the appeal of Ozzie's proposal, which is essentially a rhetorical point rather than a novel technical one. Software developers and device manufacturers, Ozzie notes, already hold "master keys" of a sort: The cryptographic signing keys used to authenticate new software updates.  The way your iPhone already confirms that a new version of iOS is really a legitimate update from Apple, and not some malicious code written by hackers impersonating them, superficially resembles Ozzie's proposal in reverse.  Apple uses its own private key to sign the update, and your phone confirms its authenticity using the corresponding public key baked into its cryptoprocessor.  That all-important private key is typically kept on an expensive bit of machinery called a Hardware Security Module (HSM), designed to make it possible (in theory) to use the secret private key to authenticate new updates, but impossible to copy the key itself off the device.  The existence of that key does, of course, represent a security risk of a sort, but one we generally consider acceptable—far less risky than leaving users with no good way to distribute authenticated security updates when bugs and vulnerabilities are discovered. Thus the argument, in effect, becomes: If it's not a wildly unacceptable risk for developers to maintain signing keys stored on an HSM, then surely it's equally acceptable to similarly maintain a "golden key" for law enforcement use.
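
The signing-key mechanism Ozzie leans on looks something like this in miniature—again a toy RSA sketch with invented names, not Apple's actual update mechanism: the manufacturer signs a hash of the update with its private key, and the device checks the signature using only the public key in its cryptoprocessor:

```python
import hashlib

# Toy RSA signing sketch (tiny, insecure parameters -- illustration only).
p, q = 61, 53
n = p * q
e = 17                                # public exponent, baked into the device
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent, kept on the HSM

def digest(update: bytes) -> int:
    """Reduce a SHA-256 hash of the update into the toy modulus."""
    return int.from_bytes(hashlib.sha256(update).digest(), "big") % n

def manufacturer_sign(update: bytes) -> int:
    """Runs inside the HSM: the private key signs but never leaves the module."""
    return pow(digest(update), d, n)

def device_verify(update: bytes, signature: int) -> bool:
    """The device needs only the public key to check authenticity."""
    return pow(signature, e, n) == digest(update)

update = b"os-update-security-patch"
sig = manufacturer_sign(update)
print(device_verify(update, sig))                # True: legitimate update
print(device_verify(update, (sig + 1) % n))      # False: tampered signature rejected
```

The asymmetry worth noticing is the direction of the secret's use: the signing key produces public artifacts (signed updates shipped to everyone), whereas an escrow decryption key would produce private ones.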

This is, however, misleading in at least a couple of ways.  First, as Stanford's Riana Pfefferkorn has argued in a recent paper, the use case for signing keys—used to authenticate new software releases on perhaps a monthly basis—is very different from that of a decryption key that would almost certainly need to be accessed by human beings multiple times each day.  An asset becomes inherently harder to secure the more routinely it must be accessed by legitimate users.  Second, and perhaps more importantly, the value to an adversary of a decryption key is much higher, because it has far greater potential for clandestine use. The risks associated with stolen signing keys—and we should pause to note that signing keys do indeed get stolen on occasion—are mitigated by the fact that misuse of such keys is intrinsically public.  A falsely authenticated piece of malicious code is only useful to an adversary if that code is then deployed on a target's device, and there are a variety of mechanisms by which such a compromise is likely to be detected, at which point the key is revoked and its value to the adversary comes to an end.  Decrypting stolen data, by contrast, has no such inherently public component.  One can attempt to design an exceptional access system in a way that forces publicity about its use, but without getting too mired in the technical weeds, the fact that decryption doesn't inherently require publicity means that in most cases this just gives an attacker the secondary problem of spoofing confirmation that their decryption has been publicly logged.  Ozzie's suggestion that law enforcement decryption should permanently brick the device being unlocked is one way of making it more difficult for an attacker to covertly extract data, but as Pfefferkorn notes, this "solution" has significant downsides of its own given that many smartphones are quite expensive pieces of technology.

Why does it matter that a decryption key, with its potential for clandestine (and therefore repeated) use, is more valuable to an adversary?  Because security is in many ways as much about economics as the fine points of engineering. The same security system that would be excessive for the purpose of safeguarding my private residence might be pathetically inadequate for a bank or an art museum, for the obvious reason that there's nothing in my house a rational adversary would dedicate hundreds of thousands of dollars' worth of resources to stealing, while a heist of the bank or the museum might well yield returns that would justify such an investment.  No security is perfect: Adequate security is security that would cost an attacker more to breach than the value they can expect to realize from that breach.  Therefore security that is adequate for an asset that is likely to be rendered useless as a result of being deployed is by no means guaranteed to be adequate for an asset that might be used many times undetected.
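
That economic asymmetry can be made concrete with some back-of-the-envelope arithmetic.  Every number below is invented for illustration; the point is only the structure of the comparison—a key whose misuse is public is worth roughly one use, while a key that can be used covertly is worth many:

```python
def key_value_to_attacker(payoff_per_use, expected_uses_before_revocation):
    """A rational attacker values a stolen key at payoff times expected uses."""
    return payoff_per_use * expected_uses_before_revocation

# Hypothetical figures, purely illustrative.
signing_key = key_value_to_attacker(
    payoff_per_use=1_000_000,           # one falsely signed malware deployment
    expected_uses_before_revocation=1,  # misuse is public, so the key is revoked fast
)
escrow_key = key_value_to_attacker(
    payoff_per_use=50_000,                # one covertly decrypted device
    expected_uses_before_revocation=500,  # clandestine use can go undetected
)

def attack_is_rational(key_value, cost_to_steal):
    """'Adequate' security means the breach costs more than it yields."""
    return key_value > cost_to_steal

cost_to_breach_hsm = 5_000_000  # hypothetical cost of compromising the vault
print(attack_is_rational(signing_key, cost_to_breach_hsm))  # False
print(attack_is_rational(escrow_key, cost_to_breach_hsm))   # True
```

On these (made-up) numbers, the very same vault that deters theft of a signing key fails to deter theft of an escrow key, even though each individual use of the escrow key is worth far less.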

There are also, of course, a host of other familiar objections one could raise to this or any other backdoor system.  If the United States government gets such "exceptional access," shouldn't we expect other, nastier regimes to demand the same?  Won't even moderately sophisticated criminals simply cease relying on compromised hardware-based encryption and instead add a layer of software-based encryption sans backdoor, rendering the whole elaborate scheme ineffective?

Even if we restrict ourselves to the narrower security question, however, Ozzie's proposal seems susceptible to the same response other key escrow systems face, and that response is as much about economics as technology: Any master key, any centralized mechanism for compromising millions of individual devices, is too valuable to reliably secure against the sort of adversaries likely to be most interested in acquiring it.   
