Probabilistic relational reasoning for differential privacy
- 18 January 2012
- journal article
- research article
- Published by Association for Computing Machinery (ACM) in ACM SIGPLAN Notices
- Vol. 47 (1), 97-110
- https://doi.org/10.1145/2103621.2103670
Abstract
Differential privacy is a notion of confidentiality that protects the privacy of individuals while allowing useful computations on their private data. Deriving differential privacy guarantees for real programs is a difficult and error-prone task that calls for principled approaches and tool support. Approaches based on linear types and static analysis have recently emerged; however, an increasing number of programs achieve privacy using techniques that cannot be analyzed by these approaches. Examples include programs that aim for weaker, approximate differential privacy guarantees, programs that use the Exponential mechanism, and randomized programs that achieve differential privacy without using any standard mechanism. Providing support for reasoning about the privacy of such programs has been an open problem. We report on CertiPriv, a machine-checked framework for reasoning about differential privacy built on top of the Coq proof assistant. The central component of CertiPriv is a quantitative extension of a probabilistic relational Hoare logic that enables one to derive differential privacy guarantees for programs from first principles. We demonstrate the expressiveness of CertiPriv using a number of examples whose formal analysis is out of the reach of previous techniques. In particular, we provide the first machine-checked proofs of correctness of the Laplacian and Exponential mechanisms and of the privacy of randomized and streaming algorithms from the recent literature.
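For readers unfamiliar with the Laplacian mechanism whose correctness the paper machine-checks, the following is a minimal illustrative sketch in Python. It is not the CertiPriv Coq development; the function and parameter names (`laplace_mechanism`, `sensitivity`, `epsilon`) are ours, and the noise is drawn by inverse-CDF sampling from a Laplace distribution with scale sensitivity/epsilon.

```python
import math
import random

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Perturb true_answer with Laplace noise of scale sensitivity/epsilon.

    For a query with the given L1 sensitivity, releasing the noisy answer
    satisfies epsilon-differential privacy (the guarantee CertiPriv proves
    formally for this mechanism).
    """
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    # Inverse-CDF sample from Laplace(0, scale)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_answer + noise

# Example: a counting query (sensitivity 1) answered with epsilon = 0.5
noisy_count = laplace_mechanism(true_answer=42, sensitivity=1.0, epsilon=0.5)
```

Smaller values of `epsilon` give stronger privacy at the cost of larger noise, since the noise scale grows as sensitivity/epsilon.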
This publication has 27 references indexed in Scilit:
- Formal Verification of Differential Privacy for Interactive Systems (Extended Abstract). Electronic Notes in Theoretical Computer Science, 2011
- Computer-Aided Security Proofs for the Working Cryptographer. Lecture Notes in Computer Science, 2011
- A firm foundation for private data analysis. Communications of the ACM, 2011
- Computational Differential Privacy. Lecture Notes in Computer Science, 2009
- Distributed Private Data Analysis: Simultaneously Solving How and What. Lecture Notes in Computer Science, 2008
- Differential Privacy. Lecture Notes in Computer Science, 2006
- Calibrating Noise to Sensitivity in Private Data Analysis. Lecture Notes in Computer Science, 2006
- Probabilistic guarded commands mechanized in HOL. Theoretical Computer Science, 2005
- Secure Information Flow as a Safety Problem. Lecture Notes in Computer Science, 2005
- Probabilistic Extensions of Process Algebras. Elsevier BV, 2001