Title: Differential Privacy for Probabilistic Systems
Authors: Michael Carl Tschantz, Anupam Datta, Dilsun Kaynar
Publication Date: May 14, 2009
Differential privacy is a promising approach to privacy-preserving data analysis. There is now a well-developed theory of differentially private functions. Despite recent work on implementing database systems that aim to provide differential privacy, and on distributed systems that use differential privacy as a basis for higher-level security properties, there is no formal theory of differential privacy for systems. In this paper, we formulate precise definitions of differential privacy within a formal model of probabilistic systems, relate these definitions to the original definitions, and develop a proof technique based on an unwinding relation for establishing that a given system satisfies these privacy definitions. We illustrate the proof technique on a representative example motivated by an implemented system.
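For context, the standard notion of differential privacy for functions (due to Dwork et al.), which the paper lifts to probabilistic systems, can be stated as follows; the symbols below are the conventional ones and are not taken from this abstract:

```latex
% A randomized mechanism M is \epsilon-differentially private if, for all
% pairs of databases D_1, D_2 differing in at most one record (neighbors)
% and for every measurable set S of outputs,
\Pr[\,M(D_1) \in S\,] \;\le\; e^{\epsilon}\,\Pr[\,M(D_2) \in S\,].
```

Intuitively, the presence or absence of any single individual's record changes the probability of every observable outcome by at most a factor of $e^{\epsilon}$, so observations reveal little about that individual.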
Full Report: CMU-CyLab-09-008