The Discrete Gaussian for Differential Privacy


Clément Canonne
Gautam Kamath (https://orcid.org/0000-0003-0048-2559)
Thomas Steinke

Abstract

A key tool for building differentially private systems is adding Gaussian noise to the output of a function evaluated on a sensitive dataset. Unfortunately, using a continuous distribution presents several practical challenges. First and foremost, finite computers cannot exactly represent samples from continuous distributions, and previous work has demonstrated that seemingly innocuous numerical errors can entirely destroy privacy. Moreover, when the underlying data is itself discrete (e.g., population counts), adding continuous noise makes the result less interpretable.


With these shortcomings in mind, we introduce and analyze the discrete Gaussian in the context of differential privacy. Specifically, we show, both theoretically and experimentally, that adding discrete Gaussian noise provides essentially the same privacy and accuracy guarantees as adding continuous Gaussian noise. We also present a simple and efficient algorithm for exact sampling from this distribution. Together, these results demonstrate the discrete Gaussian's suitability for privately answering counting queries and, more generally, low-sensitivity integer-valued queries.
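To convey the flavor of the sampling procedure, the sketch below draws from the discrete Gaussian N_Z(0, sigma^2), whose probability mass at integer x is proportional to exp(-x^2 / (2 sigma^2)), by rejection sampling from a discrete Laplace proposal. This is only an illustrative floating-point sketch under our own assumptions (the function name sample_discrete_gaussian and the use of Python's random module are not from the article); the article's exact sampler has the same rejection structure but works entirely in integer and rational arithmetic, precisely to avoid the numerical issues described above.

import math
import random

def sample_discrete_gaussian(sigma, rng=random):
    """Draw one sample from the discrete Gaussian N_Z(0, sigma^2),
    i.e. P(X = x) proportional to exp(-x^2 / (2 sigma^2)) over the integers,
    by rejection sampling from a discrete Laplace proposal.
    Illustrative floating-point sketch only, not an exact sampler."""
    if sigma <= 0:
        raise ValueError("sigma must be positive")
    t = math.floor(sigma) + 1          # scale of the discrete Laplace proposal
    p = 1.0 - math.exp(-1.0 / t)       # success probability of the geometric draws
    while True:
        # Discrete Laplace with P(Y = y) proportional to exp(-|y|/t), obtained as
        # the difference of two i.i.d. geometric variables supported on {0, 1, 2, ...}.
        g1 = int(math.floor(math.log(1.0 - rng.random()) / math.log(1.0 - p)))
        g2 = int(math.floor(math.log(1.0 - rng.random()) / math.log(1.0 - p)))
        y = g1 - g2
        # Accept with probability exp(-(|y| - sigma^2/t)^2 / (2 sigma^2)),
        # which reweights the Laplace proposal into the Gaussian target.
        accept_prob = math.exp(-((abs(y) - sigma**2 / t) ** 2) / (2.0 * sigma**2))
        if rng.random() < accept_prob:
            return y

In a counting-query application, one would release true_count + sample_discrete_gaussian(sigma), with sigma calibrated to the query's sensitivity and the desired privacy parameters.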

Article Details

How to Cite
Canonne, Clément, Gautam Kamath, and Thomas Steinke. 2022. “The Discrete Gaussian for Differential Privacy”. Journal of Privacy and Confidentiality 12 (1). https://doi.org/10.29012/jpc.784.
