Reconstruction Attacks on Aggressive Relaxations of Differential Privacy

Prottay Protivash
https://orcid.org/0009-0004-7686-7599
John Durrell
https://orcid.org/0009-0009-1810-1831
Daniel Kifer
https://orcid.org/0000-0002-4611-7066
Zeyu Ding
https://orcid.org/0000-0003-1132-7079
Danfeng Zhang
https://orcid.org/0000-0003-1942-6872

Abstract

Differential privacy is a widely accepted formal privacy definition that allows aggregate information about a dataset to be released while controlling privacy leakage for individuals whose records appear in the data. Because of the unavoidable tension between privacy and utility, many works have tried to relax the requirements of differential privacy to achieve greater utility.
One class of relaxation, which is gaining support outside the privacy community, is embodied by the definitions of individual differential privacy (IDP) and bootstrap differential privacy (BDP). Classical differential privacy defines a set of neighboring database pairs and achieves its privacy guarantees by requiring that each pair of neighbors be nearly indistinguishable to an attacker. The privacy definitions we study, however, aggressively reduce the set of neighboring pairs that are protected.
To a non-expert, IDP and BDP can seem very appealing: they echo the same kinds of privacy explanations associated with differential privacy and experimentally achieve dramatically better utility. However, we show that they allow a significant portion of the dataset to be reconstructed using algorithms that have arbitrarily low privacy loss under their privacy accounting rules.
With the non-expert in mind, we demonstrate these attacks using the preferred mechanisms of these privacy definitions. In particular, we design a set of queries that, when protected by these mechanisms with high noise settings (i.e., with claims of very low privacy loss), yield more precise information about the dataset than if they were not protected at all. The specific attacks here can be defeated, and we give examples of countermeasures. However, the defenses are either equivalent to using differential privacy or are ad hoc methods tailored specifically to the attack (with no guarantee that they protect against other attacks). Thus, the defenses emphasize the deficiencies of these privacy definitions.
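
To make the general failure mode concrete, below is a minimal Python sketch (an illustration of the phenomenon, not the paper's actual construction) of how a mechanism that calibrates Laplace noise to the data-dependent (local) sensitivity of a query, as IDP-style mechanisms do, can answer carefully chosen threshold queries exactly, with a claimed privacy loss of zero for those queries, thereby pinning down the true count of a value in the dataset. The dataset, query family, and function names are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def count_at_least(D, value, t):
        # Indicator query: does `value` occur at least t times in D?
        return float(np.sum(D == value) >= t)

    def local_sensitivity(f, D, domain):
        # Largest change in f(D) when any single record of D is replaced
        # by any domain value -- the data-dependent sensitivity that an
        # IDP-style mechanism calibrates its noise to.
        base = f(D)
        ls = 0.0
        for i in range(len(D)):
            for v in domain:
                D2 = D.copy()
                D2[i] = v
                ls = max(ls, abs(f(D2) - base))
        return ls

    def idp_style_release(f, D, domain, eps):
        ls = local_sensitivity(f, D, domain)
        if ls == 0.0:
            return f(D)  # no noise added; claimed privacy loss is zero
        return f(D) + rng.laplace(scale=ls / eps)

    D = np.array([0, 0, 1, 1, 1, 1, 2])  # toy dataset; the value 1 occurs 4 times
    domain = [0, 1, 2]

    # Sweep thresholds. Any query whose indicator cannot be flipped by
    # changing a single record has local sensitivity 0 at D, so it is
    # answered exactly; only thresholds adjacent to the true count are
    # noisy. The exact answers pin the count of value 1 to within +/-1.
    for t in range(1, len(D) + 1):
        ans = idp_style_release(lambda X: count_at_least(X, 1, t), D, domain, eps=0.01)
        print(t, ans)

In this sketch, thresholds 1 through 3 return exactly 1.0 and thresholds 6 and 7 return exactly 0.0 with no noise; only thresholds 4 and 5 receive noise. Sweeping such queries across all values reconstructs the dataset's histogram almost exactly, even though the local-sensitivity accounting reports near-zero total privacy loss.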

Article Details

How to Cite
Protivash, Prottay, John Durrell, Daniel Kifer, Zeyu Ding, and Danfeng Zhang. 2024. “Reconstruction Attacks on Aggressive Relaxations of Differential Privacy”. Journal of Privacy and Confidentiality 14 (3). https://doi.org/10.29012/jpc.871.