Manipulation Attacks in Local Differential Privacy

Albert Cheu
Adam Smith
Jonathan Ullman

Abstract

Local differential privacy is a widely studied restriction on distributed algorithms that collect aggregates about sensitive user data, and is now deployed in several large systems. We initiate a systematic study of a fundamental limitation of locally differentially private protocols: they are highly vulnerable to adversarial manipulation. While any algorithm can be manipulated by adversaries who lie about their inputs, we show that any noninteractive locally differentially private protocol can be manipulated to a much greater extent: when the privacy level is high, or the domain size is large, a small fraction of users in the protocol can completely obscure the distribution of the honest users' inputs. We also construct protocols that are optimally robust to manipulation for a variety of common tasks in local differential privacy. Finally, we give simple experiments validating our theoretical results, and demonstrating that protocols that are optimal without manipulation can have dramatically different levels of robustness to manipulation. Our results suggest caution when deploying local differential privacy and reinforce the importance of efficient cryptographic techniques for the distributed emulation of centrally differentially private mechanisms.
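The amplification the abstract describes is easy to see concretely. Below is a minimal sketch (not from the paper; the setup, parameter values, and function names are illustrative assumptions) using binary randomized response, a standard noninteractive locally differentially private protocol. Honest users randomize their bits before reporting, so the analyst must rescale the noisy average to recover an unbiased estimate; that same rescaling amplifies the influence of manipulators who skip the randomization and send a fixed report.

```python
import numpy as np

def randomized_response(bits, eps, rng):
    # Each user reports their true bit with probability e^eps / (e^eps + 1)
    # and the flipped bit otherwise; this is eps-locally differentially private.
    p = np.exp(eps) / (np.exp(eps) + 1.0)
    keep = rng.random(bits.shape) < p
    return np.where(keep, bits, 1 - bits)

def debiased_mean(reports, eps):
    # Invert the expected bias: E[report] = (2p - 1) * mu + (1 - p).
    p = np.exp(eps) / (np.exp(eps) + 1.0)
    return (reports.mean() - (1.0 - p)) / (2.0 * p - 1.0)

rng = np.random.default_rng(0)
n, eps, frac_bad = 100_000, 0.5, 0.05          # 5% of users are manipulators
true_bits = (rng.random(n) < 0.3).astype(int)  # honest population mean = 0.3

reports = randomized_response(true_bits, eps, rng)
print(f"honest estimate:   {debiased_mean(reports, eps):.3f}")

# Manipulators ignore their inputs and the protocol, all sending the report 1.
# Each corrupted report can shift the debiased estimate by up to
# (e^eps + 1) / (e^eps - 1) / n, roughly a 2/eps amplification for small eps.
reports[: int(frac_bad * n)] = 1
print(f"attacked estimate: {debiased_mean(reports, eps):.3f}")
```

With these (assumed) settings, 5% of users shift the estimated mean from about 0.30 to about 0.41, and the skew grows as eps shrinks, consistent with the claim that stronger privacy makes the protocol more vulnerable to manipulation.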

Article Details

How to Cite
Cheu, Albert, Adam Smith, and Jonathan Ullman. 2021. “Manipulation Attacks in Local Differential Privacy”. Journal of Privacy and Confidentiality 11 (1). https://doi.org/10.29012/jpc.754.
Section
TPDP 2019
