Heterogeneous Differential Privacy

Mohammad Alaggan
Sébastien Gambs
Anne-Marie Kermarrec


The massive collection of personal data by personalization systems has made preserving the privacy of individuals increasingly difficult. Most approaches to preserving privacy in personalization systems address this issue uniformly across users, ignoring the fact that users have different privacy attitudes and expectations (even among their own personal data). In this paper, we propose to account for this non-uniformity of privacy expectations by introducing the concept of heterogeneous differential privacy. This notion captures both the variation of privacy expectations among users and across different pieces of information related to the same user. We also describe an explicit mechanism achieving heterogeneous differential privacy, which is a modification of the Laplace mechanism of Dwork, McSherry, Nissim and Smith. In a nutshell, this mechanism achieves heterogeneous differential privacy by manipulating the sensitivity of the function through a linear transformation on the input domain. Finally, we evaluate on real datasets the impact of the proposed mechanism with respect to a semantic clustering task. The results of our experiments demonstrate that heterogeneous differential privacy can account for different privacy attitudes while sustaining a good level of utility, as measured by the recall of the semantic clustering task.
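The idea of manipulating sensitivity via a linear transformation can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact construction: each coordinate is rescaled by a per-item privacy weight in [0, 1] (smaller weight means stronger privacy for that item), which shrinks that coordinate's contribution to the sensitivity, and then standard Laplace noise calibrated to the global privacy budget is added. All names and parameters here are assumptions for illustration.

```python
import numpy as np

def heterogeneous_laplace(x, weights, epsilon, sensitivity):
    """Hypothetical sketch of a heterogeneous Laplace-style mechanism.

    x           -- input vector (one value per item)
    weights     -- per-item privacy weights in [0, 1]; smaller = more private
    epsilon     -- global differential-privacy parameter
    sensitivity -- global L1 sensitivity of the unweighted function
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(weights, dtype=float)
    assert np.all((0.0 <= w) & (w <= 1.0)), "weights must lie in [0, 1]"
    # Linear transformation of the input domain: rescaling by w
    # shrinks each item's contribution to the sensitivity.
    shrunk = w * x
    # Standard Laplace noise calibrated to the global budget.
    noise = np.random.laplace(loc=0.0,
                              scale=sensitivity / epsilon,
                              size=x.shape)
    return shrunk + noise
```

Items with weight 1 receive the standard epsilon-differential-privacy guarantee, while items with smaller weights are perturbed relatively more by the same noise, yielding a stronger effective guarantee for them.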

How to Cite
Alaggan, Mohammad, Sébastien Gambs, and Anne-Marie Kermarrec. 2017. “Heterogeneous Differential Privacy”. Journal of Privacy and Confidentiality 7 (2). https://doi.org/10.29012/jpc.v7i2.652.