Journal of Privacy and Confidentiality
https://journalprivacyconfidentiality.org/index.php/jpc
<p>The <em>Journal of Privacy and Confidentiality</em> is an open-access multi-disciplinary journal whose purpose is to facilitate the coalescence of research methodologies and activities in the areas of privacy, confidentiality, and disclosure limitation. The JPC seeks to publish a wide range of research and review papers, not only from academia, but also from government (especially official statistical agencies) and industry, and to serve as a forum for exchange of views, discussion, and news.</p>
Society for Privacy and Confidentiality Research
en-US
Journal of Privacy and Confidentiality
2575-8527
<p>Copyright is retained by the authors. By submitting to this journal, the author(s) license the article under the <a href="https://creativecommons.org/licenses/by-nc-nd/4.0/">Creative Commons License – Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)</a>, unless choosing a more lenient license (for instance, public domain). For situations not allowed under CC BY-NC-ND, short sections of text, not to exceed two paragraphs, may be quoted without explicit permission provided that full credit, including © notice, is given to the source.</p> <p>Authors of articles published by the journal grant the journal the right to store the articles in its databases for an unlimited period of time and to distribute and reproduce the articles electronically.</p>
-
Reconstruction Attacks on Aggressive Relaxations of Differential Privacy
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/871
<p>Differential privacy is a widely accepted formal privacy definition that allows aggregate information about a dataset to be released while controlling privacy leakage for individuals whose records appear in the data. Due to the unavoidable tension between privacy and utility, many works have tried to relax the requirements of differential privacy to achieve greater utility.<br>One class of relaxation, which is gaining support outside the privacy community, is embodied by the definitions of individual differential privacy (IDP) and bootstrap differential privacy (BDP). Classical differential privacy defines a set of neighboring database pairs and achieves its privacy guarantees by requiring that each pair of neighbors be nearly indistinguishable to an attacker. The privacy definitions we study, however, aggressively reduce the set of neighboring pairs that are protected.<br>To a non-expert, IDP and BDP can seem very appealing, as they echo the same types of privacy explanations that are associated with differential privacy and also experimentally achieve dramatically better utility. However, we show that they allow a significant portion of the dataset to be reconstructed using algorithms that have arbitrarily low privacy loss under their privacy accounting rules.<br>With the non-expert in mind, we demonstrate these attacks using the preferred mechanisms of these privacy definitions. In particular, we design a set of queries that, when protected by these mechanisms with high noise settings (i.e., with claims of very low privacy loss), yield more precise information about the dataset than if they were not protected at all. The specific attacks here can be defeated, and we give examples of countermeasures. However, the defenses are either equivalent to using differential privacy or are ad hoc methods tailored specifically to the attack (with no guarantee that they protect against other attacks). Thus, the defenses emphasize the deficiencies of these privacy definitions.</p>
Prottay Protivash
John Durrell
Daniel Kifer
Zeyu Ding
Danfeng Zhang
Copyright (c) 2024 Prottay Protivash, John Durrell, Daniel Kifer, Zeyu Ding, Danfeng Zhang
https://creativecommons.org/licenses/by-nc-nd/4.0
2024-08-27
2024-08-27
14(3)
10.29012/jpc.871
-
Foreword for the Collection of Papers from the Workshop on the Analysis of Census Noisy Measurement Files and Differential Privacy
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/934
<p>The 2022 Workshop on the Analysis of Census Noisy Measurement Files and Differential Privacy brought together research experts from many domains of social sciences, demography, public policy, statistics, and computer science to address key challenges in the use of the differentially private Census noisy measurement files to support social research and policy decisions.</p>
Jörg Drechsler
Ruobin Gong
Weijie Su
Linjun Zhang
Copyright (c) 2024 Jörg Drechsler, Ruobin Gong, Weijie Su, Linjun Zhang
https://creativecommons.org/licenses/by-nc-nd/4.0
2024-08-27
2024-08-27
14(3)
10.29012/jpc.934
-
Geographic Spines in the 2020 Census Disclosure Avoidance System
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/875
<p>The 2020 Census Disclosure Avoidance System (DAS) is a formally private mechanism that first adds independent noise to cross-tabulations for a set of pre-specified hierarchical geographic units, known as the geographic spine. After post-processing these noisy measurements, DAS outputs a formally private database with fields indicating location on the standard census geographic spine, which is defined by the United States as a whole, states, counties, census tracts, block groups, and census blocks. This paper describes how the geographic spine used internally within DAS to define the initial noisy measurements impacts the accuracy of the output database. Specifically, tabulations tend to be most accurate for geographic areas that both 1) can be derived by aggregating geographic units above the block level of the internal spine, and 2) are closer to the geographic units of the internal spine. After describing the accuracy tradeoffs relevant to the choice of internal DAS geographic spine, we provide the settings used to define the 2020 Census production DAS runs.</p>
Ryan Cumings-Menon
Robert Ashmead
Daniel Kifer
Philip Leclerc
Jeffrey Ocker
Michael Ratcliffe
Pavel Zhuravlev
John Abowd
Copyright (c) 2024 Ryan Cumings-Menon, John M. Abowd, Robert Ashmead, Daniel Kifer, Philip Leclerc, Jeffrey Ocker, Michael Ratcliffe, Pavel Zhuravlev
https://creativecommons.org/licenses/by-nc-nd/4.0
2024-08-27
2024-08-27
14(3)
10.29012/jpc.875
-
Perspective: Better Privacy Theorists for Better Data Stewards
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/865
<p>The U.S. Census Bureau's use of differential privacy (DP) fundamentally changed how academic DP researchers perform outreach with official statistics stakeholders. In this perspectives piece, I propose ways for us in this community to improve those processes by being more receptive to the practical concerns raised by building DP systems. First, I discuss how academic DP work fundamentally differs from the policy decisions needed to implement DP systems and why this distinction has political consequences. Through examples and discussions from workshops, I show how the DP community largely asked applied stakeholders to communicate on DP's theoretical terms, when such an ask foreclosed important considerations relevant for the Census Bureau's policy problems. Second, I discuss how existing polarization between theoretical and empirical privacy researchers unintentionally seeped into the ways we communicated about DP, pointing to why both perspectives are necessary in different ways for policy conversations. Finally, I conclude by discussing how these issues are not unique to data privacy work, but instead reflect structural problems in translating theoretical science into practice. These ideas are presented in service of a single goal: to ensure DP theory supports substantive, privacy-aware data processing and dissemination in practice for essential data curators.</p>
Jeremy Seeman
Copyright (c) 2024 Jeremy Seeman
https://creativecommons.org/licenses/by-nc-nd/4.0
2024-08-27
2024-08-27
14(3)
10.29012/jpc.865
-
Incompatibilities Between Current Practices in Statistical Data Analysis and Differential Privacy
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/872
<p>The authors discuss their experience applying differential privacy to a complex data set with the goal of enabling standard approaches to statistical data analysis. They highlight lessons learned and roadblocks encountered, distilling them into incompatibilities between current practices in statistical data analysis and differential privacy that go beyond issues that can be solved with a noisy measurements file. The authors discuss how overcoming these incompatibilities requires compromise and a change in either our approach to statistical data analysis or in differential privacy itself, a change that should be addressed head-on.</p>
Joshua Snoke
Claire McKay Bowen
Aaron R. Williams
Andrés F. Barrientos
Copyright (c) 2024 Joshua Snoke, Claire McKay Bowen, Aaron R. Williams, Andrés F. Barrientos
https://creativecommons.org/licenses/by-nc-nd/4.0
2024-08-27
2024-08-27
14(3)
10.29012/jpc.872
-
Launching the Society for Privacy and Confidentiality Research to Own the Journal of Privacy and Confidentiality
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/939
<p>We describe the launching of the Society for Privacy and Confidentiality Research (SPCR). SPCR is the new owner of the Journal of Privacy and Confidentiality, with the goal of ensuring a sustainable future for the Journal and continuing to publish the multidisciplinary, open-access journal that JPC has been since its founding in 2008.</p>
John M. Abowd
Cynthia Dwork
Alan F. Karr
Kobbi Nissim
Jerome Reiter
Aleksandra Slavković
Lars Vilhuber
Copyright (c) 2024 John M. Abowd, Cynthia Dwork, Alan F. Karr, Kobbi Nissim, Jerome Reiter, Aleksandra Slavković, Lars Vilhuber
https://creativecommons.org/licenses/by-nc-nd/4.0
2024-08-27
2024-08-27
14(3)
10.29012/jpc.939