Journal of Privacy and Confidentiality https://journalprivacyconfidentiality.org/index.php/jpc <p>The <em>Journal of Privacy and Confidentiality</em>&nbsp;is an open-access multi-disciplinary journal whose purpose is to facilitate the coalescence of research methodologies and activities in the areas of privacy, confidentiality, and disclosure limitation. The JPC seeks to publish a wide range of research and review papers, not only from academia, but also from government (especially official statistical agencies) and industry, and to serve as a forum for exchange of views, discussion, and news.</p> Cornell University, ILR School en-US Journal of Privacy and Confidentiality 2575-8527 <p>Copyright is retained by the authors. By submitting to this journal, the author(s) license the article under the <a href="https://creativecommons.org/licenses/by-nc-nd/4.0/">Creative Commons License – Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)</a>, unless choosing a more lenient license (for instance, public domain). For situations not allowed under CC BY-NC-ND, short sections of text, not to exceed two paragraphs, may be quoted without explicit permission provided that full credit, including © notice, is given to the source.</p> <p>Authors of articles published by the journal grant the journal the right to store the articles in its databases for an unlimited period of time and to distribute and reproduce the articles electronically.</p> Expansion, perspectives, and challenges https://journalprivacyconfidentiality.org/index.php/jpc/article/view/777 <p>The present issue heralds a new phase for the <em>Journal</em>. In our ongoing efforts to provide a forum for a diverse community interested in privacy and confidentiality, we welcome a large expansion of the editorial board. We also introduce not one, but two new sections to the <em>Journal</em>. 
<em>Perspectives</em> are shorter articles, possibly somewhat opinionated, typically clustered around a common theme. The section on <em>Privacy Challenges</em> encompasses articles that describe practical implementations of privacy-preserving algorithms from a variety of competitions, starting with the NIST PSCR Challenge from 2018-2019. As in previous years, we also publish select contributions from the Workshop on Theory and Practice of Differential Privacy, which in 2019 was held in London, UK, as part of the ACM Conference on Computer and Communications Security (CCS).</p> Lars Vilhuber Copyright (c) 2021 Lars Vilhuber http://creativecommons.org/licenses/by-nc-nd/4.0 2021-02-03 2021-02-03 11 1 10.29012/jpc.777 Reflections on the Successes and Challenges of Research Data Centers in Canada and the U.S. https://journalprivacyconfidentiality.org/index.php/jpc/article/view/765 Ron Jarmin Copyright (c) 2021 Ron Jarmin http://creativecommons.org/licenses/by-nc-nd/4.0 2021-02-03 2021-02-03 11 1 10.29012/jpc.765 Microdata access and privacy https://journalprivacyconfidentiality.org/index.php/jpc/article/view/766 <p>Flourinoccinihilipilification</p> Felix Ritchie Copyright (c) 2021 Felix Ritchie http://creativecommons.org/licenses/by-nc-nd/4.0 2021-02-03 2021-02-03 11 1 10.29012/jpc.766 Comparative Study of Differentially Private Synthetic Data Algorithms from the NIST PSCR Differential Privacy Synthetic Data Challenge https://journalprivacyconfidentiality.org/index.php/jpc/article/view/748 <p>Differentially private synthetic data generation offers a recent solution for releasing analytically useful data while preserving the privacy of individuals in the data. In order to utilize these algorithms for public policy decisions, policymakers need an accurate understanding of these algorithms' comparative performance. Correspondingly, data practitioners also require standard metrics for evaluating the analytic qualities of the synthetic data.
In this paper, we present an in-depth evaluation of several differentially private synthetic data algorithms using actual differentially private synthetic data sets created by contestants in the recent National Institute of Standards and Technology Public Safety Communications Research (NIST PSCR) Division's "Differential Privacy Synthetic Data Challenge." We offer analyses of these algorithms based on both the accuracy of the data they create and their usability by potential data providers. We frame the methods used in the NIST PSCR data challenge within the broader differentially private synthetic data literature. We implement additional utility metrics, including two of our own, on the differentially private synthetic data and compare mechanism utility across three categories. Our comparative assessment of the differentially private data synthesis methods and the quality metrics shows their relative usefulness, general strengths and weaknesses, and preferred choices of algorithms and metrics. Finally, we describe the implications of our evaluation for policymakers seeking to implement differentially private synthetic data algorithms in future data products.</p> Claire McKay Bowen Joshua Snoke Copyright (c) 2021 Joshua Snoke, Claire McKay Bowen http://creativecommons.org/licenses/by-nc-nd/4.0 2021-02-03 2021-02-03 11 1 10.29012/jpc.748 Manipulation Attacks in Local Differential Privacy https://journalprivacyconfidentiality.org/index.php/jpc/article/view/754 <p>Local differential privacy is a widely studied restriction on distributed algorithms that collect aggregates about sensitive user data, and is now deployed in several large systems. We initiate a systematic study of a fundamental limitation of locally differentially private protocols: they are highly vulnerable to adversarial manipulation.
While any algorithm can be manipulated by adversaries who lie about their inputs, we show that any noninteractive locally differentially private protocol can be manipulated to a much greater extent: when the privacy level is high, or the domain size is large, a small fraction of users in the protocol can completely obscure the distribution of the honest users' inputs. We also construct protocols that are optimally robust to manipulation for a variety of common tasks in local differential privacy. Finally, we give simple experiments validating our theoretical results and demonstrating that protocols that are optimal without manipulation can have dramatically different levels of robustness to manipulation. Our results suggest caution when deploying local differential privacy and reinforce the importance of efficient cryptographic techniques for the distributed emulation of centrally differentially private mechanisms.</p> Albert Cheu Adam Smith Jonathan Ullman Copyright (c) 2021 Albert Cheu, Adam Smith, Jonathan Ullman http://creativecommons.org/licenses/by-nc-nd/4.0 2021-02-03 2021-02-03 11 1 10.29012/jpc.754 Efficiently Estimating Erdos-Renyi Graphs with Node Differential Privacy https://journalprivacyconfidentiality.org/index.php/jpc/article/view/745 <p>We give a simple, computationally efficient, and node-differentially-private algorithm for estimating the parameter of an Erdos-Renyi graph (that is, estimating p in a G(n,p)) with near-optimal accuracy. Our algorithm nearly matches the information-theoretically optimal exponential-time algorithm for the same problem due to Borgs et al. (FOCS 2018).
More generally, we give an optimal, computationally efficient, private algorithm for estimating the edge-density of any graph whose degree distribution is concentrated in a small interval.</p> Adam Sealfon Jonathan Ullman Copyright (c) 2021 Adam Sealfon, Jonathan Ullman http://creativecommons.org/licenses/by-nc-nd/4.0 2021-02-03 2021-02-03 11 1 10.29012/jpc.745
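For readers new to the problem in the last abstract, the standard baseline that node-private algorithms improve upon can be sketched in a few lines. The snippet below is an illustrative sketch only, not the Sealfon-Ullman algorithm: it estimates p in a G(n,p) graph under the weaker <em>edge-level</em> differential privacy guarantee by adding Laplace noise to the edge count (sensitivity 1, since changing one edge changes the count by 1). The function and parameter names are our own.

```python
import random


def dp_edge_density(adj, epsilon, seed=None):
    """Estimate p of a G(n, p) graph under edge-level differential privacy.

    Illustrative baseline only: edge-level privacy, not the stronger
    node-level guarantee studied in the paper.

    adj: symmetric 0/1 adjacency matrix (list of lists)
    epsilon: privacy parameter
    """
    rng = random.Random(seed)
    n = len(adj)
    # Count each undirected edge once.
    edges = sum(adj[i][j] for i in range(n) for j in range(i + 1, n))
    # Changing a single edge changes the count by exactly 1, so the
    # sensitivity is 1 and Laplace(1/epsilon) noise suffices.  The
    # difference of two iid Exponential(epsilon) draws is Laplace(1/epsilon).
    noise = rng.expovariate(epsilon) - rng.expovariate(epsilon)
    return (edges + noise) / (n * (n - 1) / 2)
```

Under node-level privacy, by contrast, one node change can alter up to n-1 edges, which is why the paper's concentrated-degree-based approach is needed to avoid noise that swamps the signal.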