Journal of Privacy and Confidentiality https://journalprivacyconfidentiality.org/index.php/jpc <p>The <em>Journal of Privacy and Confidentiality</em>&nbsp;is an open-access multi-disciplinary journal whose purpose is to facilitate the coalescence of research methodologies and activities in the areas of privacy, confidentiality, and disclosure limitation. The JPC seeks to publish a wide range of research and review papers, not only from academia, but also from government (especially official statistical agencies) and industry, and to serve as a forum for exchange of views, discussion, and news.</p> en-US <p>Copyright is retained by the authors. By submitting to this journal, the author(s) license the article under the <a href="https://creativecommons.org/licenses/by-nc-nd/4.0/">Creative Commons License – Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)</a>, unless choosing a more lenient license (for instance, public domain). For situations not allowed under CC BY-NC-ND, short sections of text, not to exceed two paragraphs, may be quoted without explicit permission provided that full credit, including © notice, is given to the source.</p> <p>Authors of articles published by the journal grant the journal the right to store the articles in its databases for an unlimited period of time and to distribute and reproduce the articles electronically.</p> managing-editor@journalprivacyconfidentiality.org (Rachel Cummings) support@journalprivacyconfidentiality.org (Sara Brooks) Thu, 30 Apr 2026 22:29:53 +0000 OJS 3.3.0.13 http://blogs.law.harvard.edu/tech/rss 60 Maximizing Utility for Vector-weighted Pseudo Posterior Mechanisms under Differential Privacy https://journalprivacyconfidentiality.org/index.php/jpc/article/view/979 <p>The risk-weighted pseudo posterior mechanism provides a practical framework for privacy protection that takes advantage of the availability of posterior sampling approaches, creating a synthesizer for microdata 
dissemination. The flexibility of the approach lies in the user-specification of the individualized risks and the mapping of risks to weights. However, this raises the question of which weighting approach is optimal. In this work, we develop a recursive approach to algorithmically induce an optimal weighting strategy given an initial suboptimal strategy. This 're-weighting' strategy applies to any vector-weighted pseudo posterior mechanism under which a vector of observation-indexed weights is used to downweight likelihood contributions for high disclosure risk records. We demonstrate our method on two different vector-weighted schemes that target high-risk records (one close to optimal and one not). Our new method for constructing record-indexed downweighting maximizes the data utility under any privacy budget for the vector-weighted synthesizers by adjusting the by-record weights, such that their individual risk contributions (e.g., Lipschitz bounds) approach the risk bound for the entire database. Our method achieves an epsilon-asymptotic differential privacy (aDP) guarantee, globally, over the space of databases. We illustrate our methods using simulated highly skewed count data and compare the results to a scalar-weighted synthesizer under the popular Exponential Mechanism (EM). We also apply our methods to a sample of the Survey of Doctorate Recipients and demonstrate their practicality.</p> Jingchen Hu, Terrance Savitsky, Matthew Williams Copyright (c) 2026 Monika Hu, Terrance Savitsky, Matthew Williams https://creativecommons.org/licenses/by-nc-nd/4.0 https://journalprivacyconfidentiality.org/index.php/jpc/article/view/979 Thu, 30 Apr 2026 00:00:00 +0000 Slowly Scaling Per-Record Differential Privacy https://journalprivacyconfidentiality.org/index.php/jpc/article/view/992 <p>We develop formal privacy mechanisms for releasing statistics from data with many outlying values, such as income data. 
These mechanisms ensure that a per-record differential privacy guarantee degrades only slowly with the protected records’ influence on the statistics being released.</p> <p>Records with greater influence -- those whose addition or deletion would change the released statistics more -- typically suffer greater privacy loss. The per-record differential privacy framework quantifies these record-specific privacy guarantees, but existing mechanisms let these guarantees degrade rapidly (linearly or quadratically) with influence. While this may be acceptable in cases with some moderately influential records, it results in unacceptably high privacy losses when records’ influence varies widely, as is common in economic data.</p> <p>We develop mechanisms with privacy guarantees that instead degrade as slowly as logarithmically with influence. These mechanisms allow for the accurate, unbiased release of statistics, while providing meaningful protection for highly influential records. As an example, we consider the private release of sums of unbounded establishment data such as payroll, where our mechanisms extend meaningful privacy protection even to very large establishments. We evaluate these mechanisms empirically and demonstrate their utility on simulated employment data and the U.S. Department of Agriculture's Cattle Inventory Survey.</p> Brian Finley, Anthony Caruso, Justin Doty, Ashwin Machanavajjhala, Mikaela Meyer, David Pujol, William Sexton, Zachary Terner Copyright (c) 2026 Brian Finley, Anthony Caruso, Justin Doty, Ashwin Machanavajjhala, Mikaela Meyer, David Pujol, William Sexton, Zachary Terner https://creativecommons.org/licenses/by-nc-nd/4.0 https://journalprivacyconfidentiality.org/index.php/jpc/article/view/992 Thu, 30 Apr 2026 00:00:00 +0000 Corrigendum to "Differentially Private Set Union" https://journalprivacyconfidentiality.org/index.php/jpc/article/view/1053 <p>In the published version of "Differentially Private Set Union" by Gopi et al.,&nbsp;the authors identified an error that affects Proposition 5.1 and Algorithm 8 of the published version.</p> Copyright (c) 2026 https://creativecommons.org/licenses/by-nc-nd/4.0 https://journalprivacyconfidentiality.org/index.php/jpc/article/view/1053 Thu, 30 Apr 2026 00:00:00 +0000