Journal of Privacy and Confidentiality
https://journalprivacyconfidentiality.org/index.php/jpc
<p>The <em>Journal of Privacy and Confidentiality</em> is an open-access multidisciplinary journal whose purpose is to facilitate the coalescence of research methodologies and activities in the areas of privacy, confidentiality, and disclosure limitation. The JPC seeks to publish a wide range of research and review papers, not only from academia, but also from government (especially official statistical agencies) and industry, and to serve as a forum for exchange of views, discussion, and news.</p>
Cornell University, ILR School
en-US
Journal of Privacy and Confidentiality
2575-8527
<p>Copyright is retained by the authors. By submitting to this journal, the author(s) license the article under the <a href="https://creativecommons.org/licenses/by-nc-nd/4.0/">Creative Commons License – Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)</a>, unless choosing a more lenient license (for instance, public domain). Furthermore, the authors of articles published by the journal grant the journal the right to store the articles in its databases for an unlimited period of time and to distribute and reproduce the articles electronically.</p> <p>Short sections of text, not to exceed two paragraphs, may be quoted without explicit permission provided that full credit, including © notice, is given to the source.</p>

Special Issue on the Theory and Practice of Differential Privacy 2016
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/717
Marco Gaboardi
http://creativecommons.org/licenses/by-nc-nd/4.0
2019-03-30
2019-03-30
9 1
10.29012/jpc.717

Make Up Your Mind: The Price of Online Queries in Differential Privacy
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/655
<p>We consider the problem of answering queries about a sensitive dataset subject to differential privacy. The queries may be chosen adversarially from a larger set $Q$ of allowable queries in one of three ways, which we list in order from easiest to hardest to answer:</p> <p><em>Offline:</em> The queries are chosen all at once and the differentially private mechanism answers the queries in a single batch.</p> <p><em>Online:</em> The queries are chosen all at once, but the mechanism only receives the queries in a streaming fashion and must answer each query before seeing the next query.</p> <p><em>Adaptive:</em> The queries are chosen one at a time and the mechanism must answer each query before the next query is chosen. In particular, each query may depend on the answers given to previous queries.</p> <p>Many differentially private mechanisms are just as efficient in the adaptive model as they are in the offline model. Meanwhile, most lower bounds for differential privacy hold in the offline setting. This suggests that the three models may be equivalent.</p> <p>We prove that these models are all, in fact, distinct. Specifically, we show that there is a family of statistical queries such that exponentially more queries from this family can be answered in the offline model than in the online model. We also exhibit a family of search queries such that exponentially more queries from this family can be answered in the online model than in the adaptive model. We also investigate whether such separations might hold for simple queries like threshold queries over the real line.</p>
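The adaptive model described in the abstract can be illustrated with a toy Laplace mechanism for counting queries; this is a generic sketch under basic composition (the class name, budget-splitting strategy, and example data are ours, not the paper's construction):

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling of a centered Laplace variate (stdlib only).
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

class AdaptiveCountingMechanism:
    """Toy Laplace mechanism in the *adaptive* model: each counting
    query may depend on earlier answers, so the total budget epsilon
    is split evenly across an announced maximum number of queries
    (basic composition)."""

    def __init__(self, data, epsilon, max_queries):
        self.data = data
        self.per_query_eps = epsilon / max_queries
        self.remaining = max_queries

    def answer(self, predicate):
        if self.remaining == 0:
            raise RuntimeError("privacy budget exhausted")
        self.remaining -= 1
        true_count = sum(1 for row in self.data if predicate(row))
        # Counting queries have sensitivity 1, so the noise scale is 1/eps.
        return true_count + laplace_noise(1.0 / self.per_query_eps)

random.seed(0)
mech = AdaptiveCountingMechanism([12, 45, 7, 30, 22], epsilon=1.0, max_queries=3)
a1 = mech.answer(lambda x: x > 10)      # first query
a2 = mech.answer(lambda x: x > a1 / 2)  # depends on a1: only legal adaptively
```

Note how the second query's predicate is built from the first answer, which is exactly what the offline and online models forbid.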
Mark Bun
Thomas Steinke
Jonathan Ullman
http://creativecommons.org/licenses/by-nc-nd/4.0
2019-03-24
2019-03-24
9 1
10.29012/jpc.655

Concentration Bounds for High Sensitivity Functions Through Differential Privacy
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/658
<p>A new line of work demonstrates how differential privacy can be used as a mathematical tool for guaranteeing generalization in adaptive data analysis. Specifically, if a differentially private analysis is applied on a sample S of i.i.d. examples to select a low-sensitivity function f, then w.h.p. f(S) is close to its expectation, even though f is being chosen adaptively, i.e., based on the data.</p> <p>Very recently, Steinke and Ullman observed that these generalization guarantees can be used for proving concentration bounds in the non-adaptive setting, where the low-sensitivity function is fixed beforehand. In particular, they obtain alternative proofs for classical concentration bounds for low-sensitivity functions, such as the Chernoff bound and McDiarmid's Inequality. In this work, we extend this connection between differential privacy and concentration bounds, and show that differential privacy can be used to prove concentration of high-sensitivity functions.</p>
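As a concrete illustration of the low-sensitivity notion this abstract relies on, here is a minimal brute-force probe; the helper name and the dataset-specific (rather than worst-case) formulation are ours, purely for illustration:

```python
def empirical_sensitivity(f, dataset, domain):
    # Brute-force check of how much f can move when a single record of
    # this particular dataset is replaced by any value from `domain`
    # (a dataset-specific probe, not the worst-case global sensitivity).
    base = f(dataset)
    worst = 0.0
    for i in range(len(dataset)):
        for v in domain:
            neighbour = dataset[:i] + [v] + dataset[i + 1:]
            worst = max(worst, abs(f(neighbour) - base))
    return worst

mean = lambda xs: sum(xs) / len(xs)
# The mean of n values in [0, 1] has global sensitivity 1/n = 0.25 here;
# for this particular dataset the largest move is (1.0 - 0.2) / 4 = 0.2.
s = empirical_sensitivity(mean, [0.2, 0.2, 0.2, 0.2], domain=[0.0, 1.0])
```

The mean is the canonical low-sensitivity example; a high-sensitivity function (say, the maximum) would score close to the full range under the same probe.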
Uri Stemmer
Kobbi Nissim
http://creativecommons.org/licenses/by-nc-nd/4.0
2019-03-30
2019-03-30
9 1
10.29012/jpc.658

Differentially Private Confidence Intervals for Empirical Risk Minimization
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/660
<p>The process of data mining with differential privacy produces results that are affected by two types of noise: sampling noise due to data collection and privacy noise that is designed to prevent the reconstruction of sensitive information. In this paper, we consider the problem of designing confidence intervals for the parameters of a variety of differentially private machine learning models. The algorithms can provide confidence intervals that satisfy differential privacy (as well as the more recently proposed concentrated differential privacy) and can be used with existing differentially private mechanisms that train models using objective perturbation and output perturbation.</p>
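Output perturbation, one of the two training schemes the abstract mentions, can be sketched in miniature for the simplest possible "model", a clamped empirical mean; the function name, clamping bounds, and parameters below are illustrative assumptions, not the paper's algorithms:

```python
import math
import random

def perturbed_mean(data, epsilon, lower, upper):
    """Output perturbation in miniature: compute the non-private
    estimate (here just a clamped mean), then add Laplace noise
    calibrated to the estimate's sensitivity, (upper - lower) / n."""
    n = len(data)
    clamped = [min(max(x, lower), upper) for x in data]
    estimate = sum(clamped) / n
    scale = (upper - lower) / (n * epsilon)
    u = random.uniform(-0.5, 0.5)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return estimate + noise

random.seed(1)
# With a generous budget the added noise is tiny, so the private answer
# stays close to the true mean of 0.5.
answer = perturbed_mean([0.4, 0.5, 0.6], epsilon=100.0, lower=0.0, upper=1.0)
```

A confidence interval for such an estimate must account for both noise sources the abstract names: the sampling noise in `estimate` and the privacy noise in `noise`.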
Yue Wang
Daniel Kifer
Jaewoo Lee
http://creativecommons.org/licenses/by-nc-nd/4.0
2019-03-30
2019-03-30
9 1
10.29012/jpc.660

Differentially Private Ordinary Least Squares
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/654
<p>Linear regression is one of the most prevalent techniques in machine learning; however, it is also common to use linear regression for its <em>explanatory</em> capabilities rather than label prediction. Ordinary Least Squares (OLS) is often used in statistics to establish a correlation between an attribute (e.g. gender) and a label (e.g. income) in the presence of other (potentially correlated) features. OLS assumes a particular model that randomly generates the data, and derives <em>t</em>-values representing the likelihood of each real value being the true correlation. Using <em>t</em>-values, OLS can release a <em>confidence interval</em>, which is an interval on the reals that is likely to contain the true correlation; and when this interval does not intersect the origin, we can <em>reject the null hypothesis</em>, as it is likely that the true correlation is nonzero.<br>Our work aims at achieving similar guarantees on data under differentially private estimators. First, we show that for well-spread data, the Gaussian Johnson-Lindenstrauss Transform (JLT) gives a very good approximation of <em>t</em>-values; secondly, when JLT approximates Ridge regression (linear regression with <em>l<sub>2</sub></em>-regularization) we derive, under certain conditions, confidence intervals using the projected data; lastly, we derive, under different conditions, confidence intervals for the "Analyze Gauss" algorithm of Dwork et al. (STOC 2014).</p>
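The Gaussian JLT at the heart of the abstract is just multiplication by a random Gaussian matrix; a pure-Python sketch (function name and example matrix are ours, and a real implementation would use a linear-algebra library):

```python
import math
import random

def gaussian_jl_project(X, k, seed=0):
    """Multiply the n x d data matrix X by a k x n matrix of i.i.d.
    N(0, 1/k) entries. Column norms and inner products (the quantities
    OLS statistics are built from) are preserved in expectation."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    R = [[rng.gauss(0.0, 1.0 / math.sqrt(k)) for _ in range(n)]
         for _ in range(k)]
    # Return R @ X: the k projected rows.
    return [[sum(R[i][t] * X[t][j] for t in range(n)) for j in range(d)]
            for i in range(k)]

X = [[1.0, 2.0], [0.0, 1.0], [1.0, 0.0], [2.0, 1.0]]  # n = 4 rows, d = 2 cols
Y = gaussian_jl_project(X, k=3)
```

The paper's contribution is showing when statistics computed from a projection like `Y` still support valid <em>t</em>-values and confidence intervals, with the projection itself providing the privacy.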
Or Sheffet
http://creativecommons.org/licenses/by-nc-nd/4.0
2019-03-30
2019-03-30
9 1
10.29012/jpc.654

Per-instance Differential Privacy
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/662
<p>We consider a refinement of differential privacy: per-instance differential privacy (pDP), which captures the privacy of a specific individual with respect to a fixed data set. We show that this is a strict generalization of the standard DP and inherits all its desirable properties, e.g., composition, invariance to side information and closedness to post-processing, except that they all hold for every instance separately. When the data is drawn from a distribution, we show that per-instance DP implies generalization. Moreover, we provide explicit calculations of the per-instance DP for the output perturbation on a class of smooth learning problems. The result reveals an interesting and intuitive fact that an individual has stronger privacy if he/she has a small "leverage score" with respect to the data set and if he/she can be predicted more accurately using the leave-one-out data set. Simulation shows a several orders-of-magnitude more favorable privacy and utility trade-off when we consider the privacy of only the users in the data set. In a case study on differentially private linear regression, we provide a novel analysis of the One-Posterior-Sample (OPS) estimator and show that when the data set is well-conditioned it provides $(\epsilon,\delta)$-pDP for any target individuals and matches the exact lower bound up to a $1+\tilde{O}(n^{-1}\epsilon^{-2})$ multiplicative factor. We also demonstrate how we can use a "pDP to DP conversion" step to design AdaOPS, which uses adaptive regularization to achieve the same results with $(\epsilon,\delta)$-DP.</p>
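The leverage score driving the abstract's intuition reduces to a one-line formula in the simplest regression setting; a sketch for one-dimensional least squares through the origin (the helper name and example data are ours, and the paper works with the general multivariate case):

```python
def leverage_scores_1d(xs):
    """Leverage scores for one-dimensional least squares through the
    origin: h_i = x_i**2 / sum_j x_j**2. They sum to the number of
    model parameters (here 1), and an outlying point gets a large
    score, which per the abstract corresponds to weaker pDP."""
    total = sum(x * x for x in xs)
    return [x * x / total for x in xs]

scores = leverage_scores_1d([1.0, 1.0, 1.0, 3.0])
# The outlying point carries 9/12 = 0.75 of the total leverage.
```

The three bulk points each get 1/12 while the outlier gets 9/12, matching the intuition that typical, well-predicted individuals enjoy stronger per-instance privacy.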
Yu-Xiang Wang
http://creativecommons.org/licenses/by-nc-nd/4.0
2019-03-30
2019-03-30
9 1
10.29012/jpc.662

Program for TPDP 2016
https://journalprivacyconfidentiality.org/index.php/jpc/article/view/699
<p>The Theory and Practice of Differential Privacy workshop (TPDP 2016) was held in New York City, NY, USA on 23 June 2016 as part of ICML 2016. This is the program.</p>
Gilles Barthe
Christos Dimitrakakis
Marco Gaboardi
Andreas Haeberlen
Aaron Roth
Aleksandra B. Slavković
http://creativecommons.org/licenses/by/4.0
2019-03-24
2019-03-24
9 1
10.29012/jpc.699