The Fienberg Problem: How to Allow Human Interactive Data Analysis in the Age of Differential Privacy
Abstract
Differential Privacy (DP) is a popular technology for privacy-preserving analysis of large datasets. DP is powerful, but it requires that the analyst interact with the data only through a special interface; in particular, the analyst never sees the raw data, an uncomfortable situation for anyone trained in classical statistical data analysis. In this note we discuss the (overly) simple problem of allowing a trusted analyst to choose an "interesting" statistic for public release (the actual computation of the chosen statistic will be carried out in a differentially private way).
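To make concrete what "carried out in a differentially private way" typically means, the sketch below shows the standard Laplace mechanism releasing a single chosen statistic. This is an illustrative example only, not the construction discussed in the paper; the helper name laplace_release and the choice of a bounded mean statistic are assumptions made here for illustration.

```python
import numpy as np

def laplace_release(statistic_value, sensitivity, epsilon, rng=None):
    """Release one statistic via the standard Laplace mechanism (illustrative sketch).

    statistic_value : the analyst's chosen statistic, computed on the raw data
                      behind the DP interface (the analyst never sees the data).
    sensitivity     : upper bound on how much the statistic can change when one
                      individual's record is added or removed.
    epsilon         : privacy-loss parameter of (epsilon, 0)-differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return statistic_value + noise

# Example: the mean of n records bounded in [0, 1] has sensitivity 1/n.
data = np.random.default_rng(0).uniform(size=1000)
noisy_mean = laplace_release(data.mean(), sensitivity=1.0 / len(data), epsilon=0.5)
```

Note that the noise scale depends only on the statistic's sensitivity and the privacy budget epsilon, so the analyst's choice of statistic is what determines how accurate the released value can be.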
Article Details
Copyright is retained by the authors. By submitting to this journal, the author(s) license the article under the Creative Commons License – Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0), unless choosing a more lenient license (for instance, public domain). For situations not allowed under CC BY-NC-ND, short sections of text, not to exceed two paragraphs, may be quoted without explicit permission provided that full credit, including © notice, is given to the source.
Authors of articles published by the journal grant the journal the right to store the articles in its databases for an unlimited period of time and to distribute and reproduce the articles electronically.