<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Chetty, Raj</dc:creator>
  <dc:creator>Friedman, John N.</dc:creator>
  <dc:date>2019-10-08</dc:date>
  <dc:description>We develop a simple method to reduce privacy loss when disclosing statistics such as OLS regression estimates based on samples with small numbers of observations. We focus on the case where the dataset can be broken into many groups ("cells") and one is interested in releasing statistics for one or more of these cells. Building on ideas from the differential privacy literature, we add noise to the statistic of interest in proportion to the statistic's maximum observed sensitivity, defined as the maximum change in the statistic from adding or removing a single observation across all the cells in the data. Intuitively, our approach permits the release of statistics in arbitrarily small samples by adding sufficient noise to the estimates to protect privacy. Although our method does not offer a formal privacy guarantee, it generally outperforms widely used methods of disclosure limitation such as count-based cell suppression, both in terms of privacy loss and statistical bias. We illustrate how the method can be implemented by discussing how it was used to release estimates of social mobility by Census tract in the Opportunity Atlas. We also provide a step-by-step guide and illustrative Stata code to implement our approach. Version of record for "A Practical Method to Reduce Privacy Loss when Disclosing Statistics Based on Small Samples" appears in the Journal of Privacy and Confidentiality, 2019, Vol. 9, Issue 2 (doi.org/10.29012/jpc.716). Based on https://github.com/Opportunitylab/Differential-Privacy/releases/tag/v1.0.</dc:description>
  <dc:description>We are grateful for funding from the Chan-Zuckerberg Initiative, the Bill and Melinda Gates Foundation, the Overdeck Foundation, and Harvard University.</dc:description>
  <dc:identifier>https://zenodo.org/record/3476957</dc:identifier>
  <dc:identifier>10.5281/zenodo.3476957</dc:identifier>
  <dc:identifier>oai:zenodo.org:3476957</dc:identifier>
  <dc:relation>url:https://github.com/journalprivacyconfidentiality/Differential-Privacy/tree/V1.0.716.1</dc:relation>
  <dc:relation>url:https://github.com/Opportunitylab/Differential-Privacy/releases/tag/v1.0</dc:relation>
  <dc:relation>doi:10.29012/jpc.716</dc:relation>
  <dc:relation>doi:10.5281/zenodo.3476956</dc:relation>
  <dc:relation>url:https://zenodo.org/communities/jpc</dc:relation>
  <dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
  <dc:rights>https://creativecommons.org/licenses/by/4.0/legalcode</dc:rights>
  <dc:source>Journal of Privacy and Confidentiality 9(2)</dc:source>
  <dc:subject>formal privacy</dc:subject>
  <dc:subject>privacy loss</dc:subject>
  <dc:subject>statistical bias</dc:subject>
  <dc:title>Replication Package for "A Practical Method to Reduce Privacy Loss when Disclosing Statistics Based on Small Samples"</dc:title>
  <dc:type>info:eu-repo/semantics/other</dc:type>
  <dc:type>software</dc:type>
</oai_dc:dc>
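The noise-infusion procedure summarized in the abstract — add noise to each cell's statistic, scaled to the statistic's maximum observed sensitivity across all cells — can be sketched in Python. This is a minimal illustration, not the authors' released Stata code: the function names and the `epsilon` parameterization are hypothetical, and sensitivity is approximated here by leave-one-out removal only (the paper defines it over both adding and removing an observation).

```python
import random
import statistics

def max_observed_sensitivity(cells, stat):
    """Largest leave-one-out change in `stat`, taken across all cells.
    Simplification: only removal of an observation is considered."""
    worst = 0.0
    for obs in cells.values():
        base = stat(obs)
        for i in range(len(obs)):
            loo = obs[:i] + obs[i + 1:]
            if loo:  # skip cells that would become empty
                worst = max(worst, abs(stat(loo) - base))
    return worst

def noisy_release(cells, stat, epsilon=1.0):
    """Release each cell's statistic plus Laplace noise whose scale is
    the maximum observed sensitivity divided by a privacy parameter."""
    scale = max_observed_sensitivity(cells, stat) / epsilon
    released = {}
    for name, obs in cells.items():
        # difference of two unit exponentials is Laplace(0, 1)
        noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
        released[name] = stat(obs) + noise
    return released

# Example: release cell means for two small cells.
cells = {"tract_a": [1.0, 2.0, 3.0], "tract_b": [10.0, 20.0]}
print(noisy_release(cells, statistics.mean, epsilon=1.0))
```

Because the same noise scale is applied to every cell, even cells with very few observations can be released; the smaller the cell, the more the noise dominates the estimate.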