Abstract

Empirical economic research crucially relies on highly sensitive individual datasets. At the same time, the increasing availability of public individual-level data from social networks, public government records, and directories makes it possible for adversaries to de-identify anonymized records in sensitive research datasets. This growing disclosure risk has incentivized large data curators, most notably the US Census Bureau and several large companies, including Apple, Facebook, and Microsoft, to look for algorithmic solutions that provide formal non-disclosure guarantees for their secure data. The most commonly accepted formal data-security concept in the computer science community is differential privacy. Differential privacy restricts the researcher's interaction with the data by allowing her to issue only queries that evaluate functions of the data. The differential privacy mechanism then replaces the actual outcome of the query with a randomized outcome, with the amount of randomness determined by the sensitivity of the outcome to individual observations in the data. While differential privacy does provide formal data-security guarantees, its impact on the identification of empirical economic models, as well as on the performance of estimators in nonlinear empirical econometric models, has not been sufficiently studied. Since privacy-protection mechanisms are inherently finite-sample procedures, we define the notion of identifiability of the parameter of interest as a property of the limit of experiments. It is naturally characterized by concepts from random set theory and is linked to the asymptotic behavior in measure of differentially private estimators. We demonstrate that particular instances of regression discontinuity design and average treatment effect estimation may be problematic for inference under differential privacy.
Under differential privacy, their estimators can be guaranteed only to converge weakly, with the asymptotic limit remaining random; thus, they may not be estimated consistently. This result is clearly supported by our simulation evidence. Our analysis suggests that many other estimators that rely on nuisance parameters may have similar properties under the requirement of differential privacy.
