Abstract

One of the challenges of implementing differential privacy is that the utility (usefulness) of the privatized data tends to diminish even as confidentiality is guaranteed. In such settings, excessive noise causes the data to lose statistical significance despite the strong confidentiality assured by differential privacy, which in turn makes the privatized data practically worthless to consumers of the published data. Moreover, researchers have noted that finding an equilibrium between privacy and utility requirements remains intractable, necessitating trade-offs. As a contribution, we propose a moving average filtering model for non-interactive differential privacy settings. In this model, various levels of differential privacy (DP) are applied to a data set, generating multiple privatized data sets. Each privatized data set is passed through a moving average filter, and the filtered privatized data sets that meet a set utility threshold are published. Preliminary results from this study show that adjusting the ε (epsilon) parameter in the differential privacy process and applying the moving average filter may produce better data utility while conserving privacy in non-interactive differential privacy settings.
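
The following is a minimal sketch of the pipeline the abstract describes, not the authors' implementation. It assumes the Laplace mechanism for privatization and a negative mean-squared-error utility score; the epsilon levels, window size, and utility threshold below are illustrative values chosen for the example, not taken from the study.

```python
import numpy as np

def privatize(data, epsilon, sensitivity=1.0):
    """Standard Laplace mechanism: add noise with scale sensitivity/epsilon."""
    scale = sensitivity / epsilon
    return data + np.random.laplace(loc=0.0, scale=scale, size=data.shape)

def moving_average(data, window=5):
    """Smooth a privatized series with a simple moving average filter."""
    kernel = np.ones(window) / window
    return np.convolve(data, kernel, mode="same")

def utility(original, candidate):
    """Illustrative utility score: negative mean squared error (higher is better)."""
    return -np.mean((original - candidate) ** 2)

# Apply several DP levels, filter each privatized data set, and publish
# only the filtered sets that meet the utility threshold (curator-side check).
original = np.sin(np.linspace(0, 4 * np.pi, 200))  # toy data set
threshold = -0.05                                  # hypothetical utility threshold
published = []
for eps in [0.1, 0.5, 1.0, 2.0]:                   # illustrative epsilon levels
    noisy = privatize(original, epsilon=eps)
    filtered = moving_average(noisy, window=7)
    if utility(original, filtered) >= threshold:
        published.append((eps, filtered))
```

Note that the utility check compares filtered output against the original data, so in this sketch it is performed by the data curator before release; only the filtered privatized data sets are published.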
