Abstract

Generally, collaborating parties who hold private data may conduct privacy-preserving data analysis (PPDA) tasks to learn beneficial data models in a distributed manner. The field of privacy has seen rapid advances in recent years because of increases in the ability to store data. In particular, recent advances in the data mining field have led to increased concerns about privacy. While the topic of privacy has traditionally been studied in the context of cryptography and information hiding, the recent emphasis on data mining has led to renewed interest in the field. In this paper, we introduce the topic of privacy-preserving data mining. It is often highly valuable for organizations to have their data analyzed by external agents. However, any program that computes on potentially sensitive data risks leaking information through its output. Differential privacy provides a theoretical framework for processing data while protecting the privacy of individual records in a dataset. Unfortunately, it has seen limited adoption because of the loss in output accuracy, the difficulty of making programs differentially private, and the lack of mechanisms to describe the privacy budget in a programmer's utilitarian terms. In this paper, we therefore propose an approach for sharing private data securely.
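
The abstract refers to differential privacy and its privacy budget without naming a concrete mechanism. As a minimal, hypothetical sketch (not the paper's proposed method), the Python snippet below shows the standard Laplace mechanism for a numeric query, where the budget parameter epsilon governs the trade-off between output accuracy and privacy. The function name laplace_mechanism and the toy dataset are illustrative assumptions, not artifacts of the paper.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Return an epsilon-differentially private estimate of true_value.

    Adds Laplace noise with scale sensitivity / epsilon, the standard
    mechanism for numeric queries: smaller epsilon means a tighter
    privacy budget and therefore noisier (less accurate) output.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# Hypothetical private dataset: ages of individuals.
ages = [34, 29, 41, 56, 23]

# A count query has sensitivity 1: adding or removing one record
# changes the result by at most 1.
private_count = laplace_mechanism(len(ages), sensitivity=1.0, epsilon=0.5)
print(f"Noisy count: {private_count:.2f}")
```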
