Abstract

With the growth of data release and data mining, protecting sensitive information in published data from leakage has attracted considerable attention in the information security field. Differential privacy is a well-established paradigm for defending against adversaries who attempt to infer the sensitive information of individuals. However, existing work shows that the accuracy of differentially private regression models is often unsatisfactory, since the amount of noise added is uncertain. In this paper, we present PrivR, a novel differentially private regression analysis framework based on relevance. PrivR transforms the objective function into a polynomial form and perturbs the polynomial coefficients according to the magnitude of relevance between the input features and the model output: coefficients of terms that involve strongly relevant features receive less noise, and vice versa. Experiments on the Adult and Banking datasets demonstrate that PrivR not only effectively prevents the leakage of data privacy but also retains the utility of the model.
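
The following is a minimal Python sketch of the relevance-weighted coefficient perturbation idea described above. It is illustrative only: the function name, the proportional privacy-budget split, and the per-coefficient sensitivities are assumptions, not the paper's exact allocation rule. The sketch relies on sequential composition, so the per-coefficient budgets sum to the total budget epsilon.

```python
import numpy as np

def relevance_weighted_perturbation(coefficients, relevance, sensitivities,
                                    epsilon, rng=None):
    """Perturb polynomial coefficients of an objective function with Laplace
    noise, giving a larger share of the privacy budget (and hence less noise)
    to coefficients whose features are strongly relevant to the model output.

    coefficients  : polynomial coefficients, one per monomial term
    relevance     : non-negative relevance score for each term's feature(s)
    sensitivities : per-coefficient sensitivity (assumed known/bounded)
    epsilon       : total privacy budget, split across coefficients
    """
    rng = np.random.default_rng() if rng is None else rng
    relevance = np.asarray(relevance, dtype=float)
    sensitivities = np.asarray(sensitivities, dtype=float)

    # Illustrative allocation: budget proportional to relevance, so strongly
    # relevant terms get more epsilon and therefore a smaller Laplace scale.
    eps_per_coef = epsilon * relevance / relevance.sum()

    # Laplace mechanism per coefficient: scale b_i = Delta_i / eps_i.
    noise = rng.laplace(loc=0.0, scale=sensitivities / eps_per_coef)
    return np.asarray(coefficients, dtype=float) + noise
```

In this sketch, terms with near-zero relevance would receive almost no budget and thus very large noise; a practical scheme would presumably floor the allocation so that every coefficient retains a minimum share of epsilon.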
