Abstract
With the growth of data release and data mining, protecting sensitive information in data from leakage has attracted considerable attention in the information security field. Differential privacy is an excellent paradigm for defending against adversaries who attempt to infer the sensitive information of individuals. However, existing work shows that the accuracy of differentially private regression models is often unsatisfactory, since the amount of noise added is uncertain. In this paper, we present PrivR, a novel differentially private regression analysis framework based on relevance. PrivR transforms the objective function into polynomial form and perturbs the polynomial coefficients according to the magnitude of the relevance between the input features and the model output. Specifically, we add less noise to coefficients of the polynomial representation of the objective function that involve strongly relevant features, and vice versa. Experiments on the Adult dataset and the Banking dataset demonstrate that PrivR not only prevents the leakage of data privacy effectively but also retains the utility of the model.
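To make the idea concrete, the sketch below illustrates relevance-weighted coefficient perturbation for linear regression in the spirit described above. It is a minimal illustration, not the authors' implementation: the relevance measure (absolute Pearson correlation), the Laplace noise, the per-coefficient sensitivity bound of 1, and the budget-splitting rule are all assumptions for the example, and the exact privacy accounting of PrivR is not reproduced here.

```python
import numpy as np

def relevance_scores(X, y):
    # Assumed relevance measure: absolute correlation between each feature
    # and the target, normalized to sum to 1 (the paper's measure may differ).
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    return scores / scores.sum()

def relevance_weighted_private_regression(X, y, epsilon, seed=0):
    """Sketch: perturb the polynomial coefficients of the least-squares
    objective, giving strongly relevant features a larger share of the
    privacy budget (hence less noise). Assumes features and targets are
    rescaled so each coefficient's sensitivity is bounded by 1."""
    n, d = X.shape
    rel = relevance_scores(X, y)

    # Polynomial form of the objective in the weights w:
    #   sum_i (y_i - x_i^T w)^2 = w^T (X^T X) w - 2 (X^T y)^T w + const
    quad = X.T @ X          # degree-2 coefficients
    lin = -2.0 * (X.T @ y)  # degree-1 coefficients

    rng = np.random.default_rng(seed)
    for j in range(d):
        # Coefficients involving a strongly relevant feature j get a larger
        # epsilon share, so their Laplace noise scale is smaller.
        eps_j = epsilon * rel[j]
        scale = 1.0 / eps_j  # assumed per-coefficient sensitivity of 1
        lin[j] += rng.laplace(0.0, scale)
        quad[j, :] += rng.laplace(0.0, scale, size=d)

    quad = (quad + quad.T) / 2.0  # re-symmetrize after perturbation
    # Minimize the perturbed polynomial: gradient 2*quad*w + lin = 0
    w = np.linalg.solve(2.0 * quad + 1e-6 * np.eye(d), -lin)
    return w
```

The noiseless part of the computation is the standard polynomial expansion of the squared-error objective; only the noise allocation step reflects the relevance-based idea, and a faithful implementation would follow the paper's own sensitivity analysis and budget composition.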