Abstract

Using differential privacy to protect the privacy of classification algorithms has become a research hotspot in data mining. In this paper, we analyze the defects of the differentially private decision tree Maxtree and propose an improved model, DPtree. DPtree uses Fayyad's theorem to process continuous features quickly and adaptively adjusts the privacy budget according to the distribution of sample classes in the leaf nodes. Moreover, to counteract the inevitable loss of classification ability in differentially private decision trees, we propose an ensemble learning model for DPtree, namely En-DPtree. For the voting process of En-DPtree, we propose a multi-population quantum genetic algorithm that introduces immigration operators and elite groups to search for the optimal weights of the base classifiers. Experiments show that DPtree outperforms Maxtree and that En-DPtree is consistently superior to competing algorithms.
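
As background for the abstract's mention of spending a privacy budget at the leaf nodes, the sketch below shows the standard Laplace-mechanism labeling step commonly used in differentially private decision trees. It is illustrative only and does not reproduce DPtree's adaptive budget allocation; the function name, class labels, and budget value are assumptions.

```python
import numpy as np

def noisy_leaf_label(class_counts, epsilon, rng=None):
    """Pick a leaf label from class counts perturbed with Laplace noise.

    `class_counts` maps each class to the number of training samples of
    that class reaching the leaf; `epsilon` is the privacy budget spent
    at this leaf. Adding Laplace(1/epsilon) noise to each count before
    taking the argmax is the generic Laplace-mechanism step (each count
    has sensitivity 1), not the paper's specific DPtree mechanism.
    """
    rng = rng or np.random.default_rng()
    labels = list(class_counts.keys())
    counts = np.array([class_counts[c] for c in labels], dtype=float)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    return labels[int(np.argmax(noisy))]

# Example: a leaf reached by 30 positive and 5 negative samples, budget 0.5
print(noisy_leaf_label({"pos": 30, "neg": 5}, epsilon=0.5))
```

A smaller epsilon means stronger privacy but noisier counts, which is why accuracy degrades and why adaptive budget allocation across leaves, as the abstract describes, matters.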
