Abstract

Using differential privacy to protect classification algorithms has become an active research topic in data mining. In this paper, we analyze the defects of the differentially private decision tree Maxtree and propose an improved model, DPtree. DPtree uses Fayyad's theorem to process continuous features quickly and adaptively adjusts the privacy budget according to the class distribution of samples in each leaf node. Moreover, to counter the inevitable loss of classification ability in differentially private decision trees, we propose an ensemble learning model for DPtree, namely En-DPtree. In the voting process of En-DPtree, we propose a multi-population quantum genetic algorithm, introducing immigration operators and elite groups to search for the optimal weights of the base classifiers. Experiments show that DPtree outperforms Maxtree, and that En-DPtree is consistently superior to other competitive algorithms.
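For intuition, the following is a minimal sketch of two standard ingredients referenced above: Fayyad's boundary-point theorem for enumerating candidate cuts on a continuous feature, and Laplace-noised class counts for labeling a leaf under a per-leaf budget. It is illustrative only; the function names, the fixed epsilon passed to the leaf, and the majority-vote rule are assumptions and do not reproduce the paper's adaptive budget allocation.

```python
import numpy as np


def fayyad_candidate_cuts(values, labels):
    """Candidate cut points for an entropy-based split on one feature.

    Fayyad's theorem: the cut point minimizing class entropy always lies on a
    boundary between adjacent (sorted) samples of different classes, so only
    those midpoints need to be evaluated.
    """
    order = np.argsort(values)
    v, y = np.asarray(values)[order], np.asarray(labels)[order]
    return [(v[i] + v[i + 1]) / 2.0
            for i in range(len(v) - 1) if y[i] != y[i + 1]]


def noisy_leaf_label(labels, classes, epsilon, rng=None):
    """Label a leaf by majority vote over Laplace-noised class counts.

    Adding Lap(1/epsilon) noise to each disjoint count (sensitivity 1)
    satisfies epsilon-differential privacy for this counting query.
    """
    rng = np.random.default_rng() if rng is None else rng
    counts = np.array([np.sum(np.asarray(labels) == c) for c in classes],
                      dtype=float)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=len(classes))
    return classes[int(np.argmax(noisy))]


# Toy usage: candidate cuts appear only at class boundaries.
values = [1.0, 1.5, 2.0, 3.2, 3.5]
labels = [0, 0, 1, 1, 0]
print(fayyad_candidate_cuts(values, labels))
print(noisy_leaf_label(labels, classes=np.array([0, 1]), epsilon=0.5))
```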
