Abstract

The Helmholtz machine (HM) is a classic hierarchical probabilistic model for learning the probability distribution of perceptual data, and the wake-sleep (WS) algorithm is widely used to train it. To prevent an attacker from recovering training-set data from a trained HM, we introduce a Gaussian mechanism into the WS algorithm, yielding a differentially private wake-sleep algorithm (DP-WS), and use DP-WS to train an HM with privacy protection, which we call DP-HM. We provide a rigorous proof of the privacy guarantee. In addition, our experiments on the MNIST and BioID face datasets show that the DP-HM model can be trained under a modest privacy budget while retaining acceptable model quality.
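The sketch below illustrates, in general terms, how a Gaussian mechanism is typically grafted onto a gradient-based update such as the wake-phase update of the WS algorithm: per-example gradients are clipped in L2 norm and Gaussian noise calibrated to that clipping bound is added before the parameter step. It is not the authors' implementation; the clipping bound `C`, noise multiplier `sigma`, and the per-example gradients passed in are illustrative assumptions.

```python
# Minimal sketch of a DP gradient update via the Gaussian mechanism
# (assumed structure, not taken from the paper's DP-WS algorithm).
import numpy as np

def dp_gradient_update(params, per_example_grads, lr=0.01, C=1.0, sigma=1.0,
                       rng=None):
    """Clip each per-example gradient to L2 norm C, average, add Gaussian noise."""
    rng = rng or np.random.default_rng()
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, C / (norm + 1e-12)))  # L2 clipping
    batch = len(clipped)
    avg_grad = np.mean(clipped, axis=0)
    # Gaussian mechanism: noise scale proportional to sensitivity C / batch
    noise = rng.normal(0.0, sigma * C / batch, size=avg_grad.shape)
    return params - lr * (avg_grad + noise)
```

The privacy cost of repeated updates of this form is then accounted for over training to stay within the overall privacy budget.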
