Abstract

In the approximate inference of Bayesian neural networks (BNNs), the variational posterior is often assumed to take an exponential family form (such as a Gaussian). We propose instead to use mixtures of exponential family distributions to obtain a more flexible approximate posterior. We introduce a novel reparameterization trick that makes mixture distributions amenable to reparameterization within alpha-divergence minimization. Our method extends to various neural architectures, such as fully connected and convolutional neural networks. A time-complexity analysis shows that it is less computationally expensive than normalizing flows, and it outperforms several related state-of-the-art techniques in experiments on uncertainty estimation and robustness against adversarial examples.
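To make the setting concrete, the sketch below shows the standard way to draw samples from a Gaussian-mixture variational posterior: a component is sampled from the categorical mixing distribution, and the usual Gaussian reparameterization z = mu + sigma * eps is applied within that component. This is only an illustrative baseline, not the paper's proposed trick (whose details are not given in the abstract); the function name and parameters are hypothetical.

```python
import torch

def sample_mixture_gaussian_posterior(logits, means, log_stds, n_samples=1):
    """Draw samples from a Gaussian-mixture variational posterior.

    logits:   (K,)   unnormalized mixture weights
    means:    (K, D) component means
    log_stds: (K, D) component log standard deviations

    Gradients flow to the parameters of the selected component via the
    per-component reparameterization, but not through the discrete
    component choice itself; handling that discrete step is what a
    mixture-specific reparameterization trick would need to address.
    """
    K, D = means.shape
    weights = torch.softmax(logits, dim=0)                           # (K,)
    comp = torch.multinomial(weights, n_samples, replacement=True)   # (n_samples,)
    mu = means[comp]                                                 # (n_samples, D)
    sigma = log_stds[comp].exp()                                     # (n_samples, D)
    eps = torch.randn(n_samples, D)                                  # standard normal noise
    return mu + sigma * eps

# Example usage (hypothetical dimensions): 3 components, 10-dimensional weights
logits = torch.zeros(3, requires_grad=True)
means = torch.randn(3, 10, requires_grad=True)
log_stds = torch.zeros(3, 10, requires_grad=True)
z = sample_mixture_gaussian_posterior(logits, means, log_stds, n_samples=5)
```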
