Abstract

In the Bayesian framework, the predictive distribution is obtained by averaging the likelihood over the posterior distribution of the parameters. When only a small amount of data is available, the parameter uncertainty is high, and the predictive distribution therefore yields more reliable results in applications such as classification and recognition. In previous work, we used the variational inference framework to approximate the posterior distribution of the parameters of the beta distribution by minimizing the Kullback-Leibler divergence between the approximating distribution and the true posterior. However, the predictive distribution of the beta distribution was approximated by a plug-in estimate based on the posterior mean, which ignores the parameter uncertainty. In this paper, we build on the factorized approximation introduced in the previous work and approximate the beta function by its first-order Taylor expansion. We then derive an upper bound of the predictive distribution using the local variational method. By minimizing this upper bound and normalizing the result, we approximate the predictive distribution by a probability density function in closed form. Experimental results show the accuracy and efficiency of the proposed approximation method.
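To make the contrast between the plug-in approximation and the full predictive distribution concrete, the sketch below compares the two numerically for a beta likelihood, assuming factorized gamma posteriors over the shape parameters in the spirit of the variational framework referenced above. The posterior parameter values and the Monte Carlo reference are illustrative assumptions only, not the closed-form approximation proposed in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical factorized gamma posteriors over the beta shape parameters,
# q(u) = Gamma(a_u, b_u) and q(v) = Gamma(a_v, b_v) (rate parameterization);
# the values below are illustrative, not taken from the paper.
a_u, b_u = 20.0, 10.0   # posterior mean of u is a_u / b_u = 2.0
a_v, b_v = 30.0, 6.0    # posterior mean of v is a_v / b_v = 5.0

x = np.linspace(0.01, 0.99, 99)

# Plug-in approximation: evaluate the beta pdf at the posterior means,
# ignoring the uncertainty in (u, v).
u_bar, v_bar = a_u / b_u, a_v / b_v
plug_in = stats.beta.pdf(x, u_bar, v_bar)

# Monte Carlo reference for the true predictive distribution:
# average the beta pdf over samples drawn from the approximate posterior.
u_s = rng.gamma(a_u, 1.0 / b_u, size=5000)
v_s = rng.gamma(a_v, 1.0 / b_v, size=5000)
predictive = stats.beta.pdf(x[:, None], u_s[None, :], v_s[None, :]).mean(axis=1)

# With broad posteriors (little data), the averaged predictive density is
# flatter than the plug-in density, which overstates confidence.
print("max pointwise difference:", np.max(np.abs(predictive - plug_in)))
```

The Monte Carlo average stands in here for the intractable integral over the posterior; the paper's contribution is to replace it with a closed-form density obtained from the Taylor expansion of the beta function and the local variational bound.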
