Abstract
We propose a nonparametric procedure for fast inference in generative graphical models when the number of latent states is very large. The approach is based on iterative latent variable preselection: we alternate between learning a selection function that reveals the relevant latent variables and using it to obtain a compact approximation of the posterior distribution for expectation-maximization (EM). This makes inference feasible even when the number of possible latent states is, for example, exponential in the number of latent variables, a regime in which exact inference is computationally intractable. We learn the selection function entirely from the observed data and the current EM state via Gaussian process regression. This contrasts with earlier approaches, in which selection functions were designed by hand for each problem setting. We show that our approach performs as well as these bespoke selection functions on a wide variety of inference problems. In particular, for the challenging case of a hierarchical model for object localization with occlusion, we match a customized state-of-the-art selection method at a far lower computational cost.
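As a rough illustration of the alternation described in the abstract, the sketch below pairs a toy binary sparse-coding model with scikit-learn's GaussianProcessRegressor. The model, the variable names, the relevance targets, and the simplified least-squares M-step are all assumptions made for this example, not the paper's actual models or update equations.

```python
# A minimal sketch of GP-based latent preselection for truncated EM, under
# the assumptions stated above: a toy binary sparse-coding model
# y = W s + noise with a Bernoulli(pi) prior on each latent. Illustrative
# only; not the paper's implementation.
import itertools

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
H, D, N, C = 8, 16, 100, 3        # latents, observed dims, data points, latents kept
W_true = rng.normal(size=(D, H))
S_true = (rng.random((N, H)) < 0.2).astype(float)
Y = S_true @ W_true.T + 0.1 * rng.normal(size=(N, D))

W = rng.normal(size=(D, H))       # current model parameters
sigma2, pi = 0.5, 0.2
states_C = np.array(list(itertools.product([0.0, 1.0], repeat=C)))  # 2^C states, not 2^H

relevance = np.abs(Y @ W)         # cheap first-pass relevance proxy, shape (N, H)

for em_iter in range(10):
    # Selection step: learn a mapping from observed data to per-latent
    # relevance with Gaussian process regression (multi-output targets work).
    gp = GaussianProcessRegressor(alpha=1e-2).fit(Y, relevance)
    pred = gp.predict(Y)          # predicted relevance scores, shape (N, H)

    # Truncated E-step: for each point, enumerate only the 2^C states of
    # the C preselected latents instead of all 2^H joint states.
    E_s = np.zeros((N, H))
    for n in range(N):
        sel = np.argsort(pred[n])[-C:]            # indices of the C selected latents
        S_full = np.zeros((len(states_C), H))
        S_full[:, sel] = states_C
        resid = Y[n] - S_full @ W.T
        n_on = S_full.sum(axis=1)
        logp = (-0.5 * np.sum(resid ** 2, axis=1) / sigma2
                + n_on * np.log(pi) + (H - n_on) * np.log(1.0 - pi))
        post = np.exp(logp - logp.max())
        post /= post.sum()                        # truncated posterior over 2^C states
        E_s[n] = post @ S_full                    # posterior marginals of all H latents

    # Simplified M-step (least squares on expected latents), then feed the
    # marginals back as the GP's regression targets for the next round.
    W = np.linalg.lstsq(E_s, Y, rcond=None)[0].T
    pi = float(np.clip(E_s.mean(), 0.05, 0.95))
    relevance = E_s

print("reconstruction MSE:", float(np.mean((E_s @ W.T - Y) ** 2)))
```

Note the cost structure this sketch is meant to convey: the inner loop touches 2^C candidate states per data point rather than 2^H, and the selection function is relearned from the current EM state at every iteration rather than designed by hand.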