Abstract
Bayesian optimization (BO) has been a successful approach to optimizing expensive functions whose prior knowledge can be specified by means of a probabilistic model. Due to their expressiveness and tractable closed-form predictive distributions, Gaussian process (GP) surrogate models have been the default choice when deriving BO frameworks. However, as nonparametric models, GPs offer very little in terms of interpretability and informative power when applied to model complex physical phenomena in scientific applications. In addition, the Gaussian assumption limits the applicability of GPs to problems where the variables of interest may deviate strongly from Gaussianity. In this article, we investigate an alternative modeling framework for BO which makes use of sequential Monte Carlo (SMC) to perform Bayesian inference with parametric models. We propose a BO algorithm to take advantage of SMC's flexible posterior representations and provide methods to compensate for bias in the approximations and reduce particle degeneracy. Experimental results on simulated engineering applications in detecting water leaks and contaminant source localization are presented, showing performance improvements over GP-based BO approaches.
Highlights
The methodology we present in this article can be applied to a wide range of problems involving sequential decision making.
The method we propose is summarized in Algorithm 3, which we refer to as the SMC upper confidence bound (SMC-UCB) algorithm.
This article presented SMC as an alternative to Gaussian process (GP) based approaches for Bayesian optimization when domain knowledge is available in the form of informative computational models.
Summary
Impact Statement
The methodology we present in this article can be applied to a wide range of problems involving sequential decision making. While nonparametric models are usually the best approach for problems with scarce prior information, they offer little in terms of interpretability and may be a suboptimal guide when compared to expert parametric models. Bayesian inference on complex parametric models, however, is usually intractable, requiring the use of sampling-based techniques, such as Markov chain Monte Carlo (MCMC) (Andrieu et al., 2003), or variational inference methods (Bishop, 2006; Ranganath et al., 2014). Either of these approaches can lead to high computational overheads during the posterior updates in a BO loop. Experimental results demonstrate practical performance improvements over GP-based BO approaches.
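To make the idea concrete, the loop below is a minimal sketch of a particle-based BO iteration in the spirit of SMC-UCB: a weighted particle set over a parametric model's parameter replaces the GP posterior, a UCB rule selects queries, and importance reweighting with resampling performs the Bayesian update. The forward model, prior range, and all constants here are illustrative assumptions, not the paper's Algorithm 3.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parametric forward model: the observation at query x
# depends on an unknown scalar parameter theta (assumption for illustration).
def forward(theta, x):
    return np.sin(theta * x)

def smc_ucb(candidates, true_theta=1.7, n_particles=500, n_iters=15,
            noise_std=0.1, beta=2.0):
    """Sketch of a particle-based UCB loop (not the paper's Algorithm 3)."""
    # Prior: particles drawn uniformly over an assumed parameter range.
    particles = rng.uniform(0.0, 3.0, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    for _ in range(n_iters):
        # Predictive mean/variance at each candidate from the particle set.
        preds = forward(particles[:, None], candidates[None, :])  # (P, C)
        mu = weights @ preds
        var = weights @ (preds - mu) ** 2
        # UCB acquisition: trade off predicted value against uncertainty.
        x = candidates[np.argmax(mu + beta * np.sqrt(var))]
        # Query the (simulated) expensive function.
        y = forward(true_theta, x) + rng.normal(0.0, noise_std)
        # Bayes update: reweight particles by the Gaussian likelihood of y.
        loglik = -0.5 * ((y - forward(particles, x)) / noise_std) ** 2
        weights = weights * np.exp(loglik - loglik.max())
        weights /= weights.sum()
        # Resample with jitter when the effective sample size collapses,
        # mitigating particle degeneracy.
        if 1.0 / np.sum(weights ** 2) < n_particles / 2:
            idx = rng.choice(n_particles, n_particles, p=weights)
            particles = particles[idx] + rng.normal(0.0, 0.01, n_particles)
            weights = np.full(n_particles, 1.0 / n_particles)
    return weights @ particles  # posterior-mean estimate of theta

est = smc_ucb(np.linspace(0.0, 3.0, 50))
```

Unlike a GP surrogate, the particle representation places no Gaussian assumption on the posterior, at the cost of the weight-degeneracy issues that the resampling step above is meant to control.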