Abstract

Stein variational gradient descent (SVGD) has proven to be a powerful general-purpose nonparametric variational inference algorithm. However, standard SVGD requires the gradient of the target density and therefore cannot be used when that gradient is unavailable or too expensive to evaluate. A gradient-free variant (GF-SVGD) has been proposed that replaces the gradient with a surrogate; however, the computational cost of evaluating the forward model still prohibits the use of SVGD for inferring complex distributions. In this paper, we address this issue by evaluating the forward model at only a limited number of points and building an approximation from pre-calculated kernels, keeping the computational cost as low as possible. Our approximation method is then combined with an adaptation strategy that automatically refines the model by selecting particles at critical local locations, increasing precision at low cost. We observe significant computational gains over the original SVGD and GF-SVGD algorithms.
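For readers unfamiliar with SVGD, the update the abstract refers to can be sketched as follows. This is a hypothetical toy illustration (1-D standard Gaussian target, RBF kernel, hand-picked bandwidth and step size), not the paper's implementation; note how the update explicitly consumes the gradient of the log target density, which is exactly the quantity the gradient-free variants avoid.

```python
import numpy as np

def rbf_kernel(x, h):
    """RBF kernel matrix K[i, j] = k(x_j, x_i) and its gradient w.r.t. x_j."""
    diff = x - x.T                        # diff[i, j] = x_i - x_j
    K = np.exp(-diff**2 / (2 * h**2))     # pairwise kernel values
    grad_K = (diff / h**2) * K            # grad_K[i, j] = d k(x_j, x_i) / d x_j
    return K, grad_K

def svgd_step(x, grad_logp, h=0.5, eps=0.1):
    """One SVGD update: driving term toward high density plus kernel repulsion."""
    n = x.shape[0]
    K, grad_K = rbf_kernel(x, h)
    phi = (K @ grad_logp(x) + grad_K.sum(axis=1, keepdims=True)) / n
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=1.0, size=(50, 1))   # particles start far from target
for _ in range(500):
    x = svgd_step(x, lambda x: -x)                 # grad log p(x) = -x for N(0, 1)
```

After the loop, the particles should have drifted toward the target mean while the repulsive kernel term keeps them spread out rather than collapsed onto the mode.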
