Abstract

Optimal experiment design (OED) aims to maximize the information content of experimental observations by selecting the experimental conditions. In Bayesian OED for parameter estimation, designs are chosen according to an expected utility metric that accounts for the joint probability distribution of the uncertain parameters and the observations. This work presents solution methods for two approximate formulations of the Bayesian OED problem based on the Kullback–Leibler divergence: one for the particular case of Gaussian prior and observation noise distributions, and one for the general case of arbitrary prior and observation noise distributions, where the observation noise is given by arbitrary functions of the states and of random variables with an arbitrary multivariate distribution. The proposed methods also allow path constraints to be satisfied with a specified probability. The solution approach reformulates the approximate Bayesian OED problem as an optimal control problem (OCP), for which a parsimonious input parameterization is adopted to reduce the number of decision variables. An efficient global solution method for OCPs based on sum-of-squares polynomials and parallel computing is then applied: the cost of the OCP is approximated by a polynomial function of the decision variables, and the resulting polynomial optimization problem is solved to global optimality in a tractable way via semidefinite programming. It is established that the difference between the cost obtained by solving the polynomial optimization problem and the globally optimal cost of the OCP is bounded, with a bound that depends on the polynomial approximation error.
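The following is a minimal, hypothetical sketch of the polynomial-surrogate idea in the abstract, not the paper's method: the cost function, input range, and polynomial degree are all illustrative assumptions, and the one-dimensional case is minimized via derivative roots rather than the sum-of-squares/semidefinite programming machinery the paper uses for the general case.

```python
import numpy as np

# Hypothetical stand-in for an OCP cost J(u) over a single parameterized
# input u; in the paper, the true cost comes from simulating the system,
# and the multivariate case is solved via semidefinite programming.
def ocp_cost(u):
    return np.sin(2.0 * u) + 0.3 * u**2  # assumed illustrative cost

# Sample the cost over the admissible input range and fit a polynomial
# surrogate (degree 8, chosen here for illustration).
u_lo, u_hi = -2.0, 2.0
u_samples = np.linspace(u_lo, u_hi, 61)
poly = np.poly1d(np.polyfit(u_samples, ocp_cost(u_samples), deg=8))

# In one dimension the surrogate can be minimized globally without an
# SDP: stationary points are the real roots of the derivative, and the
# global minimizer is the best of those candidates and the endpoints.
crit = np.roots(poly.deriv())
crit = crit[np.isreal(crit)].real
cand = np.concatenate([crit[(crit >= u_lo) & (crit <= u_hi)], [u_lo, u_hi]])
u_star = cand[np.argmin(poly(cand))]

# Per the abstract's final claim, the gap between the surrogate optimum
# and the true global optimum is bounded by the approximation error.
```

Because the surrogate is an explicit polynomial, its global minimum can be certified rather than merely found by local search, which is what makes the semidefinite-programming route tractable in higher dimensions.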
