Abstract

The Education section of this issue presents the article “Bayes Meets Krylov: Statistically Inspired Preconditioners for CGLS,” by D. Calvetti, F. Pitolli, E. Somersalo, and B. Vantaggi. The conjugate gradient method is one of the most popular numerical methods for unconstrained optimization. It was introduced in the seminal 1952 paper of Hestenes and Stiefel as a numerical method for solving systems of linear equations with a positive definite matrix. Since then, many scientists have proposed modifications and improvements of the method and have analyzed its convergence. Applied to the least squares problem of fitting a linear model to data, the method is known as CGLS (conjugate gradient for least squares). We consider a system of the form $b = Ax + \epsilon$, where $x$ is to be determined and $\epsilon$ is the model error, or noise. Several difficulties may arise in solving this problem. When the null space of $A$ is nontrivial, multiple solutions exist, and we may then want to choose a solution with particular properties. Another difficulty appears when $A$ has a very large condition number, so that even small noise may introduce a large error into the solution. The authors discuss how these two issues may be addressed so that iterative solvers promote informative solutions, possibly with substantial components in the null space of $A$, while the effect of the noise is kept under control. First, the authors explain Tikhonov-type regularization, a widely used approach that replaces an ill-conditioned problem by a nearby well-posed one. The regularization augments the least squares objective with a penalty term on the norm of the solution. The authors provide a nice introduction to the ideas behind computationally attractive techniques in this context. 
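As an illustrative sketch (not taken from the article), the noise amplification caused by a large condition number, and the Tikhonov remedy $\min_x \|Ax-b\|^2 + \lambda^2\|x\|^2$, can be demonstrated numerically: the regularized problem is just an ordinary least squares problem on a stacked matrix. The smoothing operator, noise level, and parameter $\lambda$ below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ill-conditioned forward model: a discretized Gaussian smoothing operator
# (a standard toy example; not the operators used in the article).
n = 50
t = np.linspace(0, 1, n)
A = np.exp(-30.0 * (t[:, None] - t[None, :]) ** 2) / n
x_true = np.sin(2 * np.pi * t)
b = A @ x_true + 1e-4 * rng.standard_normal(n)  # data with small noise

# Naive least squares: the small noise is amplified by the huge condition number.
x_naive = np.linalg.lstsq(A, b, rcond=None)[0]

# Tikhonov regularization: min ||A x - b||^2 + lam^2 ||x||^2, solved as an
# ordinary least squares problem on the stacked (augmented) system.
lam = 1e-3  # illustrative value, chosen by hand here
A_aug = np.vstack([A, lam * np.eye(n)])
b_aug = np.concatenate([b, np.zeros(n)])
x_tik = np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

print(np.linalg.norm(x_naive - x_true))  # severely corrupted by noise
print(np.linalg.norm(x_tik - x_true))    # much closer to the true solution
```

The stacked formulation makes explicit that the penalty term simply appends rows $\lambda I$ to $A$, which bounds the smallest singular value of the augmented matrix away from zero.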
The relation of Tikhonov regularization to the maximum a posteriori (MAP) solution of linear models in Bayesian statistics is highlighted. Assuming that the noise has a multivariate normal distribution with zero mean and a given covariance matrix, the posterior density of $x$ conditioned on the observed value of $b$ is obtained from Bayes' formula. The authors show that the MAP estimate of $x$ coincides with the least squares solution of a regularized problem involving the covariance matrices. In this context, a preconditioned conjugate gradient method leads to a modification of the estimation problem. Usually, preconditioners serve to improve the convergence properties of a numerical method. For ill-posed problems, the authors set a different goal: the preconditioner should affect the properties of the solution rather than the properties of the matrix $A$. The standard CGLS method computes a solution that is orthogonal to the null space of $A$. This orthogonality, however, may not be justified within the Bayesian framework, and it may be undesirable, as it results in a loss of information. The authors advocate a way to use the information provided by the priors in the Bayesian framework. They observe that a preconditioner built from the covariance matrix of the prior distribution of $x$ changes the spectral properties of the underlying normal equations so that the modified matrix has a wider range of nonzero eigenvalues. The inevitable effect is a slowdown of the convergence rate. The authors argue, however, that for an ill-posed problem slower convergence should not be viewed as a disadvantage, but rather as a small price for constructing a richer approximation of the unknown. The presentation is supplemented by illustrative examples, ranging from a simple one-dimensional problem to more complex tomography examples with sparse data and the reconstruction of a subsurface function in a geological example. 
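The effect of prior-based preconditioning on the null-space content of the iterates can be sketched in a few lines (a simplified illustration under invented data, not the article's experiments): with a prior covariance factorization $\Gamma = LL^T$, the change of variables $x = Lw$ turns CGLS on $A$ into CGLS on $AL$, and the mapped-back solution is no longer confined to the row space of $A$.

```python
import numpy as np

rng = np.random.default_rng(1)

def cgls(A, b, n_iter):
    """Plain CGLS: conjugate gradients on the normal equations A^T A x = A^T b."""
    x = np.zeros(A.shape[1])
    r = b.copy()
    s = A.T @ r
    p = s.copy()
    gamma = s @ s
    for _ in range(n_iter):
        q = A @ p
        alpha = gamma / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

# Underdetermined toy problem: 3 observations of a 6-dimensional unknown,
# so null(A) is 3-dimensional. Data are random, for illustration only.
A = rng.standard_normal((3, 6))
b = rng.standard_normal(3)

# Gaussian smoothness prior on a 1-D grid, covariance Gamma = L L^T
# (an assumed prior; small jitter keeps the Cholesky factorization stable).
t = np.linspace(0, 1, 6)
Gamma = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.1)
L = np.linalg.cholesky(Gamma + 1e-10 * np.eye(6))

# Standard CGLS: every iterate lies in range(A^T), orthogonal to null(A).
x_std = cgls(A, b, n_iter=3)

# Prior-preconditioned CGLS: iterate on w with A L, then map back via x = L w.
w = cgls(A @ L, b, n_iter=3)
x_prior = L @ w

# Compare null-space components of the two solutions.
_, _, Vt = np.linalg.svd(A)
N = Vt[3:].T  # orthonormal basis of null(A)
print(np.linalg.norm(N.T @ x_std))    # ~0: no null-space content
print(np.linalg.norm(N.T @ x_prior))  # nonzero: the prior injects null-space information
```

The standard iterate is built entirely from vectors of the form $A^T r$, hence its null-space projection vanishes, while $x = Lw$ lies in the range of $\Gamma A^T$, which is generically not orthogonal to $\mathrm{null}(A)$.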
The authors conclude with a discussion about possible extensions of the presented methodology.
