Abstract

We consider parameter estimation problems involving a set of m physical observations, where an unknown vector of n parameters is defined as the solution of a nonlinear least-squares problem. We assume that the problem is regularized by a quadratic penalty term. When solution techniques based on successive linearization are considered, as in the incremental four-dimensional variational (4D-Var) techniques for data assimilation, a sequence of linear systems with a particular structure has to be solved. We exhibit a subspace of dimension m that contains the solution of these linear systems, and derive a variant of the conjugate gradient algorithm that is more efficient in terms of memory and computational costs than its standard form when m is smaller than n. The new algorithm, which we call the Restricted Preconditioned Conjugate Gradient (RPCG), can be viewed as an alternative to the so-called Physical-space Statistical Analysis System (PSAS) algorithm, which is another approach to solving the linear problem. In addition, we show that the non-monotone and somewhat chaotic behaviour of the PSAS algorithm when viewed in the model space, experimentally reported by some authors, can be fully suppressed in RPCG. Moreover, since preconditioning and re-orthogonalization of residual vectors are often used in practice to accelerate convergence in high-dimensional data assimilation, we show how to reformulate these techniques within subspaces of dimension m in RPCG. Numerical experiments are reported, on an idealized data assimilation system based on the heat equation, that clearly show the effectiveness of our algorithm for large-scale problems.

Copyright © 2009 Royal Meteorological Society
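As an illustration of the problem structure summarized above, the following is a minimal sketch in standard incremental 4D-Var notation; the abstract itself does not fix notation, so the symbols x_b, B, R, H, H_k and d_k used here are conventional assumptions rather than definitions taken from the paper.

\[
\min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,(x - x_b)^\top B^{-1} (x - x_b)
  \;+\; \tfrac{1}{2}\,\bigl(y - H(x)\bigr)^\top R^{-1} \bigl(y - H(x)\bigr),
\]
where y \in \mathbb{R}^m collects the observations and the quadratic penalty involves the background x_b. Linearizing H about the current iterate x_k, with Jacobian H_k and innovation d_k = y - H(x_k), each outer iteration requires solving
\[
\bigl(B^{-1} + H_k^\top R^{-1} H_k\bigr)\,\delta x
  \;=\; H_k^\top R^{-1} d_k \;-\; B^{-1}(x_k - x_b).
\]
Multiplying through by B shows that the solution can be written as
\[
\delta x \;=\; -(x_k - x_b) \;+\; B H_k^\top \lambda, \qquad \lambda \in \mathbb{R}^m,
\]
so it lies in an affine subspace of dimension at most m. PSAS exploits this by solving the m-by-m dual system \((R + H_k B H_k^\top)\,\lambda = d_k + H_k(x_k - x_b)\) directly, whereas RPCG instead carries the conjugate-gradient recurrences for the primal system within this m-dimensional subspace, which is what preserves the monotone behaviour in model space reported in the abstract.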
