Abstract

Particle filters are fully nonlinear data assimilation techniques that aim to represent the probability distribution of the model state given the observations (the posterior) by a number of particles. In high‐dimensional geophysical applications, the number of particles required by the sequential importance resampling (SIR) particle filter (in order to capture the high‐probability region of the posterior) is too large to make them usable. However, particle filters can be formulated using proposal densities, which give greater freedom in how particles are sampled and allow for a much smaller number of particles. Here a particle filter is presented which uses the proposal density to ensure that all particles end up in the high‐probability region of the posterior probability density function. This gives rise to the possibility of nonlinear data assimilation in large‐dimensional systems. The particle filter formulation is compared to the optimal proposal density particle filter and the implicit particle filter, both of which also utilise a proposal density. We show that, when observations are available every time step, both schemes will be degenerate when the number of independent observations is large, unlike the new scheme. The sensitivity of the new scheme to its parameter values is explored theoretically and demonstrated using the Lorenz (1963) model. Copyright © 2012 Royal Meteorological Society
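For orientation, the sketch below shows a standard SIR particle filter applied to the Lorenz (1963) model, i.e. the baseline scheme whose degeneracy in high dimensions motivates the proposal-density filter described in the abstract. It is not the paper's scheme; the model step size, observation operator, noise levels, and ensemble size are illustrative assumptions only.

```python
# Minimal SIR particle filter on the Lorenz (1963) model (illustrative sketch,
# not the equivalent-weights / proposal-density scheme of the paper).
import numpy as np

rng = np.random.default_rng(0)

def lorenz63_step(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) model (assumed step size)."""
    dx = np.empty_like(x)
    dx[..., 0] = sigma * (x[..., 1] - x[..., 0])
    dx[..., 1] = x[..., 0] * (rho - x[..., 2]) - x[..., 1]
    dx[..., 2] = x[..., 0] * x[..., 1] - beta * x[..., 2]
    return x + dt * dx

def sir_filter(y_obs, n_particles=1000, obs_std=1.0, model_std=0.1, steps_per_obs=10):
    """SIR: propagate particles, weight by the likelihood, then resample."""
    particles = rng.normal(size=(n_particles, 3)) + np.array([1.0, 1.0, 25.0])
    means = []
    for y in y_obs:
        # Propagate with the stochastic model; in standard SIR the proposal
        # density is simply the model transition density.
        for _ in range(steps_per_obs):
            particles = lorenz63_step(particles)
            particles += model_std * rng.normal(size=particles.shape)
        # Weight by the observation likelihood (here: only x is observed).
        logw = -0.5 * ((y - particles[:, 0]) / obs_std) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(w @ particles)
        # Resample to counter weight degeneracy; with many independent
        # observations this step is not enough, which is the problem the
        # paper's proposal-density filter is designed to avoid.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return np.array(means)
```

A usage example under the same assumptions: generate a truth run with `lorenz63_step`, observe its first component with added Gaussian noise of standard deviation `obs_std`, and pass those observations to `sir_filter` to obtain posterior-mean estimates.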
