Abstract

The estimation algorithm developed offers an alternative to standard recursive nonlinear estimators such as the extended Kalman filter and the iterated extended Kalman filter. The algorithm, which is developed from a quadratic cost function basis, splits the problem of cost function minimization into a linear first step and a nonlinear second step by defining new first-step states that are nonlinear combinations of the unknown states. Estimates of the first-step states are obtained by minimizing the first-step cost function using a Kalman filter formulation. Estimates of the unknown, or second-step, states are obtained by minimizing the second-step cost function using an iterative Gauss-Newton algorithm. The two-step estimator is shown to be optimal for static problems in which the time variation of the measurement equation can be separated from the unknowns. This method is then generalized by approximating the nonlinearity as a perturbation of the dynamic update, while keeping the measurement cost function the same. In contrast, the extended Kalman filter and the iterated extended Kalman filter linearize the measurement cost function, resulting in suboptimal estimates. Two example applications confirm these analytical results.
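The second step described above can be sketched in isolation. The following is a minimal illustration, not the paper's implementation: the first-step states are assumed to be a hypothetical nonlinear combination y = f(x) of the unknown state x, the Kalman-filter first step is replaced by a fabricated noise-free estimate `y_hat` with identity covariance, and a Gauss-Newton iteration then minimizes the weighted second-step cost to recover x.

```python
import numpy as np

# Hypothetical example: the unknown (second-step) state is x = (a, b),
# and the first-step states are the nonlinear combinations
#     y = f(x) = [a*b, a + b, a**2].
# The first step (a Kalman filter in the paper) would supply an estimate
# y_hat with covariance P; here y_hat is fabricated for illustration.

def f(x):
    a, b = x
    return np.array([a * b, a + b, a**2])

def jacobian(x):
    # Partial derivatives of f with respect to (a, b)
    a, b = x
    return np.array([[b, a],
                     [1.0, 1.0],
                     [2.0 * a, 0.0]])

def gauss_newton(y_hat, P_inv, x0, iters=20, tol=1e-10):
    """Second step: minimize (y_hat - f(x))^T P^{-1} (y_hat - f(x))."""
    x = x0.astype(float)
    for _ in range(iters):
        r = y_hat - f(x)                      # residual of first-step estimates
        J = jacobian(x)                       # sensitivity of y w.r.t. x
        # Gauss-Newton step from the covariance-weighted normal equations
        step = np.linalg.solve(J.T @ P_inv @ J, J.T @ P_inv @ r)
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

x_true = np.array([2.0, 3.0])
P_inv = np.eye(3)                             # assume unit first-step covariance
y_hat = f(x_true)                             # noise-free for illustration
x_est = gauss_newton(y_hat, P_inv, np.array([1.0, 1.0]))
print(x_est)                                  # converges to x_true
```

The design point the abstract makes is visible here: the measurement cost function itself is never linearized; only the second-step search over x uses local Jacobians, which is why the two-step split can remain optimal where the EKF's measurement linearization is not.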
