Abstract

The classical Kalman smoother recursively estimates states over a finite time window using all observations in the window. In this paper, we assume that the parameters characterizing the second-order statistics of the process and observation noise are unknown and propose an optimal Bayesian Kalman smoother (OBKS) that yields smoothed estimates that are optimal relative to the posterior distribution of the unknown noise parameters. The method uses a Bayesian innovation process and a posterior-based Bayesian orthogonality principle. The optimal Bayesian Kalman smoother has the same forward-backward structure as the ordinary Kalman smoother, with the ordinary noise statistics replaced by their effective counterparts. In the first step, the posterior effective noise statistics are computed. Then, using these effective noise statistics, the optimal Bayesian Kalman filter is run in the forward direction over the window of observations. The Bayesian smoothed estimates are obtained in the backward step. We validate the performance of the proposed robust smoother on target-tracking and gene-regulatory-network inference problems.
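
To make the forward-backward structure described above concrete, here is a minimal sketch assuming a standard time-invariant linear-Gaussian model and a Rauch-Tung-Striebel-style backward recursion. The variables `Q_eff` and `R_eff` stand in for the posterior effective noise statistics; their computation (the paper's first step) is not reproduced here, and all function names and symbols are illustrative rather than the paper's notation. The structural point is that only the covariances fed into the recursions change; the forward and backward equations keep the ordinary smoother's form.

```python
import numpy as np


def kalman_forward(y, A, H, Q_eff, R_eff, x0, P0):
    """Forward pass: an ordinary Kalman filter run with the effective
    (posterior) noise covariances Q_eff and R_eff substituted for the
    unknown nominal ones, as in the OBKF step described in the abstract."""
    xp, Pp = [x0], [P0]        # predictions x_{k|k-1}, P_{k|k-1}; prior at k = 0
    xf, Pf = [], []            # filtered estimates x_{k|k}, P_{k|k}
    for k, yk in enumerate(y):
        # measurement update with observation y_k
        S = H @ Pp[k] @ H.T + R_eff
        K = Pp[k] @ H.T @ np.linalg.inv(S)
        xf.append(xp[k] + K @ (yk - H @ xp[k]))
        Pf.append(Pp[k] - K @ H @ Pp[k])
        # time update (prediction for k + 1) using the effective process noise
        xp.append(A @ xf[k])
        Pp.append(A @ Pf[k] @ A.T + Q_eff)
    return xf, Pf, xp, Pp


def smoother_backward(xf, Pf, xp, Pp, A):
    """Backward pass: Rauch-Tung-Striebel-type recursion that converts the
    filtered estimates into smoothed estimates over the whole window."""
    n = len(xf)
    xs, Ps = [None] * n, [None] * n
    xs[-1], Ps[-1] = xf[-1], Pf[-1]
    for k in range(n - 2, -1, -1):
        J = Pf[k] @ A.T @ np.linalg.inv(Pp[k + 1])   # smoother gain
        xs[k] = xf[k] + J @ (xs[k + 1] - xp[k + 1])
        Ps[k] = Pf[k] + J @ (Ps[k + 1] - Pp[k + 1]) @ J.T
    return xs, Ps


# Toy usage on a scalar random-walk model; the effective covariances below
# are arbitrary placeholders, not actual posterior quantities.
if __name__ == "__main__":
    A = np.array([[1.0]]); H = np.array([[1.0]])
    Q_eff = np.array([[0.1]]); R_eff = np.array([[1.0]])
    rng = np.random.default_rng(0)
    y = [np.array([0.1 * k]) + rng.normal(0.0, 1.0, 1) for k in range(20)]
    xf, Pf, xp, Pp = kalman_forward(y, A, H, Q_eff, R_eff, np.zeros(1), np.eye(1))
    xs, Ps = smoother_backward(xf, Pf, xp, Pp, A)
    print(np.hstack(xs))
```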

Highlights

  • Classical Kalman filtering is defined by a set of equations that recursively update the optimal linear filter output as new observations arrive [1]

  • We assume that the parameters characterizing the second-order statistics of process and observation noise are unknown and propose an optimal Bayesian Kalman smoother (OBKS) framework to obtain smoothed estimates that are optimal relative to the posterior distribution of the unknown noise parameters

  • In an additional forward step, the posterior effective noise statistics are first computed; these effective characteristics are then used in a second forward pass to run the optimal Bayesian Kalman filter (OBKF); finally, in the backward step, the Bayesian smoothed estimate for each state in the interval is computed, as summarized in the backward step of Table 1

Summary

Introduction

Classical Kalman filtering is defined by a set of equations that recursively update the optimal linear filter output as new observations arrive [1]. When the model is uncertain, a robust solution can be obtained by replacing the model characteristics and statistics in the solution to the nominal problem with their effective counterparts, which incorporate model uncertainty in such a way that the equation structure of the nominal solution is essentially preserved in the Bayesian robust solution. This approach has been used for classification [19], linear and morphological filtering [15, 17], signal compression [20], and Kalman filtering [16]. We assume that the parameters characterizing the second-order statistics of the process and observation noise are unknown and propose an optimal Bayesian Kalman smoother (OBKS) framework to obtain smoothed estimates that are optimal relative to the posterior distribution of the unknown noise parameters. N(x; μ, Σ) denotes a multivariate Gaussian density for a random vector x with mean vector μ and covariance matrix Σ.
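
For concreteness, the standard linear-Gaussian state-space setting that this notation suggests can be written as follows; this is a generic formulation, and the symbols A, H, and the parameter vector θ indexing the uncertain noise covariances are illustrative rather than the paper's own notation:

```latex
\begin{aligned}
\mathbf{x}_{k+1} &= A\,\mathbf{x}_k + \mathbf{w}_k, &\quad p(\mathbf{w}_k) &= \mathcal{N}\!\left(\mathbf{w}_k;\, \mathbf{0},\, Q(\theta)\right),\\
\mathbf{y}_k     &= H\,\mathbf{x}_k + \mathbf{v}_k, &\quad p(\mathbf{v}_k) &= \mathcal{N}\!\left(\mathbf{v}_k;\, \mathbf{0},\, R(\theta)\right),
\end{aligned}
```

where θ is unknown and assigned a prior distribution π(θ); per the abstract, the OBKS seeks smoothed estimates that are optimal on average with respect to the posterior of θ given the observation window.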

Optimal Bayesian Kalman smoother
Update equation for Bayesian smoothed estimate
Update equation for the Bayesian smoothed error covariance matrix
Computing posterior effective noise statistics
Target tracking example
Conclusions
