Abstract

This work examines the capability of a novel Kullback–Leibler divergence method within the Kalman filter framework to select the input–parameter–state estimation run with the most plausible results. Such identification suffers from uncertainty because different initial parameter guesses can yield different results; the examined approach addresses this issue using the information gained from the data in moving from the prior to the posterior distribution. First, the Kalman filter is run for a number of different initial parameter sets, each providing an input–parameter–state estimation of the system. Second, the resulting posterior distributions are simultaneously compared to the initial prior distributions using the Kullback–Leibler divergence. Finally, the identification with the smallest Kullback–Leibler divergence is selected as the one with the most plausible results. Importantly, the method is shown to select the better-performing identification in linear, nonlinear, and limited-information applications, providing a powerful tool for system monitoring.
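The selection rule described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes Gaussian priors and posteriors (as in a standard Kalman filter), uses the closed-form Kullback–Leibler divergence between multivariate Gaussians, and the posterior means and covariances shown are hypothetical stand-ins for the output of several filter runs started from different initial parameter sets.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N0 || N1) between multivariate Gaussians."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(cov1_inv @ cov0)          # trace term
        + diff @ cov1_inv @ diff           # Mahalanobis term
        - k                                # dimension offset
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))  # log-det ratio
    )

# Shared prior over the parameters (hypothetical mean and covariance).
prior_mu, prior_cov = np.zeros(2), 4.0 * np.eye(2)

# Hypothetical posteriors from three Kalman filter runs, each started
# from a different initial parameter guess.
posteriors = [
    (np.array([0.9, 1.1]), 0.5 * np.eye(2)),
    (np.array([3.0, -2.0]), 0.2 * np.eye(2)),
    (np.array([1.0, 1.0]), 0.4 * np.eye(2)),
]

# KL divergence from prior to each posterior; the run with the smallest
# divergence is selected as the most plausible identification.
divergences = [kl_gaussian(mu, cov, prior_mu, prior_cov) for mu, cov in posteriors]
best = int(np.argmin(divergences))
print(best, divergences)
```

Here `best` indexes the run whose posterior moved least (in the KL sense) from the prior, mirroring the selection criterion stated in the abstract.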
