Abstract

Sparse Bayesian learning (SBL) has attracted substantial interest in recent years for reliable estimation of sparse parameter vectors whose dimension is much larger than the number of measurements. However, the theory of online sequential estimation of sparsely changing parameter vectors is much less studied. We present a sequential SBL framework for recursive learning of sparse vectors that also change sparsely between successive sampling time periods. Our method uses a hierarchical Bayesian model to recursively estimate the marginal posterior distribution of the parameter vector for each time period, incorporating the sparseness of both this vector and its temporal changes. Our Bayesian model is built around a linear Gaussian state space model, so many quantities of interest can be calculated using the recursive Bayesian equations. The fast evidence-maximization procedure for SBL is developed for recursive Bayesian analysis, and the "noise" parameters are learned efficiently, solely from the available data. Numerical experiments verify that exploiting the sparseness of temporal changes of sparse vectors leads to better performance of sparse Bayesian learning. We also examine two applications of sequential SBL: structural system identification for estimating stiffness losses over sequential damage states, and recursive reconstruction of image sequences. These illustrative applications validate the effectiveness and robustness of our method.
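The recursive Bayesian equations the abstract refers to are, for a linear Gaussian state space model, the standard Kalman predict/update recursions. The sketch below illustrates one such recursion step under an assumed random-walk state transition; it is a minimal illustration only, and omits the paper's sparsity-promoting hierarchical priors and the evidence-maximization step that learns the hyperparameters and "noise" parameters. All variable names and the specific model form (identity transition, additive Gaussian noises Q and R) are assumptions for illustration.

```python
import numpy as np

def kalman_step(m, P, y, H, Q, R):
    """One recursive Bayesian update for the linear Gaussian model
        x_t = x_{t-1} + w_t,   w_t ~ N(0, Q)   (sparse change)
        y_t = H x_t + v_t,     v_t ~ N(0, R)   (measurement)
    Given the posterior N(m, P) of x_{t-1}, returns the posterior
    mean and covariance of x_t after observing y_t."""
    # Predict: propagate the previous posterior through the transition.
    m_pred = m            # identity (random-walk) transition assumed
    P_pred = P + Q
    # Update: condition on the new measurement y_t.
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = np.linalg.solve(S, H @ P_pred).T     # Kalman gain P H^T S^{-1}
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = P_pred - K @ H @ P_pred
    return m_new, P_new

# Toy usage: a 5-dimensional sparse state observed through 3 measurements.
rng = np.random.default_rng(0)
x_true = np.array([0.0, 2.0, 0.0, 0.0, -1.0])   # sparse parameter vector
H = rng.standard_normal((3, 5))
Q = 1e-4 * np.eye(5)
R = 1e-2 * np.eye(3)
m, P = np.zeros(5), np.eye(5)
for _ in range(50):
    y = H @ x_true + 0.1 * rng.standard_normal(3)
    m, P = kalman_step(m, P, y, H, Q, R)
```

In the paper's method, the prior covariances play the role that the fixed Q and R play here, but their hyperparameters are instead learned from the data by evidence maximization, which is what induces sparsity in both the estimated vector and its temporal changes.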
