Abstract

This paper is a sequel to our earlier development of state-space recursive least squares (SSRLS). Stability and convergence analysis of SSRLS and its steady-state counterpart complete the theoretical framework of this powerful new algorithm. Upper bounds on the forgetting factor that ensure stability of the filter are derived. Properties such as unbiasedness, convergence in the mean, mean-square deviation, and learning curves are investigated. The results show that SSRLS with infinite memory is the optimal linear estimator for deterministic signals. The expressions and derivations show that the various convergence properties of SSRLS are tied to forms of the discrete Riccati and Lyapunov equations. The analysis presented here clarifies the behavior and inner workings of SSRLS, which should aid advanced applications and further development.
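
As a rough illustration of the kind of stability check these results revolve around, the following Python sketch (not taken from the paper; the model matrices, gain K, and forgetting factor lam are hypothetical placeholders) verifies that the spectral radius of a filter's error-dynamics matrix is below one and then solves a discrete Lyapunov equation for a steady-state error covariance.

```python
# Minimal sketch (not from the paper): checking stability of linear filter
# error dynamics via the spectral radius, then solving a discrete Lyapunov
# equation. All matrices and the forgetting factor are hypothetical.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical 2-state constant-velocity model with sampling period T.
T = 0.1
A = np.array([[1.0, T],
              [0.0, 1.0]])   # state transition matrix
C = np.array([[1.0, 0.0]])   # observation matrix
lam = 0.95                    # forgetting factor (0 < lam <= 1), assumed value

# Hypothetical steady-state gain K (in practice it would come from the
# filter's Riccati-type recursion); illustrative numbers only.
K = np.array([[0.5],
              [0.3]])

# Error-dynamics matrix of the observer form (I - K C) A.
F = (np.eye(2) - K @ C) @ A

# Stability requires the spectral radius of F to be strictly less than 1.
rho = max(abs(np.linalg.eigvals(F)))
print(f"spectral radius of error dynamics: {rho:.4f}")

if rho < 1.0:
    # For a stationary disturbance with covariance Q, the steady-state
    # error covariance P satisfies the discrete Lyapunov equation
    #   F P F^T - P + Q = 0.
    Q = 0.01 * np.eye(2)
    P = solve_discrete_lyapunov(F, Q)
    print("steady-state error covariance:\n", P)
```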
