Abstract

The numerical properties of implementations of the recursive least-squares identification algorithm are of great importance for their continued use in various adaptive schemes. Here we investigate how an error that is introduced at an arbitrary point in the algorithm propagates. It is shown that conventional LS algorithms, including Bierman's UD-factorization algorithm, are exponentially stable with respect to such errors, i.e. the effect of the error decays exponentially. The base of the decay is equal to the forgetting factor. The same is true for fast lattice algorithms. The fast least-squares algorithm, sometimes known as the ‘fast Kalman algorithm’, is however shown to be unstable with respect to such errors.
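The exponential-decay property claimed for the conventional algorithms can be illustrated numerically. The following is a minimal sketch, assuming a standard recursive least-squares update with exponential forgetting factor lam; it is not the paper's implementation, and all variable names and the simulated data are illustrative. Two runs on identical data are compared, one of which has an error injected into the parameter estimate at a single time step; the norm of the difference between the runs should shrink roughly like lam raised to the number of steps since the injection.

```python
import numpy as np

rng = np.random.default_rng(0)

def rls_with_forgetting(phi, y, lam=0.95, n=4, perturb_at=None, perturb_size=0.0):
    """Standard RLS with forgetting factor lam; optionally inject an error
    into the parameter estimate at one chosen time step (illustrative only)."""
    theta = np.zeros(n)
    P = 1e3 * np.eye(n)
    estimates = []
    for t in range(len(y)):
        x = phi[t]
        k = P @ x / (lam + x @ P @ x)            # gain vector
        theta = theta + k * (y[t] - x @ theta)    # parameter update
        P = (P - np.outer(k, x @ P)) / lam        # covariance update with forgetting
        if perturb_at is not None and t == perturb_at:
            theta = theta + perturb_size * rng.standard_normal(n)  # injected error
        estimates.append(theta.copy())
    return np.array(estimates)

# Simulated regression data: y = phi @ theta_true + noise
N, n = 400, 4
theta_true = np.array([0.5, -0.3, 0.8, 0.1])
phi = rng.standard_normal((N, n))
y = phi @ theta_true + 0.01 * rng.standard_normal(N)

clean = rls_with_forgetting(phi, y, lam=0.95)
perturbed = rls_with_forgetting(phi, y, lam=0.95, perturb_at=100, perturb_size=1.0)

# Effect of the injected error: should decay roughly like 0.95**(t - 100)
diff = np.linalg.norm(perturbed - clean, axis=1)
for t in (110, 150, 200, 300):
    print(f"t={t:3d}  ||difference|| = {diff[t]:.2e}")
```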
