Abstract

We present a recursive total least squares (RTLS) algorithm for multilayer feedforward neural networks. To date, the recursive least squares (RLS) algorithm has been successfully applied to the training of multilayer feedforward neural networks. However, if the input data contain additive noise, the RLS estimates can be biased. Such bias can be avoided by using the RTLS algorithm. The RTLS algorithm described in this paper outperforms the RLS algorithm over a wide range of signal-to-noise ratios (SNRs) while requiring approximately the same computational complexity, O(N²), as the RLS algorithm.
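The bias the abstract refers to is the classical errors-in-variables effect: noise on the inputs attenuates least-squares estimates, whereas a total least squares fit accounts for noise in both inputs and outputs. The following Python sketch is a minimal illustration of that effect on a scalar linear model, not the paper's network-training algorithm; the noise levels, sample size, and the batch SVD-based TLS solution are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_w = 5000, 2.0

x_clean = rng.normal(size=n)                        # noise-free input
y = true_w * x_clean + 0.5 * rng.normal(size=n)     # output with additive noise
x_noisy = x_clean + 0.5 * rng.normal(size=n)        # input with additive noise

# Ordinary least squares on the noisy input: biased (attenuated)
# because the input noise inflates the denominator.
w_ls = (x_noisy @ y) / (x_noisy @ x_noisy)

# Total least squares via SVD of the augmented data matrix [x | y]:
# the right singular vector of the smallest singular value defines the fit.
Z = np.column_stack([x_noisy, y])
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
v = Vt[-1]                      # direction of smallest singular value
w_tls = -v[0] / v[1]

print(f"true weight : {true_w:.3f}")
print(f"LS estimate : {w_ls:.3f}  (biased toward zero by input noise)")
print(f"TLS estimate: {w_tls:.3f}  (close to the true weight)")
```

A recursive TLS algorithm of the kind described in the abstract would update such an estimate sample by sample rather than via a batch SVD, keeping the per-update cost at O(N²) in the number of weights.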
