Abstract

The total least squares (TLS) method has been widely used in errors-in-variables (EIV) modeling tasks, in which both input and output data are disturbed by noise, and adaptive filtering algorithms based on TLS have shown significantly better performance than the classical least squares (LS) method in EIV systems. TLS essentially extends the minimum mean square error (MMSE) criterion to the EIV model, which, however, may perform poorly when the noise is non-Gaussian (especially heavy-tailed). Recently, an information theoretic learning (ITL) based minimum total error entropy (MTEE) adaptive filtering algorithm has been proposed, which extends the minimum error entropy (MEE) criterion to the EIV model and shows desirable performance in non-Gaussian noise environments. However, owing to its complex mathematical expression, MTEE is computationally expensive and difficult to analyze theoretically. In this paper, we propose a new ITL-based criterion, called maximum total correntropy (MTC), and develop a gradient-based MTC adaptive filtering algorithm. We theoretically analyze the local stability and steady-state performance of the proposed algorithm. Simulation results confirm the theoretical analysis and demonstrate the superior performance of MTC in heavy-tailed noise. Furthermore, simulation comparisons between MTC and MTEE are presented: compared with MTEE, MTC is mathematically more tractable and computationally much simpler while achieving similar or even better performance.
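
To make the idea concrete, the sketch below implements a stochastic gradient-ascent adaptive filter that maximizes the Gaussian correntropy of the TLS-style normalized (total) error e / sqrt(1 + ||w||^2). This is a minimal illustration of one plausible MTC-type update derived from that cost, not necessarily the paper's exact algorithm; the kernel width sigma, step size mu, and noise settings are illustrative assumptions.

```python
import numpy as np

def mtc_filter(x, d, order, mu=0.05, sigma=2.0):
    """Gradient-ascent adaptive filter maximizing the Gaussian correntropy
    of the normalized (total) error e / sqrt(1 + ||w||^2).

    x     : noisy input signal (1-D array)
    d     : noisy desired signal (1-D array)
    order : filter length
    mu    : step size (illustrative value)
    sigma : correntropy kernel width (illustrative value)
    """
    w = np.zeros(order)
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]          # current input vector
        e = d[n] - w @ u                          # output error
        g = 1.0 + w @ w                           # 1 + ||w||^2 (TLS normalization)
        k = np.exp(-e**2 / (2 * sigma**2 * g))    # Gaussian kernel weight
        # Stochastic gradient of exp(-e^2 / (2 sigma^2 (1 + ||w||^2)))
        # w.r.t. w; large errors are exponentially down-weighted by k,
        # which is what makes correntropy robust to heavy-tailed noise.
        w = w + (mu / sigma**2) * k * (e * u + (e**2 / g) * w) / g
    return w

# Toy EIV identification example: both input and output are noisy, and
# the output noise is heavy-tailed (Student-t), a setting where
# MMSE-type updates tend to degrade.
rng = np.random.default_rng(0)
N, order = 20000, 4
w_true = rng.standard_normal(order)
x_clean = rng.standard_normal(N)
d_clean = np.convolve(x_clean, w_true)[:N]
x = x_clean + 0.1 * rng.standard_normal(N)        # input (measurement) noise
d = d_clean + 0.1 * rng.standard_t(df=2, size=N)  # heavy-tailed output noise
w_hat = mtc_filter(x, d, order)
print("true:", np.round(w_true, 3))
print("MTC :", np.round(w_hat, 3))
```

In a sketch like this, the kernel width trades robustness against convergence speed: a smaller sigma suppresses outliers more aggressively but slows adaptation, while a very large sigma makes the update behave like an MMSE-type (total least squares) rule.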

