Abstract

Rényi's entropy, a crucial measure of information quantity, is the cornerstone of the minimum error entropy (MEE) criterion. However, the conventional MEE cost must employ a computationally feasible estimator to approximate the error entropy directly from the data samples, which can be contaminated by large interference outliers, so the performance of MEE can deteriorate in the presence of complex non-Gaussian noise. To address this problem, we exploit the robust statistical properties of M-estimation and introduce M-estimators into the MEE criterion to down-weight or discard large error residuals; under the recalculated error distribution, this yields a more accurate Rényi entropy estimator than the empirical form. The approach is simple to implement while ensuring strong robustness. We therefore propose a robust adaptation criterion, the M-estimation-based minimum error entropy (MMEE), together with its corresponding adaptive filtering algorithm, which detects and bounds the influence of outliers in non-Gaussian noise. In addition, we evaluate the mean stability and steady-state performance of the proposed MMEE and prove theoretically that the MMEE algorithm achieves a lower steady-state error than MEE under the same conditions. Simulation results verify the theoretical predictions and demonstrate the superiority and robustness of the MMEE algorithm over several existing robust filtering algorithms in suppressing various non-Gaussian noises.
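The abstract does not give the paper's exact estimator, but the two ingredients it names are standard: a kernel (Parzen) estimate of Rényi's quadratic error entropy, and an M-estimation weight that down-weights large residuals. The sketch below combines them under illustrative assumptions; the Huber threshold `delta`, the kernel bandwidth `sigma`, and the weighting scheme are choices made here for demonstration, not the authors' specification.

```python
import numpy as np

def huber_weight(e, delta=1.0):
    """M-estimation weight: 1 inside [-delta, delta], down-weighted outside.

    This is the classic Huber weight function; the paper may use a
    different M-estimator, so treat this as one representative choice.
    """
    a = np.abs(e)
    return np.where(a <= delta, 1.0, delta / np.maximum(a, 1e-12))

def quadratic_renyi_entropy(errors, sigma=1.0, weights=None):
    """Parzen estimate of Rényi's quadratic entropy of the error samples.

    H2 = -log(IP) with information potential
    IP = sum_ij w_i w_j G(e_i - e_j) / (sum_i w_i)^2,
    where G is a Gaussian kernel. With weights=None this is the plain
    empirical MEE estimator; with M-estimation weights, outlier residuals
    contribute less, sketching the MMEE idea described in the abstract.
    """
    e = np.asarray(errors, dtype=float)
    w = np.ones_like(e) if weights is None else np.asarray(weights, dtype=float)
    diff = e[:, None] - e[None, :]
    G = np.exp(-diff ** 2 / (4.0 * sigma ** 2))  # Gaussian kernel of the pairwise differences
    ip = (w[:, None] * w[None, :] * G).sum() / w.sum() ** 2
    return -np.log(ip)

# A few large outliers inflate the plain entropy estimate; the weighted
# estimate stays close to the value of the uncontaminated bulk.
rng = np.random.default_rng(0)
e = rng.normal(0.0, 0.1, 200)
e[:5] += 20.0  # inject large interference outliers
h_plain = quadratic_renyi_entropy(e)
h_robust = quadratic_renyi_entropy(e, weights=huber_weight(e, delta=0.5))
```

In an adaptive-filter setting, a cost of this weighted form would be minimized over the filter taps (e.g. by stochastic gradient over a sliding window of errors), so that outlier-dominated residuals barely move the weight update.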
