Abstract

Most cost functions for adaptive filtering algorithms involve the square error, which depends on the current error signal. When the additive noise is impulsive, the square error can be expected to become very large. By contrast, the cross error, which is the correlation between the error signal and its delayed version, may remain very small. Based on this observation, we propose a new cost function, called the mean square cross error, for adaptive filters, and provide a detailed analysis of its mean value and mean square performance. Furthermore, we present a two-stage method to estimate the closed-form solution of the proposed method, and generalize this two-stage method to estimate the closed-form solutions of information theoretic learning methods, including the least mean fourth (LMF), maximum correntropy criterion (MCC), generalized maximum correntropy criterion (GMCC), and minimum kernel risk-sensitive loss (MKRSL). Simulations of both the adaptive and closed-form solutions demonstrate the effectiveness of the new method.
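
As an illustrative sketch only (written in Python; the delay D, the noise model, and the printed statistics are assumptions, not the paper's definitions), the snippet below contrasts the square error e(n)^2 with the cross error e(n)e(n-D) when the error signal contains sparse large impulses: the average square error is inflated by the impulses, while the average cross error stays close to zero.

    import numpy as np

    rng = np.random.default_rng(0)
    N, D = 10000, 1                        # number of samples and an assumed delay D
    e = rng.normal(0.0, 0.1, N)            # background (Gaussian) error signal
    hits = rng.random(N) < 0.01            # sparse locations of impulsive noise
    e[hits] += rng.normal(0.0, 10.0, hits.sum())

    square_error = e[D:] ** 2              # e(n)^2, the quantity behind MSE-type costs
    cross_error = e[D:] * e[:-D]           # e(n) * e(n - D), the cross error

    print("average square error:", square_error.mean())   # inflated by the impulses
    print("average cross error :", cross_error.mean())     # remains near zero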

Highlights

  • The mean square error (MSE) is probably the most widely used cost function for adaptive linear filters [1,2,3,4,5]

  • In our earlier work [25,26,27], we proposed the mean square cross prediction error to extract the desired signal in blind source separation (BSS), where the square cross prediction error was much smaller than the square prediction error

  • Simulations demonstrated that the proposed mean square CE (MSCE) and generalized MSCE (GMSCE) algorithms may perform better than the MSE algorithm for both sub- and super-Gaussian noise

Summary

Introduction

The mean square error (MSE) is probably the most widely used cost function for adaptive linear filters [1,2,3,4,5]. From the viewpoint of performance, for example the steady-state error, the MSE, the maximum correntropy criterion (MCC), and the least mean fourth (LMF) work well for Gaussian, super-Gaussian, and sub-Gaussian noise, respectively. The contributions of this work are: (i) we propose a new cost function, called the mean square cross error (MSCE), for adaptive filtering in the presence of non-Gaussian noise; (ii) we present a two-stage method to estimate the closed-form solutions of the LMF, MCC, generalized MCC (GMCC), minimum kernel risk-sensitive loss (MKRSL), and MSCE; and (iii) we generalize the two-stage method to estimate the closed-form solutions of the LMF, MCC, GMCC, and MKRSL algorithms. The linear filtering algorithms based on the MSE, MCC, and LMF are as follows: the cost function based on the MSE is given by $J_{\mathrm{MSE}}(\mathbf{w}) = E\{e^{2}\}$ (Eq. (7)), where $E$ denotes the expectation operator.
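
As a minimal sketch of the MSE-based linear filtering summarized above (assuming the usual error e(n) = d(n) - w^T x(n); the filter length, step size, and plant coefficients below are illustrative choices, not the paper's experimental setup), the snippet runs a least-mean-square (LMS) stochastic-gradient descent on the cost in Eq. (7).

    import numpy as np

    rng = np.random.default_rng(1)
    M, mu, N = 4, 0.01, 5000                  # taps, step size, number of samples
    w_true = np.array([1.0, -0.5, 0.25, 0.1]) # unknown plant to be identified
    x = rng.normal(size=N)                    # input signal
    d = np.convolve(x, w_true)[:N] + 0.01 * rng.normal(size=N)  # desired signal

    w = np.zeros(M)
    for n in range(M, N):
        xn = x[n - M + 1:n + 1][::-1]         # regressor [x(n), ..., x(n-M+1)]
        e = d[n] - w @ xn                     # error signal e(n)
        w += mu * e * xn                      # LMS update: stochastic gradient of E{e^2}

    print("estimated taps:", np.round(w, 3))  # should end up close to w_true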

Methods
Closed-form solution of the MCC
Closed-form solution of the MKRSL
Mean value behavior
Simulation results and discussion
Conclusions