Abstract

System identification is one of the most significant problems in machine learning, with many applications such as channel estimation (CE) in digital communications. This paper introduces a new correntropy-based method and compares the mean square error (MSE) criterion with information-theoretic measures for channel estimation in non-Gaussian noise, analyzing the MSE, minimum error entropy (MEE), and correntropy algorithms over several channel models using neural networks. The first contribution of this paper is a new correntropy-based conjugate gradient (CCG) method, applied to the CE problem; this algorithm converges faster than the standard maximum correntropy criterion (MCC) algorithm. The improved convergence rate is analyzed, and it is proved that CCG converges quadratically to the optimal solution. Next, the performance of an extended MSE algorithm is compared with information-theoretic criteria; in addition, a comparison between MEE and the correntropy-based algorithm is presented. Monte Carlo results show that the correntropy and MEE algorithms outperform the MSE algorithm in low-SNR communications, especially in the presence of impulsive noise. Finally, the trained neural networks are applied at the receiver as equalizers to obtain the intended performance over a range of SNR values.
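To illustrate why a correntropy criterion can outperform MSE under impulsive noise, the sketch below fits a single channel tap with a stochastic-gradient MSE update (LMS) and with a stochastic-gradient update on the Gaussian-kernel correntropy cost V(e) = exp(-e²/2σ²). This is a minimal, hypothetical one-tap example, not the paper's CCG algorithm; the channel value, kernel width, learning rate, and noise model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-tap channel: y = h*x + noise, with 5% impulsive outliers.
h_true = 0.8
n = 2000
x = rng.standard_normal(n)
noise = 0.05 * rng.standard_normal(n)
impulses = rng.random(n) < 0.05
noise[impulses] += 5.0 * rng.standard_normal(impulses.sum())
y = h_true * x + noise

def fit_mse(x, y, lr=0.01, epochs=30):
    """Stochastic gradient descent on the MSE cost (LMS update)."""
    h = 0.0
    for _ in range(epochs):
        for xi, yi in zip(x, y):
            e = yi - h * xi
            h += lr * e * xi                  # every error counts equally
    return h

def fit_mcc(x, y, sigma=0.5, lr=0.01, epochs=30):
    """Stochastic gradient ascent on the correntropy cost
    V(e) = exp(-e^2 / (2 sigma^2)); large (impulsive) errors
    receive an exponentially small weight."""
    h = 0.0
    for _ in range(epochs):
        for xi, yi in zip(x, y):
            e = yi - h * xi
            w = np.exp(-e**2 / (2 * sigma**2))
            h += lr * w * e * xi              # outliers barely move h
    return h

h_mse = fit_mse(x, y)
h_mcc = fit_mcc(x, y)
```

Because the correntropy update down-weights samples whose error greatly exceeds the kernel width σ, the impulsive samples contribute almost nothing to the MCC estimate, while the LMS estimate absorbs every impulse into its steady-state fluctuation.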
