Abstract

In channel estimation under non-Gaussian impulsive noise, traditional non-kernel-space methods are prone to divergence, while many kernel-space methods fail to fully exploit the a priori information embedded in the channel. To address this, we introduce a robust sparse recursive adaptive filtering algorithm, the convex regularized recursive kernel risk-sensitive loss (CR-RKRSL) algorithm. By combining the kernel risk-sensitive loss (KRSL) with a convex constraint term, the proposed algorithm makes full use of the channel's a priori information. We also examine the algorithm theoretically, presenting expressions characterizing its convergence and steady-state error. Extensive simulations show that CR-RKRSL outperforms the APSA, LHCAF, PRMCC, CR-RMC, and RZAMCC algorithms, exhibiting superior robustness and faster convergence, particularly for highly sparse systems.
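
To illustrate the idea behind the criterion, the sketch below combines a kernel risk-sensitive loss on the estimation error with an L1 penalty as one possible convex sparsity-promoting constraint. This is only a minimal stochastic-gradient stand-in, not the recursive (RLS-type) CR-RKRSL update derived in the paper; the function names, parameter values (kernel width sigma, risk-sensitive parameter lam, regularization weight gamma, step size mu), and the choice of an L1 term are illustrative assumptions.

```python
import numpy as np

def sparse_krsl_sgd(x, d, n_taps, mu=0.05, sigma=1.0, lam=2.0, gamma=1e-3):
    """Stochastic-gradient identification of a sparse FIR channel under a
    KRSL criterion plus an L1 (zero-attracting) term.

    NOTE: this is an illustrative gradient-descent sketch, not the paper's
    recursive CR-RKRSL algorithm; all parameter choices are assumptions.
    """
    h = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]         # input regressor (most recent first)
        e = d[n] - u @ h                          # a priori error
        kernel = np.exp(-e**2 / (2 * sigma**2))   # Gaussian kernel of the error
        # gradient of the instantaneous KRSL term (1/lam)*exp(lam*(1 - kernel)) w.r.t. h
        g = np.exp(lam * (1.0 - kernel)) * kernel * e / sigma**2
        h += mu * g * u - mu * gamma * np.sign(h) # KRSL step plus L1 subgradient step
    return h

# Toy usage: identify a sparse 32-tap channel from noisy observations.
rng = np.random.default_rng(0)
h_true = np.zeros(32)
h_true[[2, 11, 25]] = [0.8, -0.5, 0.3]
x = rng.standard_normal(5000)
d = np.convolve(x, h_true)[:len(x)] + 0.05 * rng.standard_normal(len(x))
h_hat = sparse_krsl_sgd(x, d, n_taps=32)
```

The saturating risk-sensitive weighting exp(lam * (1 - kernel)) * kernel shrinks toward zero for large errors, which is what gives kernel-space criteria of this type their robustness to impulsive noise, while the convex penalty pulls inactive taps toward zero to exploit channel sparsity.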
