Abstract

Kernel-risk-sensitive loss (KRSL) provides an efficient performance surface and has been applied successfully in kernel adaptive filters (KAFs). However, KRSL-based KAFs rely on the stochastic gradient descent (SGD) method for optimization, which usually suffers from inadequate accuracy and slow convergence. In this letter, the conjugate gradient method is adopted to optimize the KRSL function, and the non-convexity of KRSL is addressed by applying the half-quadratic (HQ) method twice. For sparsification, a novel Student's-t distribution based random Fourier feature (St-RFF) method is proposed to improve on the conventional RFF method. Combining these components yields a novel Student's-t distribution based random Fourier features kernel-risk-sensitive conjugate gradient (St-RFFKRSCG) algorithm. Simulations on Mackey-Glass time series prediction under non-Gaussian noises confirm its superiority in terms of accuracy, robustness, and computational cost.
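To make the two core ingredients concrete, the sketch below illustrates, under stated assumptions, (i) a random Fourier feature map whose frequencies are drawn from a Student's-t distribution instead of the Gaussian spectral density used by conventional RFF, and (ii) an empirical KRSL cost in its commonly cited form. The function names, the degrees-of-freedom parameter, and the scaling choices are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only -- not the authors' implementation. The function
# names, the degrees-of-freedom parameter `df`, and the scaling choices are
# assumptions made for this example.
import numpy as np


def st_rff_features(x, n_features=100, sigma=1.0, df=5.0, rng=None):
    """Random Fourier features whose frequencies are drawn from a Student's-t
    distribution (the St-RFF idea), rather than the Gaussian spectral density
    used by conventional RFF."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:                      # treat a 1-D array as scalar samples
        x = x[:, None]
    dim = x.shape[1]
    # Heavy-tailed frequency draws replace the usual Gaussian draws.
    omega = rng.standard_t(df, size=(dim, n_features)) / sigma
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(x @ omega + b)


def krsl(e, lam=0.5, sigma=1.0):
    """Empirical kernel-risk-sensitive loss of an error vector e, in the
    commonly cited form (1/lam) * mean(exp(lam * (1 - exp(-e^2 / (2*sigma^2)))))."""
    gauss = np.exp(-np.asarray(e) ** 2 / (2.0 * sigma ** 2))
    return np.mean(np.exp(lam * (1.0 - gauss))) / lam


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)                 # toy scalar inputs
    z = st_rff_features(x, n_features=64, rng=rng)
    w = np.zeros(z.shape[1])                 # linear weights in feature space
    e = np.sin(3 * x) - z @ w                # prediction errors before training
    print("feature map shape:", z.shape)
    print("KRSL of initial errors:", krsl(e))
```

In the proposed algorithm these features would be combined with a conjugate-gradient update of the weights rather than SGD; the sketch only shows the feature map and the loss being evaluated.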
