Abstract

Kernel adaptive filtering (KAF) is an effective nonlinear learning framework that has been widely used in time series prediction. Traditional KAF algorithms are based on the stochastic gradient descent (SGD) method, which suffers from slow convergence and limited filtering accuracy. Hence, the kernel conjugate gradient (KCG) algorithm was proposed; it has low computational complexity while achieving performance comparable to other KAF algorithms, e.g., kernel recursive least squares (KRLS). However, the robustness of KCG to outliers is unsatisfactory. Meanwhile, correntropy, a local similarity measure defined in kernel space, can suppress large outliers in robust signal processing. Building on correntropy, mixture correntropy uses a mixture of two Gaussian functions as its kernel to further improve learning performance. Accordingly, this article proposes a novel KCG algorithm, named kernel mixture correntropy conjugate gradient (KMCCG), with the help of the mixture correntropy criterion (MCC). The proposed algorithm has lower computational complexity and achieves better performance in non-Gaussian noise environments. To control the growth of the radial basis function (RBF) network in this algorithm, we also adopt a simple sparsification criterion based on the angle between elements in the reproducing kernel Hilbert space (RKHS). Prediction experiments on a synthetic chaotic time series and a real benchmark dataset show that the proposed algorithm achieves better performance with lower computational cost. In addition, the proposed algorithm is successfully applied to the practical task of malware prediction in the field of malware analysis. The results demonstrate that the proposed algorithm not only trains quickly but also achieves high prediction accuracy.
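For reference, the mixture correntropy referred to above combines two Gaussian kernels of different bandwidths. A minimal sketch of the commonly used definition (assuming bandwidths \sigma_1, \sigma_2 and a mixing coefficient \alpha; the article's exact parameterization may differ) is

\[
V_M(X, Y) = \mathbb{E}\!\left[\alpha\,\kappa_{\sigma_1}(X - Y) + (1 - \alpha)\,\kappa_{\sigma_2}(X - Y)\right],
\qquad
\kappa_{\sigma}(e) = \exp\!\left(-\frac{e^{2}}{2\sigma^{2}}\right),
\quad 0 \le \alpha \le 1,
\]

where the exponential decay of \kappa_{\sigma} means that large errors contribute little to the criterion, which underlies the robustness to outliers described above.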

Highlights

  • Traditional time series prediction methods mainly include autoregression, Kalman filtering, and moving average models

  • Experiments on short-term prediction of the Mackey–Glass chaotic time series, the minimum daily temperatures time series, and a real-world malware application programming interface (API) call sequence are conducted to illustrate the performance of the proposed algorithm

  • The kernel mixture correntropy conjugate gradient (KMCCG) algorithm was compared with the quantized kernel least mean squares (QKLMS) algorithm [34], the quantized kernel maximum correntropy (QKMC) algorithm [35], and the kernel maximum mixture correntropy (KMMCC) algorithm [15] in four different noise environments to verify its performance

Summary

Introduction

Traditional time series prediction methods mainly include autoregression, Kalman filtering, and moving average models. These traditional approaches focus on mathematical statistics and lack capabilities of self-learning, self-organization, and self-adaptation. Motivated by the work mentioned above, in order to improve filtering accuracy, convergence speed, and robustness against impulse noise at the same time, the mixture correntropy criterion (MCC) [11] is applied to the KCG method. A novel kernel learning algorithm, called kernel mixture correntropy conjugate gradient (KMCCG), is proposed in this article: on the basis of mixture correntropy, the robust KMCCG algorithm is derived through a combined use of the half-quadratic optimization method, the CG technique, and the kernel trick, as sketched below.
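To make these ingredients concrete, the following is a minimal sketch in Python/NumPy (the names alpha, sigma1, sigma2 and the threshold value are illustrative assumptions, not the article's exact implementation) of the mixture-correntropy weighting that the half-quadratic step applies to prediction errors, and of the angle-based sparsification check in the RKHS:

    import numpy as np

    def gaussian_kernel(x, y, sigma):
        # Gaussian kernel shared by the RBF network and the correntropy measure.
        d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
        return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

    def mixture_correntropy_weight(e, alpha=0.5, sigma1=1.0, sigma2=2.0):
        # Weight assigned to a prediction error e by the mixture of two Gaussians;
        # large errors receive small weights, which suppresses impulsive outliers.
        return (alpha * np.exp(-e ** 2 / (2.0 * sigma1 ** 2))
                + (1.0 - alpha) * np.exp(-e ** 2 / (2.0 * sigma2 ** 2)))

    def is_novel(x_new, centers, sigma=1.0, cos_threshold=0.99):
        # Angle-based sparsification: keep x_new as a new RBF center only if the
        # cosine of the angle between phi(x_new) and every stored phi(c) stays
        # below the threshold. For a Gaussian kernel, phi(x) has unit norm, so
        # the cosine equals the kernel value k(x_new, c).
        for c in centers:
            if gaussian_kernel(x_new, c, sigma) > cos_threshold:
                return False
        return True

    if __name__ == "__main__":
        print(mixture_correntropy_weight(0.1))   # close to 1: small errors count fully
        print(mixture_correntropy_weight(5.0))   # close to 0: a large (outlier) error is down-weighted
        centers = [np.array([0.0, 0.0])]
        print(is_novel(np.array([0.05, 0.0]), centers))  # False: too close to an existing center
        print(is_novel(np.array([3.0, 3.0]), centers))   # True: sufficiently novel in the RKHS

In a KMCCG-style update, such error weights would rescale the contribution of each sample before the conjugate gradient step, while the novelty check decides whether the incoming input is added to the dictionary of RBF centers.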

Mixture Correntropy
Kernel Conjugate Gradient Algorithm
Sparsification Criterion
Half-Quadratic Optimization of the Mixture Correntropy
Kernel Mixture Correntropy Conjugate Gradient Algorithm
Mackey–Glass Time Series Prediction
Minimum Daily Temperatures Time Series Prediction
Malware API Call Sequence Prediction
Background
Experimental Result
Conclusions