Abstract

Several algorithms in adaptive filtering are based on the minimization of the mean squared error (MSE) cost function. However, MSE is only a second-order statistic and therefore does not capture the full information contained in the probability distribution of the error in the system. An information-theoretic alternative is the minimum error entropy (MEE) cost function. Adaptive algorithms based on this criterion have been developed and shown to be superior to their MSE counterparts. In this work, kernel versions of some of these methods are designed and tested on predicting the annual sunspot number. The sunspot number counts the visibly darker regions on the solar surface and has been shown to be instrumental in modeling space weather, the state of the ionosphere, climatic anomalies and even global warming. This paper presents a comparative performance study of the various linear and kernel algorithms, trained with both the MEE and MSE criteria, in predicting such a chaotic non-linear time series. Experimental results clearly show the advantage of the MEE-based kernel design, which is expected since it combines non-linear modeling capacity with the ability to extract maximum information from the error distribution.
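To make the MEE criterion concrete, the sketch below illustrates one standard way to train a linear filter with it: Renyi's quadratic entropy of the errors is estimated with a Gaussian Parzen window, and the weights are updated by gradient ascent on the resulting information potential over a sliding window of recent samples. This is a minimal illustrative sketch, not the paper's actual kernel-MEE algorithms; the function names and the choices of `sigma`, `lr`, `window` and `epochs` are assumptions made here for demonstration only.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """Gaussian (Parzen) kernel used to estimate the error density."""
    return np.exp(-u**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def mee_train(X, d, sigma=0.5, lr=0.1, window=20, epochs=50):
    """Train a linear filter by maximizing the information potential,
    which is equivalent to minimizing Renyi's quadratic error entropy.

    X : (n_samples, n_taps) input (e.g. delay-embedded) vectors
    d : (n_samples,) desired outputs
    """
    n_samples, n_taps = X.shape
    w = np.zeros(n_taps)
    for _ in range(epochs):
        for n in range(window, n_samples):
            Xw = X[n - window:n]            # recent inputs
            ew = d[n - window:n] - Xw @ w   # recent prediction errors
            diff_e = ew[:, None] - ew[None, :]        # pairwise error differences
            diff_x = Xw[:, None, :] - Xw[None, :, :]  # pairwise input differences
            k = gaussian_kernel(diff_e, sigma)
            # Gradient of the information potential with respect to the weights
            grad = (k * diff_e)[:, :, None] * diff_x
            w += lr * grad.sum(axis=(0, 1)) / (window**2 * sigma**2)
    return w
```

For time-series prediction such as the sunspot data, `X` would typically be built by delay embedding (each row holding the previous few annual values) and `d` the next value; the same error-entropy gradient carries over to the kernelized filters studied in the paper, with the linear inner product replaced by a reproducing-kernel evaluation.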
