Abstract

The recently proposed correntropy-criterion-based semi-supervised random neural network extreme learning machine (RC-SSELM) has achieved outstanding performance on datasets with large outliers and non-Gaussian noise. To further improve its effectiveness and flexibility against large and complex outliers, this brief explores a more effective semi-supervised ELM learning algorithm whose optimization scheme is based on the robust maximum mixture correntropy criterion (MMCC). In addition, a generalized correntropy kernel function with a variable kernel center is incorporated into the MMCC, and the resulting semi-supervised learning algorithm is abbreviated as MC-SSELM^vc. A fixed-point iteration learning algorithm is adopted to optimize the output weights of MC-SSELM^vc. Experiments conducted on many benchmark datasets demonstrate the effectiveness of MC-SSELM^vc, and comparisons with several state-of-the-art semi-supervised learning algorithms are provided to show its superiority.
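As an illustrative sketch only (not drawn from the paper itself), a mixture correntropy measure of the type used in MMCC is commonly written as a convex combination of two Gaussian kernels evaluated on the prediction error, and the variable-center generalized correntropy kernel measures the error relative to a center c rather than zero; the symbols \lambda, \sigma_1, \sigma_2, \alpha, \beta, and c below are generic parameters, not values from the paper:

\[
V_{\mathrm{mix}}(e_i) \;=\; \lambda\, \exp\!\left(-\frac{e_i^{2}}{2\sigma_1^{2}}\right) \;+\; (1-\lambda)\, \exp\!\left(-\frac{e_i^{2}}{2\sigma_2^{2}}\right),
\]

where $e_i$ denotes the prediction error on the $i$-th sample, while a variable-center generalized Gaussian kernel takes the form

\[
G_{\alpha,\beta,c}(e_i) \;=\; \exp\!\left(-\,\frac{|e_i - c|^{\alpha}}{\beta^{\alpha}}\right).
\]

Under a loss of this kind, the output weights of a correntropy-based ELM are typically obtained by a fixed-point (half-quadratic style) iteration that alternates between computing error-dependent sample weights and solving a weighted regularized least-squares problem until convergence; the exact update rules of MC-SSELM^vc are given in the full paper.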
