Abstract

Classification, as a supervised machine learning task, predicts class labels from the feature distribution learned during training. Traditional classifier algorithms are not robust in the presence of outliers and noisy features. In this study, we propose a novel robust classifier based on kernel recursive least lncosh (RC-KRLL), in which a clipping mechanism is used to suppress the effect of large noise values. Instead of the conventional mean-square-error cost function, the proposed RC-KRLL method is derived from the lncosh loss function, which is better suited to non-Gaussian noise. The mean convergence, mean-square convergence, and learning curve of RC-KRLL are analyzed theoretically. Simulation results show that the proposed method outperforms other state-of-the-art robust classification algorithms on UCI data sets corrupted with synthetic non-Gaussian noise, such as mixtures of Gaussian noise with different variances, impulse noise, and Alpha–Beta noise. Results on 500px social media data sets containing real outliers and mislabeled samples confirm the acceptable performance of the proposed method.
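As a rough illustration (not the paper's exact formulation), the lncosh loss mentioned above can be written as L(e) = ln(cosh(λe))/λ for error e and a scale parameter λ. It behaves quadratically for small errors, like the mean-square-error cost, but only linearly for large errors, and its gradient tanh(λe) is bounded, which is what makes large outliers contribute a limited update; the parameter name `lam` below is an assumption for illustration:

```python
import numpy as np

def lncosh_loss(e, lam=1.0):
    """lncosh loss: quadratic near zero, linear for large |e| (robust to outliers)."""
    return np.log(np.cosh(lam * e)) / lam

def lncosh_grad(e, lam=1.0):
    """Gradient of the lncosh loss; bounded by 1 in magnitude, unlike the MSE gradient."""
    return np.tanh(lam * e)

# Small error: close to the squared-error behavior lam * e**2 / 2
small = lncosh_loss(0.01)          # ~ 0.00005

# Large error: grows only linearly, ln(cosh(x)) ~ |x| - ln(2) for large |x|
large = lncosh_loss(10.0)          # ~ 10 - ln(2)
```

Because the gradient saturates at ±1, an impulsive noise sample of any magnitude perturbs the recursive weight update by a bounded amount, in contrast to the unbounded updates produced by a squared-error cost.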
