Abstract

One-class classification achieves anomaly/outlier detection by exploiting the characteristics of the target data. As a local similarity measure defined in kernel space, correntropy is generally more robust than the mean square error (MSE) criterion in the presence of large outliers. In this article, the maximum mixture correntropy criterion (MMCC) with multiple kernels is applied to both the shallow and the hierarchical one-class extreme learning machine to improve model robustness and learning speed. Experiments on benchmark University of California, Irvine (UCI) classification datasets, an urban acoustic classification dataset, and four synthetic datasets demonstrate the effectiveness of the proposed algorithms, and comparisons with several state-of-the-art methods illustrate their superiority.
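
To illustrate why a mixture correntropy criterion is less sensitive to outliers than MSE, the sketch below computes an empirical mixture correntropy as a convex combination of two Gaussian kernels over the prediction errors. The kernel bandwidths, mixture weight, and error values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def gaussian_kernel(e, sigma):
    """Gaussian kernel evaluated at the error e."""
    return np.exp(-e**2 / (2.0 * sigma**2))

def mixture_correntropy(errors, sigmas=(1.0, 4.0), alpha=0.5):
    """Empirical mixture correntropy of a vector of prediction errors.

    A convex combination of two Gaussian kernels with different
    bandwidths; maximizing this quantity (the idea behind MMCC)
    down-weights large outlier errors, whereas the MSE criterion
    grows quadratically with them.
    """
    k1 = gaussian_kernel(errors, sigmas[0])
    k2 = gaussian_kernel(errors, sigmas[1])
    return np.mean(alpha * k1 + (1.0 - alpha) * k2)

# Example: a single outlier dominates the MSE but barely moves
# the mixture correntropy score (hypothetical error values).
errors = np.array([0.1, -0.2, 0.05, 8.0])  # last entry is an outlier
print("MSE:                ", np.mean(errors**2))
print("Mixture correntropy:", mixture_correntropy(errors))
```

Because each Gaussian kernel saturates for large errors, an outlier contributes at most a bounded amount to the criterion, which is the robustness property the abstract refers to.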
