Abstract

Hierarchical neural-network-based one-class anomaly detection algorithms generally rely on stacked autoencoders (AEs) for feature learning. However, existing AEs are not specifically designed to exploit the discriminative characteristics of the target data in one-class classification (OCC), which may lead to poor generalization performance. In this brief, a novel randomized AE that imposes a within-class scatter constraint during feature learning is developed. The correntropy criterion is applied in place of the mean square error (MSE) criterion to improve the algorithm's robustness to outliers and noise. The algorithm is further extended to kernel learning to improve its generalization capability. Experiments on benchmark datasets demonstrate the effectiveness of the proposed algorithm.
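To make the ideas in the abstract concrete, the following is a minimal NumPy sketch of a randomized AE whose output weights are solved in closed form, combined with a within-class scatter penalty on the hidden features and an iteratively reweighted, correntropy-style objective in place of plain MSE. The function name, hyperparameters (lam, gamma, sigma), and the exact form of the regularization are illustrative assumptions, not the formulation used in the paper.

```python
import numpy as np

def correntropy_scatter_rae(X, n_hidden=64, lam=1e-2, gamma=1e-1,
                            sigma=1.0, n_iters=5, seed=0):
    """Sketch: randomized AE (fixed random encoder, closed-form decoder)
    with (a) a within-class scatter penalty on the target-class hidden
    features and (b) correntropy-style reweighting of reconstruction
    residuals instead of plain MSE.  Illustrative only."""
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Random, fixed encoder: hidden features H = sigmoid(X W + b)
    W = rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))

    # Within-class scatter of the (single) target class in hidden space
    Hc = H - H.mean(axis=0, keepdims=True)
    Sw = Hc.T @ Hc / n

    # Half-quadratic style loop: solve weighted ridge regression for the
    # decoder, then reweight samples with a Gaussian kernel of residuals
    # so outlying samples contribute less (correntropy-like behavior).
    w = np.ones(n)
    beta = None
    for _ in range(n_iters):
        A = H.T @ (H * w[:, None]) + lam * np.eye(n_hidden) + gamma * Sw
        beta = np.linalg.solve(A, H.T @ (X * w[:, None]))
        resid = np.linalg.norm(X - H @ beta, axis=1)
        w = np.exp(-resid**2 / (2.0 * sigma**2))
    return W, b, beta

if __name__ == "__main__":
    # Train on "normal" data; use reconstruction error as the anomaly score.
    X = np.random.default_rng(1).standard_normal((200, 10))
    W, b, beta = correntropy_scatter_rae(X)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    scores = np.linalg.norm(X - H @ beta, axis=1)
    print(scores[:5])
```

At test time, samples whose reconstruction error exceeds a threshold chosen on the target-class training data would be flagged as anomalies; the kernel extension mentioned in the abstract is not reproduced here.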
