Abstract

Feature selection is a data preprocessing method used to select important features from among many input features and to remove irrelevant ones. Although feature selection for classification problems has been the focus of much research, few feature selection methods are available for one-class classification problems (i.e., anomaly detection). In particular, existing feature selection methods cannot be applied to one-class classification problems when no observations of the anomaly (or second) class are available. In this study, we propose two support vector data description (SVDD)-based feature selection methods: SVDD-radius-recursive feature elimination (RFE) and SVDD-dual-objective RFE. SVDD-radius-RFE minimizes the size of the boundary describing the normal observations, measured by its squared radius, whereas SVDD-dual-objective-RFE obtains a compact description in the dual space of SVDD. Experimental results on both simulated and real-life datasets demonstrate that the proposed methods outperform existing support vector machine RFE methods, even for classification problems in which only a few anomalous observations are available.
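To illustrate the general idea of backward elimination driven by an SVDD compactness criterion, the sketch below uses scikit-learn's OneClassSVM as a stand-in for SVDD (with an RBF kernel the two formulations are closely related) and ranks features by a dual-objective-style criterion. This is a minimal illustration, not the paper's exact algorithm: the function names (svdd_dual_criterion, svdd_rfe) and the parameters gamma, nu, and n_keep are assumptions made for the example.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics.pairwise import rbf_kernel

def svdd_dual_criterion(X, gamma=0.1, nu=0.05):
    """Fit a one-class SVM (stand-in for SVDD with an RBF kernel) and return
    a compactness criterion: the quadratic term alpha' K alpha of the SVDD
    dual, with the multipliers normalised to sum to one as in the SVDD dual."""
    clf = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X)
    alpha = np.ravel(clf.dual_coef_)
    alpha = alpha / alpha.sum()            # enforce sum(alpha) = 1
    K = rbf_kernel(clf.support_vectors_, gamma=gamma)
    # With an RBF kernel K(x, x) = 1, so the SVDD dual objective is
    # 1 - alpha' K alpha; a larger alpha' K alpha therefore corresponds
    # to a smaller dual objective, i.e., a more compact description.
    return float(alpha @ K @ alpha)

def svdd_rfe(X, n_keep, gamma=0.1, nu=0.05):
    """Backward elimination (RFE-style): repeatedly drop the feature whose
    removal leaves the most compact description (largest criterion)."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:          # assumes n_keep >= 1
        scores = []
        for j in remaining:
            cols = [c for c in remaining if c != j]
            scores.append(svdd_dual_criterion(X[:, cols], gamma, nu))
        drop = remaining[int(np.argmax(scores))]
        remaining.remove(drop)
    return remaining
```

In this sketch the feature whose removal most reduces the (approximate) dual objective is eliminated at each step, mirroring the abstract's goal of a compact description; the paper's actual radius- and dual-objective-based ranking criteria may differ in detail.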
