Abstract
Feature selection is a data processing method used to select a small number of important features from many input features and to remove irrelevant ones. Although feature selection for classification problems has been the focus of much research, few feature selection methods are available for one-class classification problems (i.e., anomaly detection). In particular, existing feature selection methods cannot be applied to one-class classification problems when no observations of the anomaly (the second class) are available. In this study, we propose two support vector data description (SVDD)-based feature selection methods: SVDD-radius-recursive feature elimination (SVDD-radius-RFE) and SVDD-dual-objective-RFE. The SVDD-radius-RFE method minimizes the size of the boundary describing the normal observations, measured by the squared radius, whereas the SVDD-dual-objective-RFE method seeks a compact description in the dual space of SVDD. Experimental results on both simulated and real-life datasets demonstrate that the proposed methods outperform existing support vector machine (SVM)-RFE methods, even for classification problems in which only a few observations of the anomaly are available.
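To make the recursive-elimination idea concrete, the sketch below shows one way such a loop could look. It is a minimal illustration, not the authors' SVDD-radius-RFE or SVDD-dual-objective-RFE algorithm: scikit-learn's OneClassSVM is used as a stand-in for SVDD, and the elimination criterion, parameter values, and function names are assumptions made for illustration only.

```python
# Minimal sketch of an RFE-style feature-ranking loop on a one-class model.
# Assumptions: OneClassSVM stands in for SVDD; the criterion (0.5 * a^T K a
# on the support vectors) and all parameters are illustrative, not the
# paper's exact formulation.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics.pairwise import rbf_kernel


def dual_objective(X, nu=0.1, gamma=0.5):
    """Fit a one-class SVM and return 0.5 * alpha^T K alpha on its support vectors."""
    model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X)
    alpha = model.dual_coef_.ravel()                      # dual coefficients of the SVs
    K = rbf_kernel(model.support_vectors_, gamma=gamma)   # kernel matrix on the SVs
    return 0.5 * alpha @ K @ alpha


def one_class_rfe(X, n_features_to_keep=2, nu=0.1, gamma=0.5):
    """Recursively drop the feature whose removal gives the most compact description."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_features_to_keep:
        scores = []
        for j in remaining:
            cols = [c for c in remaining if c != j]
            scores.append(dual_objective(X[:, cols], nu=nu, gamma=gamma))
        # eliminate the feature whose removal minimizes the objective
        remaining.pop(int(np.argmin(scores)))
    return remaining


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    informative = rng.normal(size=(200, 2))           # two informative features
    noise = rng.normal(scale=5.0, size=(200, 3))      # three irrelevant, noisy features
    X = np.hstack([informative, noise])
    print("selected feature indices:", one_class_rfe(X, n_features_to_keep=2))
```

Because the loop refits the model once per candidate feature at every elimination step, its cost grows quickly with the number of features; in practice one would typically rank features by a cheaper sensitivity measure, as RFE-style methods usually do.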