Abstract
In machine learning, selecting important features is a key pre-processing step. Many large datasets contain redundant information in the form of non-informative features, and some features may be irrelevant or noisy. Feature selection is the process of obtaining a minimal feature subset that sufficiently represents the original set of features; this subset can subsequently be used for tasks such as classification, clustering, and inference. Most of the work on feature selection found in the literature is either supervised or unsupervised. In this paper, a supervised feature ranking and selection approach based on an Adaptive Neuro-Fuzzy Inference System (ANFIS) is presented. For feature ranking, keep-one-leave-the-rest and leave-one-keep-the-rest strategies are used. A greedy forward feature selection approach, which aims to select a feature subset, is also presented. The results of the proposed approaches are compared with those obtained by a well-known unsupervised neuro-fuzzy feature selection algorithm and the supervised Relief-F algorithm. The findings are quite encouraging.
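To illustrate the general idea of wrapper-style ranking and greedy forward selection described above, the following is a minimal sketch, not the paper's actual ANFIS procedure. The helper `evaluate_subset` is a hypothetical placeholder that stands in for training and scoring an ANFIS model on a candidate subset; here it uses a generic scikit-learn classifier purely for illustration.

```python
# Hypothetical sketch of keep-one-leave-the-rest ranking and greedy forward
# feature selection. evaluate_subset() is a placeholder for the ANFIS-based
# evaluation used in the paper; a generic classifier is used here instead.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def evaluate_subset(X, y, features):
    """Score a candidate feature subset (stand-in for an ANFIS evaluation)."""
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, features], y, cv=5).mean()

def rank_keep_one(X, y):
    """Keep-one-leave-the-rest ranking: score each feature in isolation."""
    return sorted(range(X.shape[1]),
                  key=lambda f: evaluate_subset(X, y, [f]), reverse=True)

def greedy_forward_selection(X, y, max_features=None):
    """Add one feature at a time, keeping the candidate that most improves the score."""
    n_features = X.shape[1]
    max_features = max_features or n_features
    selected, best_score = [], -np.inf
    remaining = set(range(n_features))
    while remaining and len(selected) < max_features:
        scores = {f: evaluate_subset(X, y, selected + [f]) for f in remaining}
        best_f, score = max(scores.items(), key=lambda kv: kv[1])
        if score <= best_score:      # stop when no candidate improves the score
            break
        selected.append(best_f)
        remaining.remove(best_f)
        best_score = score
    return selected, best_score
```

The leave-one-keep-the-rest strategy mentioned in the abstract would instead remove one feature at a time from the full set and rank features by how much the score degrades; it follows the same wrapper pattern with the subset argument complemented.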