Abstract

Features in real-world data are often redundant or erroneous, and the role of feature selection (FS) in handling such features cannot be ignored in computational learning. The two most commonly used objectives for FS are maximisation of classification accuracy and minimisation of the number of selected features. This paper presents an Elitism-based Multi-objective Differential Evolution algorithm for FS; its novelty lies in a search process that uses the Minkowski Score (MS) and simultaneously optimises three objectives. The MS is included as the third objective to keep track of feature subsets that can still produce good classification results even when the average accuracy is poor. Because of its fast learning speed and high efficiency, the Extreme Learning Machine is used as the classifier within this multi-objective approach. Twenty-one benchmark datasets are considered for performance evaluation, and the selected feature subsets are tested using 10-fold cross-validation. A comparative analysis of the proposed approach against two classical models, three single-objective algorithms, and four multi-objective algorithms is carried out to test the efficacy of the model.
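The sketch below (not the authors' code) illustrates how one candidate feature subset could be scored on the three objectives named in the abstract: classification error (accuracy maximisation), number of selected features, and Minkowski Score. The Extreme Learning Machine is replaced here by scikit-learn's k-nearest-neighbour classifier as a stand-in, and the Minkowski Score is computed with the standard pair-counting definition ||T - S|| / ||T|| over co-membership matrices; both choices, and the function names, are assumptions rather than details taken from the paper.

```python
# Hedged sketch: three-objective evaluation of a binary feature mask.
# Assumptions: k-NN stands in for the Extreme Learning Machine; the
# Minkowski Score uses the pair-counting co-membership definition.
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.neighbors import KNeighborsClassifier


def minkowski_score(y_true, y_pred):
    """Pair-counting Minkowski Score: 0 is a perfect match, larger is worse."""
    t = (y_true[:, None] == y_true[None, :]).astype(float)  # co-membership in truth
    s = (y_pred[:, None] == y_pred[None, :]).astype(float)  # co-membership in prediction
    return np.linalg.norm(t - s) / np.linalg.norm(t)


def evaluate_subset(mask, X, y, cv=10):
    """Return the three objective values (all to be minimised) for a boolean mask."""
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():                       # an empty subset is infeasible
        return np.inf, np.inf, np.inf
    y = np.asarray(y)
    clf = KNeighborsClassifier()             # stand-in classifier (paper uses ELM)
    y_pred = cross_val_predict(clf, X[:, mask], y, cv=cv)
    error = np.mean(y_pred != y)             # 1 - accuracy
    n_features = mask.sum() / mask.size      # normalised feature-subset size
    ms = minkowski_score(y, y_pred)
    return error, n_features, ms
```

A multi-objective differential evolution loop would then mutate and recombine such masks and retain the non-dominated (elite) candidates with respect to these three values.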
