Abstract

In machine learning, feature selection is a crucial task, since the computational cost grows exponentially with problem complexity. To reduce the dimensionality of medical datasets and the associated computational cost, researchers commonly employ multi-objective optimization approaches. Likewise, a neighbourhood centroid opposition-based learning mutation is employed to improve the population diversity of the Flamingo Search Algorithm. In this paper, neighbourhood centroid opposition-based learning (NCOBL) is integrated into a multi-objective optimization based Flamingo Search Algorithm (MOFSA) to improve classification accuracy, enhance exploration of the search space, and keep the computational cost manageable as the dataset size increases. The selected optimal feature subsets are classified using a weighted K-Nearest Neighbour classifier. The efficacy of the proposed strategy is assessed on fifteen benchmark medical datasets in terms of recall, precision, accuracy, running time, F-measure, Hamming loss, ranking loss, standard deviation, mean value error, and the number of selected features. The performance of the proposed feature selection technique is then compared with that of existing approaches. The proposed method produced the lowest mean value, standard deviation, and mean Hamming loss, and a maximum accuracy of about 99%. The experimental findings demonstrate that the proposed method can enhance classification accuracy and eliminate redundancy in large datasets.
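The abstract mentions two components that can be illustrated concretely: generating an opposite candidate through a neighbourhood centroid (NCOBL) and classifying with a distance-weighted K-Nearest Neighbour vote. The sketch below is not the authors' implementation; it is a minimal illustration assuming real-valued candidate vectors, a neighbourhood size k, Euclidean distance, and clipping of opposite points to the search bounds, with all function names chosen for illustration.

```python
import numpy as np

def ncobl_candidate(population, i, lower, upper, k=5):
    """Illustrative neighbourhood centroid opposition-based learning step.

    For individual i, take the centroid of its k nearest neighbours in the
    population and reflect the individual through that centroid:
        opposite = 2 * centroid - x_i
    Clipping to [lower, upper] is an assumption for this sketch.
    """
    x = population[i]
    dists = np.linalg.norm(population - x, axis=1)  # distance to every individual
    dists[i] = np.inf                               # exclude the individual itself
    neighbours = population[np.argsort(dists)[:k]]
    centroid = neighbours.mean(axis=0)
    opposite = 2.0 * centroid - x
    return np.clip(opposite, lower, upper)

def weighted_knn_predict(X_train, y_train, x_query, k=5, eps=1e-12):
    """Distance-weighted KNN vote: closer neighbours receive larger weights."""
    dists = np.linalg.norm(X_train - x_query, axis=1)
    idx = np.argsort(dists)[:k]
    weights = 1.0 / (dists[idx] + eps)
    classes = np.unique(y_train[idx])
    scores = {c: weights[y_train[idx] == c].sum() for c in classes}
    return max(scores, key=scores.get)

# Toy usage on random data (illustration only).
rng = np.random.default_rng(0)
pop = rng.random((20, 8))                # 20 candidate vectors in an 8-dimensional space
new_candidate = ncobl_candidate(pop, i=0, lower=0.0, upper=1.0, k=5)
X, y = rng.random((50, 8)), rng.integers(0, 2, 50)
print(weighted_knn_predict(X, y, X[0], k=3))
```

Reflecting through the neighbourhood centroid, rather than the fixed interval midpoint of classical opposition-based learning, keeps the opposite point anchored to the current region of the swarm, which is presumably how the mutation improves population diversity without discarding local information.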
