Abstract

Feature selection (FS) remains one of the most important tasks in machine learning. Reducing the feature set helps improve classifier accuracy, but the sheer volume of data in a dataset makes selecting the requisite features a demanding task. To address this issue, a new Chaos Quasi-Oppositional-based Flamingo Search Algorithm with Simulated Annealing Algorithm (CQOFSASAA) is proposed for FS: it chooses the optimal feature subset from the dataset and thereby reduces its dimensionality. The Flamingo Search Algorithm (FSA) is employed to select the optimal feature subset, and Generalized Ring Crossover is also adopted to retain the most relevant features. Finally, a Kernel Extreme Learning Machine (KELM) classifier validates the selected features. The proposed model was evaluated on standard benchmark datasets and its results were compared with those of other methods. The experimental results confirm that the proposed CQOFSASAA attains 93.74% accuracy, 92% sensitivity, and 92.1% specificity.
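To make the overall wrapper-style pipeline concrete, the sketch below shows a generic feature-selection loop with simulated-annealing acceptance over binary feature masks, scored by a kernel classifier. It is only a minimal illustration of the general idea, not the paper's CQOFSASAA: the dataset, the cooling schedule, the single-bit perturbation, and the use of an RBF-kernel SVM as a stand-in for KELM are all assumptions made for this example.

# Minimal sketch: wrapper feature selection with simulated-annealing acceptance.
# Assumptions (not from the paper): breast-cancer dataset, RBF-kernel SVM as a
# stand-in for KELM, single-bit flips instead of flamingo search / ring crossover.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    """Cross-validated accuracy on the selected feature subset (higher is better)."""
    if not mask.any():
        return 0.0
    clf = SVC(kernel="rbf")  # kernel classifier used to score each candidate subset
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# Random initial subset (the paper instead uses chaos quasi-oppositional initialisation).
current = rng.random(n_features) < 0.5
current_fit = fitness(current)
best, best_fit = current.copy(), current_fit

temperature = 1.0
for step in range(100):
    candidate = current.copy()
    flip = rng.integers(n_features)      # perturb: toggle one feature in or out
    candidate[flip] = ~candidate[flip]
    cand_fit = fitness(candidate)
    # Simulated-annealing acceptance: always keep improvements, occasionally keep
    # worse subsets with probability exp(delta / T) to escape local optima.
    delta = cand_fit - current_fit
    if delta > 0 or rng.random() < np.exp(delta / temperature):
        current, current_fit = candidate, cand_fit
        if current_fit > best_fit:
            best, best_fit = current.copy(), current_fit
    temperature *= 0.95                  # geometric cooling schedule

print(f"selected {best.sum()} of {n_features} features, accuracy ~ {best_fit:.3f}")

In the proposed method, the single-bit perturbation above would be replaced by the flamingo search update and Generalized Ring Crossover, and the scoring classifier by KELM, with the annealing step retained to avoid premature convergence.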
