Abstract

The support vector machine (SVM), one of the most effective learning algorithms, has many real-world applications. The kernel type and its parameters have a significant impact on the SVM algorithm's effectiveness and performance. Choosing the feature subset is also a crucial step in machine learning, especially when working with high-dimensional data sets. Most earlier studies treated these two criteria independently. In this research, we propose a hybrid strategy based on the Harris hawks optimization (HHO) algorithm, a recently proposed metaheuristic that has been shown to perform efficiently on a range of optimization problems. The proposed method simultaneously optimizes the SVM model parameters and locates the optimal feature subset. We ran the proposed approach, HHO-SVM, on real biomedical datasets covering 17 types of cancer for Iraqi patients in 2010-2012. The experimental results demonstrate the superiority of the proposed HHO-SVM in terms of three performance metrics: classification accuracy, runtime, and number of selected features. For verification, the proposed method is compared with four well-known algorithms: the firefly (FF) algorithm, genetic algorithm (GA), grasshopper optimization algorithm (GOA), and particle swarm optimization (PSO). The proposed HHO-SVM approach achieves 99.967% average accuracy.
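The abstract describes a single solution encoding that carries both the SVM hyperparameters and a binary feature mask, scored by a fitness that trades classification error against subset size. The following is a minimal sketch of that idea; the parameter ranges, the weight `ALPHA`, and the function names are illustrative assumptions, not details taken from the paper.

```python
import random

ALPHA = 0.99       # weight on classification error (assumed value, for illustration)
N_FEATURES = 10    # dimensionality of the example dataset (assumption)

def random_candidate():
    """One HHO 'hawk': SVM hyperparameters (C, gamma) plus a binary feature mask."""
    return {
        "C": random.uniform(0.01, 100.0),              # assumed search range
        "gamma": random.uniform(1e-4, 10.0),           # assumed search range
        "mask": [random.randint(0, 1) for _ in range(N_FEATURES)],
    }

def fitness(candidate, error_rate):
    """Lower is better: combine SVM error with the fraction of selected features.

    error_rate would come from cross-validating an SVM trained with
    candidate's (C, gamma) on the features its mask selects.
    """
    n_selected = sum(candidate["mask"]) or 1  # avoid an empty feature subset
    return ALPHA * error_rate + (1 - ALPHA) * n_selected / N_FEATURES
```

In a full HHO-SVM loop, each iteration would evaluate this fitness for every hawk and move the population toward the best candidate using the HHO exploration/exploitation update rules.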
