Support Vector Machines (SVMs) have gained prominence in machine learning for their ability to establish optimal decision boundaries in high-dimensional spaces, yet their performance hinges on challenges such as selecting appropriate kernel parameters, handling uncertain data, and adapting to complex decision boundaries. This paper introduces a novel hybrid approach that enhances SVM performance by integrating the Davidon-Fletcher-Powell (DFP) optimization algorithm with Elephant Herding Optimization (EHO) for hyperparameter tuning. The proposed hybrid model synergistically leverages DFP's efficiency in unconstrained optimization and EHO's exploration-exploitation balance, inspired by elephant herding behavior. The fusion of these algorithms addresses the challenges associated with traditional optimization methods and improves convergence towards the global optimum. Experimental results demonstrate the efficacy of the approach, showing improvements of at least 3.3% in accuracy and 3.4% in efficiency over the baseline SVM. This research advances metaheuristic optimization in machine learning, providing a promising avenue for effective parameter optimization in SVM applications.
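The abstract does not specify how the two optimizers are combined, so the following is only a minimal sketch of the general idea under stated assumptions: an EHO global search over the hyperparameter space, followed by a DFP quasi-Newton refinement of the best candidate. The cross-validated SVM error is replaced here by a toy quadratic surrogate, and all function names, parameter names, and operator details (clan updates, separating operator, numerical gradients) are illustrative assumptions, not the paper's method.

```python
import numpy as np

def num_grad(f, x, eps=1e-5):
    """Central-difference numerical gradient (stand-in for analytic gradients)."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def dfp_refine(f, x0, iters=20, step=0.1):
    """Local refinement using the DFP quasi-Newton inverse-Hessian update."""
    x = x0.copy()
    H = np.eye(len(x))          # approximate inverse Hessian
    g = num_grad(f, x)
    for _ in range(iters):
        x_new = x + step * (-H @ g)
        g_new = num_grad(f, x_new)
        s, y = x_new - x, g_new - g
        sy, Hy = s @ y, H @ y
        if abs(sy) > 1e-12 and abs(y @ Hy) > 1e-12:
            # DFP update: H += s s^T / (s^T y) - (H y)(H y)^T / (y^T H y)
            H = H + np.outer(s, s) / sy - np.outer(Hy, Hy) / (y @ Hy)
        x, g = x_new, g_new
    return x

def eho_dfp(f, bounds, n_clans=3, clan_size=5, iters=30,
            alpha=0.5, beta=0.1, seed=0):
    """EHO global search over the box `bounds`, then DFP refinement."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, (n_clans * clan_size, len(lo)))
    for _ in range(iters):
        fit = np.array([f(x) for x in pop])
        for c in range(n_clans):
            idx = np.arange(c * clan_size, (c + 1) * clan_size)
            matriarch = idx[np.argmin(fit[idx])]
            center = pop[idx].mean(axis=0)
            for i in idx:
                if i == matriarch:
                    # matriarch moves toward the clan center
                    pop[i] = np.clip(beta * center + (1 - beta) * pop[i], lo, hi)
                else:
                    # clan members follow the matriarch
                    pop[i] = np.clip(pop[i] + alpha * (pop[matriarch] - pop[i])
                                     * rng.random(len(lo)), lo, hi)
            # separating operator: worst member is replaced at random
            worst = idx[np.argmax(fit[idx])]
            pop[worst] = rng.uniform(lo, hi)
    fit = np.array([f(x) for x in pop])
    best = pop[np.argmin(fit)]
    return np.clip(dfp_refine(f, best), lo, hi)

# Toy surrogate for cross-validated SVM error over (log C, log gamma);
# a real run would train and score an SVM here instead.
def surrogate_error(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

bounds = np.array([[-5.0, 5.0], [-5.0, 5.0]])
best = eho_dfp(surrogate_error, bounds)
```

In this division of labor, EHO supplies the exploration-exploitation balance across the search box while DFP's curvature estimate accelerates the final convergence near the optimum, which mirrors the complementarity the abstract claims for the hybrid.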