Abstract
Support Vector Machines (SVMs) have gained prominence in machine learning for their capability to establish optimal decision boundaries in high-dimensional spaces, yet they can struggle to reach optimal performance because of challenges such as selecting appropriate kernel parameters, handling uncertain data, and adapting to complex decision boundaries. This paper introduces a novel hybrid approach that enhances SVM performance by integrating the Davidon-Fletcher-Powell (DFP) optimization algorithm with Elephant Herding Optimization (EHO) for hyperparameter tuning. The proposed hybrid model synergistically leverages DFP's efficiency in unconstrained optimization and EHO's exploration-exploitation balance, inspired by elephant herding behavior. The fusion of these algorithms addresses the challenges associated with traditional optimization methods and improves convergence towards the global optimum. Experimental results demonstrate the efficacy of the approach, with the tuned SVM gaining at least 3.3% in accuracy and 3.4% in efficiency. This research advances the field of metaheuristic optimization in machine learning, providing a promising avenue for effective parameter optimization in SVM applications.
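The EHO component described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the clan-updating, matriarch, and separating operators follow the standard EHO scheme, while `surrogate_error` is a hypothetical stand-in for the SVM cross-validation error over hyperparameters such as (log C, log gamma). All function names and parameter values here are illustrative assumptions.

```python
import random


def eho_minimize(objective, bounds, n_clans=2, clan_size=5,
                 alpha=0.5, beta=0.1, iters=50, seed=0):
    """Minimal Elephant Herding Optimization sketch (illustrative only).

    objective: maps a position (list of floats) to a cost to minimize.
    bounds:    list of (lo, hi) limits, one pair per dimension.
    """
    rng = random.Random(seed)
    dim = len(bounds)

    def rand_pos():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    def clip(x):
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    # The herd is split into clans, each led by its best member (matriarch).
    clans = [[rand_pos() for _ in range(clan_size)] for _ in range(n_clans)]
    best = min((p for clan in clans for p in clan), key=objective)

    for _ in range(iters):
        for clan in clans:
            clan.sort(key=objective)
            matriarch = clan[0]
            center = [sum(p[d] for p in clan) / clan_size for d in range(dim)]
            # Clan-updating operator: each elephant moves toward the matriarch.
            for i in range(1, clan_size):
                r = rng.random()
                clan[i] = clip([clan[i][d] + alpha * r * (matriarch[d] - clan[i][d])
                                for d in range(dim)])
            # Matriarch update: blend toward the clan centre (a mild variant
            # of the usual beta * center rule, chosen here for stability).
            clan[0] = clip([beta * center[d] + (1 - beta) * matriarch[d]
                            for d in range(dim)])
            # Separating operator: the worst elephant is replaced at random,
            # which keeps the search exploring new regions.
            clan.sort(key=objective)
            clan[-1] = rand_pos()
        candidate = min((p for clan in clans for p in clan), key=objective)
        if objective(candidate) < objective(best):
            best = candidate
    return best


# Hypothetical surrogate for SVM validation error over (log10 C, log10 gamma);
# in practice this would be replaced by an actual cross-validation score.
def surrogate_error(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2


best = eho_minimize(surrogate_error, bounds=[(-3, 3), (-4, 1)])
```

In the hybrid scheme the paper describes, a quasi-Newton step such as DFP could then refine the EHO candidate locally, pairing global exploration with fast local convergence.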