Abstract
Support Vector Machine (SVM) classifiers are high-performance classification models devised to comply with the structural risk minimization principle and to exploit the kernel trick of nonlinearly mapping input data into high-dimensional feature spaces, where better-discriminating linear decision boundaries can be constructed automatically. Among the several SVM variants, Least-Squares SVMs (LS-SVMs) have recently gained increased attention, mainly because of the computationally attractive properties that follow from a modified formulation employing a sum-squared-error cost function together with equality, rather than inequality, constraints. In this work, we present a flexible hybrid approach aimed at improving the accuracy and generalization of LS-SVM classifiers as well as easing their hyperparameter calibration. The approach, named Mixtures of Weighted Least-Squares Support Vector Machine Experts, centers on the fusion of the weighted variant of LS-SVMs with Mixtures of Experts models. After formally characterizing the novel learning framework, we report simulation results on both binary and multiclass pattern classification problems, confirming the suitability of the hybrid approach for improving the performance issues considered.
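For readers unfamiliar with the modified formulation mentioned above, the standard LS-SVM primal problem (following Suykens and Vandewalle) replaces the SVM's inequality constraints with equalities and a sum-squared-error cost; the notation below is the usual textbook one and is not necessarily that of the paper:

% LS-SVM primal problem: sum-squared-error cost with equality constraints.
% (Standard formulation; symbols w, b, e_k, gamma, phi are the conventional choices, not taken from the abstract.)
\begin{aligned}
\min_{w,\,b,\,e}\quad & J(w,e) \;=\; \tfrac{1}{2}\, w^{\top} w \;+\; \tfrac{\gamma}{2} \sum_{k=1}^{N} e_k^{2} \\
\text{s.t.}\quad & y_k \left[\, w^{\top} \varphi(x_k) + b \,\right] \;=\; 1 - e_k, \qquad k = 1,\dots,N.
\end{aligned}

In the weighted LS-SVM variant on which the proposed mixture builds, each squared error term is additionally scaled by a sample-specific weight, which reduces the influence of outliers on the solution.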