Abstract

Generalized eigenvalue proximal support vector machines (GEPSVMs) are a simple and effective binary classification method in which each of two nonparallel hyperplanes is as close as possible to one class and as far as possible from the other; the two hyperplanes are obtained by solving a pair of generalized eigenvalue problems. Multiview learning exploits multiple feature sets of the same data to improve learning performance. In this paper, we propose multiview GEPSVMs (MvGSVMs), which effectively combine two views by introducing a multiview co-regularization term that maximizes the consensus between the views, and transform the resulting complicated optimization problem into a simple generalized eigenvalue problem. We also propose multiview improved GEPSVMs (MvIGSVMs), which replace the ratio of the distances between the two classes and a hyperplane in MvGSVMs with their difference, leading to a simpler standard eigenvalue problem. Linear MvGSVMs and MvIGSVMs are extended to the nonlinear case by the kernel trick. Experimental results on multiple data sets show the effectiveness of our proposed approaches.
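To make the single-view building block concrete, the following is a minimal sketch of how one GEPSVM hyperplane can be fitted: minimizing the ratio ||Aw + eb||² / ||Bw + eb||² over [w; b] reduces to a generalized eigenvalue problem, whose smallest eigenvector gives the plane close to class A and far from class B. The function name `gepsvm_plane`, the Tikhonov term `delta` added to both matrices, and the synthetic test data are illustrative assumptions, not the paper's exact formulation (which also covers the multiview co-regularized and difference-based variants).

```python
import numpy as np
from scipy.linalg import eigh  # generalized symmetric eigensolver

def gepsvm_plane(A, B, delta=1e-3):
    """Sketch of one GEPSVM hyperplane [w; b]: close to rows of A, far from rows of B.

    Minimizing ||A w + e b||^2 / ||B w + e b||^2 is a Rayleigh-quotient
    problem P z = lam Q z; the minimizer is the eigenvector of the
    smallest generalized eigenvalue.
    """
    G = np.hstack([A, np.ones((A.shape[0], 1))])  # augment with bias column e
    H = np.hstack([B, np.ones((B.shape[0], 1))])
    # delta * I is an (assumed) Tikhonov regularizer keeping both matrices definite
    P = G.T @ G + delta * np.eye(G.shape[1])
    Q = H.T @ H + delta * np.eye(H.shape[1])
    vals, vecs = eigh(P, Q)        # solves P z = lam Q z, eigenvalues ascending
    z = vecs[:, np.argmin(vals)]   # smallest eigenvalue -> plane proximal to A
    return z[:-1], z[-1]           # w, b
```

Calling `gepsvm_plane(A, B)` and `gepsvm_plane(B, A)` yields the two nonparallel planes; a point is then assigned to the class whose plane is nearer in perpendicular distance |w·x + b| / ||w||. The difference-based variant described above would instead minimize ||Aw + eb||² − c·||Bw + eb||², a standard rather than generalized eigenvalue problem.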
