Abstract

The proximal support vector machine via generalized eigenvalues (GEPSVM) is a well-known pattern classification method. GEPSVM, however, is sensitive to outliers because it relies on the squared L2-norm distance criterion. To address this problem, we propose a robust variant, GEPSVML1, which replaces the squared L2-norm with an L1-norm distance. Since the resulting L1-norm ratio objective is challenging to optimize, we develop an iterative algorithm for it, embedded in an efficient optimization framework that also accommodates related problems. A key contribution of this paper is a theoretical analysis of the algorithm's convergence. We further observe that L1-norm distance-based methods can still yield unsatisfactory recognition results on real-world data containing outliers. We therefore generalize GEPSVML1 by replacing the L1-norm distance with an Lp-norm distance, obtaining GEPSVMLp, a robust counterpart of both GEPSVML1 and GEPSVM. We also tune GEPSVMLp's parameters to balance training time against classification accuracy, which is especially important on larger datasets. Experiments across numerous settings indicate that GEPSVMLp is more efficient and robust than competing methods. Overall, this work demonstrates the importance of robust pattern classification in the presence of outliers and provides a practical solution for real-world applications.
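To make the iterative scheme concrete, the following is a minimal sketch of one common way such an L1-norm ratio problem can be attacked: an iteratively reweighted procedure in which fixed weights turn each absolute-value term into a quadratic one, so that every iteration reduces to a standard generalized eigenvalue problem. The objective min_z Σ_i |a_iᵀ z| / Σ_j |b_jᵀ z| (rows of A and B are bias-augmented samples of the two classes), the function name, and all details below are illustrative assumptions, not the authors' exact GEPSVML1 algorithm.

```python
import numpy as np
from scipy.linalg import eigh

def gepsvm_l1_plane(A, B, n_iter=50, eps=1e-8):
    """Hypothetical iteratively reweighted sketch of an L1-norm ratio
    minimization  min_z sum_i |a_i^T z| / sum_j |b_j^T z|.

    A, B : arrays of shape (m1, d) and (m2, d); each row is a sample
    augmented with a trailing 1 (bias), so z = (w, b) encodes a plane.
    """
    d = A.shape[1]
    rng = np.random.default_rng(0)
    z = rng.standard_normal(d)
    z /= np.linalg.norm(z)
    for _ in range(n_iter):
        # Since |t| = t^2 / |t|, holding the weights 1/|t| fixed turns
        # each L1 term into a weighted quadratic form.
        da = 1.0 / np.maximum(np.abs(A @ z), eps)
        db = 1.0 / np.maximum(np.abs(B @ z), eps)
        G = A.T @ (da[:, None] * A)  # weighted scatter, class 1
        H = B.T @ (db[:, None] * B)  # weighted scatter, class 2
        # The smallest generalized eigenvector of G z = lambda H z
        # minimizes the weighted ratio (eigh returns ascending order).
        _, vecs = eigh(G, H + eps * np.eye(d))
        z_new = vecs[:, 0]
        z_new /= np.linalg.norm(z_new)
        s = 1.0 if z_new @ z >= 0 else -1.0  # fix the sign ambiguity
        z_new *= s
        if np.linalg.norm(z_new - z) < 1e-6:
            return z_new
        z = z_new
    return z
```

Under this sketch, the plane for class 1 should pass close to class-1 samples (small |A z|) while staying far from class-2 samples (large |B z|); the Lp-norm generalization described above would analogously use weights |t|^(p-2) in place of 1/|t|.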
