Abstract

The minimax probability machine (MPM) is a well-established discriminative classifier built on prior knowledge of the class distributions, and it has been widely applied and studied in many fields. The core idea of the MPM is to directly bound the classification accuracy by minimizing the worst-case (maximum) probability of misclassification. However, the MPM includes no regularization term in the construction of its separating hyperplane, and it requires solving a large-scale second-order cone programming (SOCP) problem, which greatly limits its development and application. In this paper, to improve the performance of the MPM, we propose a novel binary classification method called the regularized twin minimax probability machine classifier (TMPMC). The TMPMC constructs two non-parallel hyperplanes for the final classification by solving two smaller SOCP problems. For each hyperplane, the method is theoretically grounded in minimizing the worst-case (maximum) probability of misclassification for one class of samples while keeping the distance to the other class as large as possible. The approach is first derived as a linear method and subsequently extended to kernel-based strategies for nonlinear classification. Additionally, we extend the TMPMC to regression problems and propose a new regularized twin minimax probability machine regression (TMPMR). Experimental results on several datasets show that our methods are competitive with other algorithms in terms of generalization performance.
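The worst-case idea underlying the MPM can be made concrete with a small numerical sketch. In the classical MPM (the baseline this paper builds on, not the twin formulation proposed here), the hyperplane a^T x = b is obtained by minimizing sqrt(a^T S+ a) + sqrt(a^T S- a) subject to a^T (m+ - m-) = 1, where m± and S± are the class means and covariances; the optimal value gives the worst-case error bound. The sketch below solves this with a general-purpose solver on synthetic toy data; all data, variable names, and solver choices are illustrative assumptions, not the paper's own experiments.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Two well-separated Gaussian classes (toy data, illustrative only)
Xp = rng.normal([2.0, 2.0], 0.5, size=(200, 2))
Xn = rng.normal([-2.0, -2.0], 0.5, size=(200, 2))

mu_p, mu_n = Xp.mean(axis=0), Xn.mean(axis=0)
Sp, Sn = np.cov(Xp.T), np.cov(Xn.T)

def objective(a):
    # Sum of class-conditional deviations along direction a:
    # sqrt(a^T S+ a) + sqrt(a^T S- a)
    return np.sqrt(a @ Sp @ a) + np.sqrt(a @ Sn @ a)

# Normalization constraint a^T (mu_p - mu_n) = 1 from the MPM formulation
cons = {"type": "eq", "fun": lambda a: a @ (mu_p - mu_n) - 1.0}
res = minimize(objective, x0=np.ones(2), constraints=[cons])
a = res.x

# Offset b sits between the projected class means, weighted by each
# class's deviation along a; kappa yields the worst-case error bound
sp, sn = np.sqrt(a @ Sp @ a), np.sqrt(a @ Sn @ a)
b = a @ mu_p - sp / (sp + sn)
kappa = 1.0 / (sp + sn)  # worst-case misclassification prob <= 1/(1 + kappa^2)

pred_p = Xp @ a - b >= 0  # positives should land on the positive side
pred_n = Xn @ a - b < 0   # negatives on the negative side
```

On this easily separable toy problem the learned hyperplane classifies nearly all points correctly; the TMPMC described in the abstract replaces this single hyperplane with two non-parallel ones obtained from two smaller problems.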
