Abstract

When the mean vectors and covariance matrices of two classes are available in a binary classification problem, Lanckriet et al. [6] propose a minimax approach that finds a linear classifier minimizing the worst-case (maximum) misclassification probability. In this paper, we extend the minimax approach to multiclass classification, where the number m of classes may exceed two. We assume that the mean vectors and covariance matrices of all classes are available, but make no further assumptions about the class-conditional distributions. We then formulate the problem of finding linear classifiers that minimize the worst-case misclassification probability ᾱ. Since no efficient algorithm for this problem is known, we introduce the maximum pairwise misclassification probability β in place of ᾱ. We show that β is a lower bound on ᾱ and a good approximation of ᾱ when m or ᾱ is small. We formulate the problem of finding linear classifiers that minimize β, establish some of its basic properties, and transform it into a parametric second-order cone programming (SOCP) problem. We propose an algorithm that solves this problem by exploiting its structural properties. Preliminary numerical experiments confirm that the classifiers computed by our method perform very well on benchmark problems.
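
For context, the worst-case probability ᾱ in such minimax formulations is controlled through a Chebyshev-type bound of the kind used by Lanckriet et al. [6]; the sketch below restates that standard bound as background and is not a result specific to this paper. For a random vector x with mean μ and positive definite covariance Σ, and a halfspace {x : aᵀx ≥ b} with aᵀμ < b,

    \sup_{x \sim (\mu,\Sigma)} \Pr\{ a^\top x \ge b \} \;=\; \frac{1}{1 + d^2},
    \qquad
    d^2 \;=\; \frac{(b - a^\top \mu)^2}{a^\top \Sigma\, a},

so requiring the worst-case misclassification probability of this halfspace to be at most α is equivalent to the second-order cone constraint b - a^\top \mu \ge \kappa(\alpha)\,\sqrt{a^\top \Sigma\, a} with \kappa(\alpha) = \sqrt{(1-\alpha)/\alpha}.

The following is a minimal sketch, in Python with cvxpy (a choice made here purely for illustration, not tooling used in the paper), of the binary minimax probability machine of Lanckriet et al. [6] that this paper generalizes; the multiclass pairwise-β formulation and the parametric SOCP algorithm proposed in the paper are not reproduced here.

    import numpy as np
    import cvxpy as cp

    def binary_mpm(mu1, S1, mu2, S2):
        """Binary minimax probability machine (Lanckriet et al. [6]), sketch only.

        Given only class means and (positive definite) covariances, find a, b
        such that the worst-case misclassification probability of the rule
        "assign class 1 iff a^T x >= b" is minimized.
        """
        n = mu1.shape[0]
        # Factorizations L with L L^T = Sigma, so that ||L^T a|| = sqrt(a^T Sigma a).
        L1 = np.linalg.cholesky(S1)
        L2 = np.linalg.cholesky(S2)

        a = cp.Variable(n)
        # SOCP: minimize ||Sigma1^{1/2} a|| + ||Sigma2^{1/2} a||  s.t.  a^T (mu1 - mu2) = 1
        prob = cp.Problem(
            cp.Minimize(cp.norm(L1.T @ a, 2) + cp.norm(L2.T @ a, 2)),
            [a @ (mu1 - mu2) == 1],
        )
        prob.solve()

        a_opt = a.value
        kappa = 1.0 / prob.value                    # optimal margin parameter kappa*
        alpha_wc = 1.0 / (1.0 + kappa ** 2)         # worst-case misclassification probability
        b_opt = a_opt @ mu1 - kappa * np.linalg.norm(L1.T @ a_opt)
        return a_opt, b_opt, alpha_wc

    # Toy usage with synthetic moments (hypothetical numbers, for illustration only).
    mu1, mu2 = np.array([1.0, 1.0]), np.array([-1.0, -1.0])
    S1, S2 = np.eye(2), 0.5 * np.eye(2)
    a, b, alpha = binary_mpm(mu1, S1, mu2, S2)
    print("a =", a, " b =", b, " worst-case error <=", alpha)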
