Abstract
In this paper, we propose a novel discriminative dimension reduction (DR) method, maximin separation probability analysis (MSPA), which maximizes the minimum separation probability over all classes in the reduced low-dimensional subspace. Separation probability is a novel class separability measure that gives a lower bound on the generalization accuracy of a learned linear classifier in a binary classification problem. Unlike multiclass linear discriminant analysis (LDA), the proposed MSPA duly considers the separation of all class pairs and thus improves the subsequent classification performance. DR via MSPA leads to a nonconvex optimization problem. We develop an algorithm that finds the globally optimal solution by converting the original problem into a series of second-order cone programming (SOCP) problems. A low-computational-cost extension and a nonlinear extension of MSPA based on kernel mapping are also provided in this paper. Experimental results on 14 real-world datasets show that our methods are superior to other state-of-the-art algorithms in discriminative DR tasks.
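As an illustrative sketch only, the maximin criterion described above can be written as the following optimization problem; the notation here is assumed for exposition and is not taken from the paper:

% Sketch with assumed notation: W is the linear projection onto the reduced
% subspace, c is the number of classes, and p_{ij}(W) is the separation
% probability of classes i and j after projection.
\[
  \max_{W} \; \min_{1 \le i < j \le c} \; p_{ij}(W)
\]

MSPA then seeks the projection whose worst-separated class pair is as separable as possible, rather than optimizing an average separability measure.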