Abstract

Linear discriminant analysis (LDA) and its extensions form a group of classical dimensionality-reduction methods for supervised learning. However, when some classes lie far from the others, LDA may fail to find the optimal directions because the between-class scatter averages over all class pairs. Moreover, LDA variants are typically time-consuming on high-dimensional problems, since they require solving a generalized eigenvalue problem. In this paper, a multiple between-class linear discriminant analysis (MBLDA) is proposed for dimensionality reduction. MBLDA finds the transformation directions by approximating the solution to a min-max programming problem, yielding good separability in the reduced space and a fast learning speed on high-dimensional problems. It is proved theoretically that the proposed method can handle the special generalized eigenvalue problem by solving an underdetermined homogeneous system of linear equations. Experimental results on artificial and benchmark datasets show that MBLDA not only reduces the dimension to a level with strong linear discriminability but also learns quickly.
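The MBLDA algorithm itself is not specified in the abstract, but the classical LDA baseline it builds on can be sketched as follows: form the within-class and between-class scatter matrices and solve the generalized eigenproblem S_b w = λ S_w w. This is a minimal illustration assuming NumPy; the function name `lda_directions` and the ridge parameter `reg` (added so S_w stays invertible in high dimensions, one of the costs the abstract alludes to) are illustrative choices, not part of the paper.

```python
import numpy as np

def lda_directions(X, y, n_components=1, reg=1e-6):
    """Classical LDA directions via the generalized eigenproblem
    S_b w = lambda * S_w w, solved here as eig(inv(S_w) @ S_b)."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))  # within-class scatter
    S_b = np.zeros((d, d))  # averaged between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
        S_b += len(Xc) * np.outer(mc - mean, mc - mean)
    # small ridge term keeps S_w invertible when d is large
    evals, evecs = np.linalg.eig(np.linalg.solve(S_w + reg * np.eye(d), S_b))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_components]]

# two well-separated Gaussian classes in 2-D
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 0.5, (50, 2)),
               rng.normal([4, 4], 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = lda_directions(X, y, n_components=1)
Z = X @ W  # data reduced to one dimension
```

Solving this dense d-by-d eigenproblem costs roughly O(d^3), which is the high-dimensional bottleneck the abstract says MBLDA avoids by working with an underdetermined homogeneous linear system instead.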
