Abstract

Dimensionality reduction has become a ubiquitous preprocessing step in many applications. Linear discriminant analysis (LDA) is known to be one of the most effective dimensionality reduction methods for classification. However, a main disadvantage of LDA is that the so-called total scatter matrix must be nonsingular. In many applications the scatter matrices are singular, since the data points come from a very high-dimensional space and the number of samples is typically smaller than the data dimension; this is known as the undersampled problem. Many generalized LDA methods have been proposed to overcome this singularity problem. These generalized LDA methods share a common feature: they compute the optimal linear transformation through eigen-decompositions and matrix inversions. However, eigen-decomposition is computationally expensive, and the use of matrix inverses can make the methods numerically unstable when the associated matrices are ill-conditioned. Hence, many existing LDA methods have high computational cost and potential numerical instability. In this paper we present a new orthogonal LDA method for the undersampled problem. The main features of our method are: (i) the optimal transformation matrix is obtained using only orthogonal transformations, without computing any eigen-decomposition or matrix inverse, so the method is inverse-free and numerically stable; (ii) the method is implemented with several QR factorizations and is therefore fast. The effectiveness of the new method is illustrated on several real-world data sets.
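The abstract does not spell out the algorithm, but its key claim is that the transformation can be obtained with orthogonal (QR) factorizations alone, with no eigen-decomposition or matrix inverse. As a loosely related illustration only, the sketch below implements an orthogonal-centroid-style reduction, not necessarily the method proposed in the paper: it forms a between-class factor matrix from class centroids and takes a single thin QR factorization to obtain an orthonormal projection. The function name qr_orthogonal_projection and the synthetic data are illustrative assumptions.

```python
# Minimal sketch (not the authors' exact algorithm): a QR-based orthogonal
# reduction in the spirit of the abstract. It avoids eigen-decompositions and
# matrix inverses, so it remains applicable when n_samples < n_features.
import numpy as np

def qr_orthogonal_projection(X, y):
    """X: (n_samples, n_features) data, y: (n_samples,) class labels.
    Returns G with orthonormal columns; reduce data via X @ G."""
    classes = np.unique(y)
    global_mean = X.mean(axis=0)
    # Columns of Hb: sqrt(n_i) * (class centroid - global mean).
    # Hb has rank at most n_classes - 1, so the projection can be truncated.
    Hb = np.column_stack([
        np.sqrt((y == c).sum()) * (X[y == c].mean(axis=0) - global_mean)
        for c in classes
    ])
    # Thin (economy) QR factorization; Q has orthonormal columns.
    Q, _ = np.linalg.qr(Hb)          # Q: (n_features, n_classes)
    return Q

# Usage: project undersampled data (more features than samples).
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 500))   # 30 samples, 500 features
y = np.repeat([0, 1, 2], 10)
G = qr_orthogonal_projection(X, y)
X_reduced = X @ G                    # shape (30, 3)
```

A single thin QR of a tall, skinny factor matrix costs far less than an eigen-decomposition of a full scatter matrix, which is consistent with the speed and stability argument made in the abstract.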
