Abstract

Linear Discriminant Analysis (LDA) is one of the most popular approaches for supervised feature extraction and dimension reduction. However, the computation of LDA involves the eigendecomposition of dense matrices, which is time-consuming for large-scale problems. In this paper, we present a novel algorithm called Rayleigh–Ritz Discriminant Analysis (RRDA) for efficiently solving LDA. While much of the prior research focuses on transforming the generalized eigenvalue problem into a least squares formulation, our method is instead based on the well-established Rayleigh–Ritz framework for general eigenvalue problems and seeks to solve the generalized eigenvalue problem of LDA directly. By exploiting the structure of LDA problems, we design customized and highly efficient subspace expansion and extraction strategies for the Rayleigh–Ritz procedure. To reduce the storage requirements and computational complexity of RRDA for high-dimensional, low-sample-size data, we also establish an equivalent reduced model of RRDA. Practical implementations and convergence results of our method are also discussed. Our experimental results on several real-world data sets demonstrate the effectiveness of the proposed algorithm.
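For context, the generalized eigenvalue problem underlying classical LDA mentioned above can be sketched as follows. This is a minimal illustration of the standard formulation S_b w = λ S_w w (between-class vs. within-class scatter), not of the RRDA algorithm itself; the function name and the regularization constant are illustrative choices.

```python
import numpy as np
from scipy.linalg import eigh

def lda_directions(X, y, n_components):
    """Classical LDA: solve the generalized eigenproblem S_b w = lam * S_w w.

    Illustrative sketch of the problem the paper addresses; RRDA instead
    solves this eigenproblem via a Rayleigh-Ritz subspace procedure.
    """
    classes = np.unique(y)
    n_features = X.shape[1]
    mean_all = X.mean(axis=0)
    S_w = np.zeros((n_features, n_features))  # within-class scatter
    S_b = np.zeros((n_features, n_features))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
        d = (mc - mean_all).reshape(-1, 1)
        S_b += len(Xc) * (d @ d.T)
    # small ridge term (assumed here) keeps S_w positive definite,
    # which dense solvers require in the high-dimensional regime
    S_w += 1e-6 * np.eye(n_features)
    # scipy's eigh handles the symmetric-definite generalized problem;
    # this dense decomposition is the costly step for large problems
    vals, vecs = eigh(S_b, S_w)
    # keep the directions with the largest eigenvalues
    return vecs[:, ::-1][:, :n_components]
```

Even this small sketch shows where the cost lies: `eigh` on dense scatter matrices scales cubically in the number of features, which motivates subspace methods such as the one proposed in the paper.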
