Abstract
Linear discriminant analysis (LDA) is a well-known feature extraction method in the statistical pattern recognition community. The basic idea of LDA is to find a set of optimal discriminant vectors that maximize Fisher's discriminant criterion. One major problem with Fisher's criterion is that its discriminant performance depends largely on the differences between class means. Hence, LDA may not work well on heteroscedastic problems, since it cannot exploit the discriminant information contained in the differences between class covariances. To address this, in this paper we propose a new discriminant criterion that incorporates both class mean and class covariance differences, replacing Fisher's criterion. Based on the new discriminant criterion, we propose a heteroscedastic extension of linear discriminant analysis (the HELDA method). We also propose an approximate solution method for HELDA (AHELDA) based on joint diagonalization (JD) of matrices, which reduces the computational complexity. Extensive experiments on texture classification confirm the better classification performance of our method.
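For context, the classical Fisher criterion referred to above (standard background, not the paper's new criterion, which the abstract does not state explicitly) can be written as

J(\mathbf{w}) = \frac{\mathbf{w}^{\top} \mathbf{S}_b \, \mathbf{w}}{\mathbf{w}^{\top} \mathbf{S}_w \, \mathbf{w}},
\qquad
\mathbf{S}_b = \sum_{i=1}^{c} n_i \,(\boldsymbol{\mu}_i - \boldsymbol{\mu})(\boldsymbol{\mu}_i - \boldsymbol{\mu})^{\top},
\qquad
\mathbf{S}_w = \sum_{i=1}^{c} \sum_{\mathbf{x} \in \mathcal{C}_i} (\mathbf{x} - \boldsymbol{\mu}_i)(\mathbf{x} - \boldsymbol{\mu}_i)^{\top},

where \boldsymbol{\mu}_i and n_i are the mean and size of class \mathcal{C}_i and \boldsymbol{\mu} is the global mean. Since the between-class scatter \mathbf{S}_b is built solely from class mean differences, directions that separate classes only through differing covariances receive no credit under this criterion, which is the heteroscedastic limitation the proposed HELDA criterion is designed to overcome.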