Abstract

Numerous face recognition algorithms use principal component analysis (PCA) as the first step for dimensionality reduction (DR), followed by linear discriminant analysis (LDA). PCA is applied first because it performs DR in the minimum mean-square-error sense and achieves the most compact representation of the data. However, the resulting PCA features lack discrimination ability. To optimize classification, LDA and its variants are applied to the PCA-reduced subspace so that the transformed data achieves minimum within-class variation and maximum between-class variation. In this paper, we study the total, within-class, and between-class scatter matrices and their roles in DR or feature extraction with good discrimination ability. The number of dimensions retained in DR plays a crucial role in the subsequent discriminant analysis. We reveal some important aspects of how the recognition rate varies with different scatter matrices and their stepwise DR. Experimental results on popular face databases are provided to support our findings.
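The PCA-then-LDA pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the data is synthetic, and the number of PCA dimensions retained (`k`, the quantity the paper studies) and the number of discriminant directions (`m`) are free parameters chosen here for the example.

```python
import numpy as np

def pca(X, k):
    """Project X (n_samples x d) onto its top-k principal components."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # Right singular vectors of centered data = eigenvectors of the
    # total scatter matrix St = Xc^T Xc (note St = Sw + Sb).
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                               # d x k projection
    return Xc @ W

def lda(X, y, m):
    """Fisher LDA: maximize between-class over within-class scatter."""
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))                      # within-class scatter
    Sb = np.zeros((d, d))                      # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Eigenvectors of Sw^{-1} Sb; keep the top-m discriminant directions.
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)[:m]
    return X @ evecs[:, order].real

# Two well-separated classes in 10-D; PCA to k=5, then LDA to m=1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 10)), rng.normal(3, 1, (50, 10))])
y = np.array([0] * 50 + [1] * 50)
Z = pca(X, 5)
F = lda(Z, y, 1)
```

Applying PCA first makes `Sw` better conditioned (in face recognition the raw dimensionality typically exceeds the sample count, so `Sw` is singular), which is why the choice of `k` matters for the discriminant step.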

