Abstract

In this paper we propose a more efficient and effective algorithm for generalized discriminant analysis (GDA) that performs the Gram-Schmidt orthogonalization procedure in feature space only once, on difference vectors. The proposed method is essentially equivalent to class-incremental GDA [W. Zheng, "Class-incremental generalized discriminant analysis", Neural Computation 18, 979-1006 (2006)], since both methods search for the same nonlinear optimal discriminative vectors in the range space of the total scatter matrix and the null space of the within-class scatter matrix. However, because the proposed method does not need to compute the class means, as class-incremental GDA does, its computational cost is greatly reduced. Experiments on two standard face databases verify the effectiveness of the proposed method.
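The core operation the abstract refers to — Gram-Schmidt orthogonalization carried out implicitly in feature space on difference vectors — can be sketched with the kernel trick: each feature-space vector is kept as a coefficient vector over the mapped training samples, so inner products reduce to quadratic forms with the kernel Gram matrix and the feature map is never evaluated. The sketch below is illustrative only (the kernel choice, the use of each class's first sample as its difference reference, and all function names are assumptions, not the paper's exact construction):

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

def kernel_gram_schmidt(A, K, tol=1e-10):
    """Orthonormalize feature-space vectors given as coefficient
    columns of A (column a represents sum_i a_i * phi(x_i)); the
    inner product of columns a, b is a^T K b, so phi(x) is never
    computed explicitly.  Modified Gram-Schmidt for stability."""
    basis = []
    for j in range(A.shape[1]):
        v = A[:, j].astype(float).copy()
        for q in basis:
            v -= (q @ K @ v) * q           # remove component along q
        norm = np.sqrt(v @ K @ v)
        if norm > tol:                      # skip linearly dependent vectors
            basis.append(v / norm)
    return np.column_stack(basis)

# Toy data: 3 classes of 4 samples each (synthetic, for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(12, 5))
y = np.repeat([0, 1, 2], 4)

# Difference vectors phi(x_i) - phi(x_ref) within each class, using the
# first sample of the class as the reference: coefficients e_i - e_ref.
n = len(y)
cols = []
for c in np.unique(y):
    idx = np.flatnonzero(y == c)
    for i in idx[1:]:
        a = np.zeros(n)
        a[i], a[idx[0]] = 1.0, -1.0
        cols.append(a)
A = np.column_stack(cols)

K = rbf_kernel(X, gamma=0.5)
Q = kernel_gram_schmidt(A, K)
# Orthonormality in feature space: Q^T K Q should equal the identity.
print(np.allclose(Q.T @ K @ Q, np.eye(Q.shape[1]), atol=1e-8))
```

Note that the difference-vector coefficients involve only individual samples, never per-class averages, which is where the claimed savings over class-incremental GDA (whose construction requires class means) would come from.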
