Abstract

Two-dimensional linear discriminant analysis (2DLDA) is an effective matrix-based supervised dimensionality-reduction method that operates on 2D data directly. However, because 2DLDA is built on the squared Frobenius norm, it magnifies the influence of outliers and noise. To overcome this sensitivity, this paper investigates a two-dimensional capped l2,1-norm linear discriminant analysis, called 2DCLDA. The capped l2,1-norm enables 2DCLDA to identify outliers and suppress the effect of noise effectively. To avoid singularity and ensure stable performance, a regularization term is also added to the between-class scatter. 2DCLDA is solved through a series of generalized eigenvalue problems, where each subproblem can be viewed as a weighted 2DLDA with regularization. It is proved that the proposed iterative algorithm monotonically decreases the objective of 2DCLDA. Finally, 2DCLDA is compared with related approaches on several face-image databases, and the experimental results demonstrate its superiority, especially on noisy data.
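The iteration the abstract describes — reweight each sample under the capped l2,1-norm, then solve a regularized generalized eigenvalue problem (a weighted 2DLDA step) — can be sketched as follows. This is a hypothetical simplification, not the paper's exact formulation: the scatter definitions, the placement of the regularizer, and all names (`capped_2dlda`, `eps`, `delta`) are assumptions for illustration.

```python
import numpy as np
from scipy.linalg import eigh

def capped_2dlda(X, y, d=2, eps=1.0, delta=1e-3, n_iter=10, seed=0):
    """Hypothetical sketch of a capped-l2,1-norm 2DLDA iteration.

    X : (n, a, b) array of n matrix samples; y : (n,) class labels.
    Returns a (b, d) right-projection matrix W.
    """
    rng = np.random.default_rng(seed)
    n, a, b = X.shape
    classes = np.unique(y)
    W = np.linalg.qr(rng.standard_normal((b, d)))[0]  # random orthonormal start
    for _ in range(n_iter):
        means = {c: X[y == c].mean(axis=0) for c in classes}
        gmean = X.mean(axis=0)
        # Capped l2,1 reweighting: samples whose projected residual exceeds
        # eps are treated as outliers and get zero weight (capped); the rest
        # get the usual 1/(2*||r_i||) half-quadratic weight.
        r = np.array([np.linalg.norm((X[i] - means[y[i]]) @ W) for i in range(n)])
        w = np.where(r <= eps, 1.0 / (2.0 * np.maximum(r, 1e-12)), 0.0)
        # Weighted within-class and between-class scatter matrices (b x b).
        Sw = sum(w[i] * (X[i] - means[y[i]]).T @ (X[i] - means[y[i]])
                 for i in range(n))
        Sb = sum((y == c).sum() * (means[c] - gmean).T @ (means[c] - gmean)
                 for c in classes)
        # The abstract regularizes the between-class scatter; a small ridge is
        # also added to Sw here purely so the solver's B-matrix stays positive
        # definite (an implementation choice, not from the paper).
        Sb += delta * np.eye(b)
        Sw += delta * np.eye(b)
        # Generalized eigenproblem Sb v = lambda Sw v; keep top-d eigenvectors.
        vals, vecs = eigh(Sb, Sw)
        W = vecs[:, ::-1][:, :d]
    return W
```

Under this half-quadratic view, each pass is exactly a weighted, regularized 2DLDA solved in closed form, which is why the overall objective can only decrease from one iteration to the next.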
