Abstract

Linear discriminant analysis (LDA) has been widely used for linear dimension reduction. However, LDA has two limitations: one of the scatter matrices must be nonsingular, and nonlinearly clustered structure is not easily captured. To overcome the problems caused by the singularity of the scatter matrices, a generalization of LDA based on the generalized singular value decomposition (GSVD) was recently developed. In this paper, we propose a nonlinear discriminant analysis based on the kernel method and the GSVD. The GSVD is applied to solve the generalized eigenvalue problem formulated in the feature space defined by a nonlinear mapping through kernel functions. Our GSVD-based kernel discriminant analysis is compared theoretically with other kernel-based nonlinear discriminant analysis algorithms. Experimental results show that our method is effective for nonlinear dimension reduction.
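To illustrate the setting, the Python sketch below (not from the paper) forms the kernel matrix and the kernel-space between-class and within-class matrices, then solves the resulting generalized eigenvalue problem with a small regularizer in place of the GSVD step proposed here; all function names, parameters, and the RBF kernel choice are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    return np.exp(-gamma * cdist(X, Y, "sqeuclidean"))


def kernel_discriminant_directions(X, y, gamma=1.0, reg=1e-8, n_components=None):
    """Kernel discriminant analysis in the feature space induced by an RBF kernel.

    NOTE: this sketch regularizes the within-class matrix and solves the
    generalized eigenvalue problem directly; the paper instead applies the
    GSVD, which avoids the explicit regularization when the matrices are singular.
    """
    n = X.shape[0]
    classes = np.unique(y)
    K = rbf_kernel(X, X, gamma)                      # n x n kernel matrix

    # Between-class (M) and within-class (N) matrices expressed through K,
    # as in standard kernel Fisher discriminant notation.
    mean_total = K.mean(axis=1, keepdims=True)
    M = np.zeros((n, n))
    N = np.zeros((n, n))
    for c in classes:
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                               # kernel columns for class c
        mean_c = Kc.mean(axis=1, keepdims=True)
        M += len(idx) * (mean_c - mean_total) @ (mean_c - mean_total).T
        Hc = np.eye(len(idx)) - np.ones((len(idx), len(idx))) / len(idx)
        N += Kc @ Hc @ Kc.T                          # centered class block

    # Generalized eigenvalue problem  M a = lambda (N + reg * I) a.
    evals, evecs = eigh(M, N + reg * np.eye(n))
    order = np.argsort(evals)[::-1]
    if n_components is None:
        n_components = len(classes) - 1
    return evecs[:, order[:n_components]], K


# Usage: project two synthetic classes onto the discriminant direction.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(3, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
A, K = kernel_discriminant_directions(X, y, gamma=0.5)
Z = K @ A                                            # reduced-dimension data
print(Z.shape)                                       # (60, 1) for two classes
```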
