Abstract

Dimension reduction is critical in many areas of pattern classification and machine learning, and many algorithms have been proposed. The Pairwise Covariance-preserving Projection Method (PCPM) is an effective dimension reduction method that maximizes class discrimination while approximately preserving the pairwise class covariances. A shortcoming of PCPM is that it can only be applied when all labels are given; it is therefore a purely supervised method. Semi-supervised learning has attracted much attention in recent years because it can utilize both labeled and unlabeled data. In this paper, we extend PCPM to the semi-supervised setting. The labeled data points are used to maximize the separability between different classes, while the unlabeled data points are used to estimate the intrinsic geometric structure of the data. Specifically, we aim to learn a discriminant function that is as smooth as possible on the data manifold. The resulting optimization problem can be solved efficiently by eigenvalue decomposition. Experimental results on several datasets demonstrate the effectiveness of our method.

Keywords: Dimension reduction; Semi-supervised classification
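The abstract does not give the concrete PCPM objective, but the recipe it describes (labeled points drive class separability, unlabeled points contribute a manifold-smoothness term, and the projection comes from an eigenvalue decomposition) can be illustrated with a minimal sketch. The between-class scatter matrix, the k-nearest-neighbor graph Laplacian, the regularization weight mu, and the helper semi_supervised_projection below are illustrative stand-ins, not the paper's actual formulation.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def semi_supervised_projection(X, y, labeled_mask, d_out, k=5, mu=1.0):
    """Sketch of a semi-supervised linear dimension reduction.

    Labeled points contribute class-separability information
    (between-class scatter); all points contribute a graph-Laplacian
    smoothness term that captures the data's geometric structure.
    """
    n, d = X.shape

    # Between-class scatter from the labeled subset only.
    Xl, yl = X[labeled_mask], y[labeled_mask]
    mean_all = Xl.mean(axis=0)
    Sb = np.zeros((d, d))
    for c in np.unique(yl):
        Xc = Xl[yl == c]
        diff = (Xc.mean(axis=0) - mean_all)[:, None]
        Sb += len(Xc) * diff @ diff.T

    # k-NN graph over ALL points (labeled + unlabeled) and its
    # unnormalized Laplacian, encoding the intrinsic geometry.
    W = kneighbors_graph(X, k, mode='connectivity', include_self=False)
    W = 0.5 * (W + W.T).toarray()           # symmetrize
    L = np.diag(W.sum(axis=1)) - W          # graph Laplacian

    # Maximize tr(P^T Sb P) subject to P^T (X^T L X + mu I) P = I,
    # i.e. discriminative directions that vary smoothly on the graph.
    # This is a generalized eigenvalue problem.
    reg = X.T @ L @ X + mu * np.eye(d)
    evals, evecs = eigh(Sb, reg)
    P = evecs[:, np.argsort(evals)[::-1][:d_out]]  # top-d_out directions
    return X @ P, P
```

As in the paper's description, only the final step is an eigendecomposition, so the sketch scales with the data dimensionality rather than requiring iterative optimization; the true PCPM criterion would replace the between-class scatter term with its pairwise covariance-preserving objective.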

