Abstract

We address the inductive classification problem by proposing a joint framework termed Adaptive Non-negative Projective Semi-Supervised Learning (ANP-SSL). Specifically, ANP-SSL explicitly integrates adaptive inductive label propagation, adaptive reconstruction weights learning, and neighborhood preserving projective nonnegative matrix factorization (PNMF). To make the label prediction results more accurate, ANP-SSL incorporates the semi-supervised data representation and classification errors into regular PNMF for joint minimization, which enables it to perform adaptive weights learning and label propagation over spatially local and part-based data representations. This differs from most existing work, which usually assigns weights and predicts labels based on the original data that often contains noise and corruptions. Moreover, existing methods usually pre-assign weights before the label estimation process, but weights learnt in such an independent step cannot be guaranteed to be optimal for the subsequent classification. The combined representation error can also improve the learnt reduced part-based representations of neighborhood preserving PNMF, which can potentially enhance the prediction results. Minimizing the classification error jointly over the neighborhood preserving nonnegative representation also makes the embedding-based classification efficient. Extensive results on several public image databases verify the effectiveness of our ANP-SSL compared with other state-of-the-art methods.
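To make the joint scheme concrete, the following is a minimal NumPy sketch of one plausible alternating procedure in this spirit: a standard PNMF multiplicative update for the nonnegative projection, Gaussian k-NN reconstruction weights computed on the reduced part-based representation, and closed-form graph-based label propagation. The specific update rules, hyper-parameters (k, sigma, alpha), and the toy data are illustrative assumptions, not the exact ANP-SSL formulation.

```python
# Hypothetical, simplified sketch of alternating "PNMF + adaptive weights +
# label propagation" steps. The rules and hyper-parameters below are
# illustrative assumptions, not the paper's exact ANP-SSL algorithm.
import numpy as np

def pnmf_update(U, XXt):
    """One standard multiplicative update for projective NMF: min ||X - U U^T X||_F^2, U >= 0."""
    numer = 2.0 * XXt @ U
    denom = U @ (U.T @ XXt @ U) + XXt @ U @ (U.T @ U) + 1e-12
    return U * numer / denom

def knn_weights(H, k=5, sigma=1.0):
    """Gaussian k-NN reconstruction weights computed on the reduced representation H (r x n)."""
    n = H.shape[1]
    d2 = np.sum((H[:, :, None] - H[:, None, :]) ** 2, axis=0)  # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]           # k nearest neighbours, skipping the point itself
        W[i, idx] = np.exp(-d2[i, idx] / (2 * sigma ** 2))
    return (W + W.T) / 2.0                          # symmetrize the weight matrix

def propagate_labels(W, Y, alpha=0.99):
    """Closed-form label propagation F = (I + alpha * L)^{-1} Y with graph Laplacian L = D - W."""
    L = np.diag(W.sum(1)) - W
    return np.linalg.solve(np.eye(W.shape[0]) + alpha * L, Y)

# Toy data: 2 classes, 20 features, 40 samples; 4 labelled samples per class.
rng = np.random.default_rng(0)
X = np.abs(np.hstack([rng.normal(2, 1, (20, 20)), rng.normal(5, 1, (20, 20))]))
Y = np.zeros((40, 2))
Y[:4, 0] = 1                                        # labelled samples of class 0
Y[20:24, 1] = 1                                     # labelled samples of class 1
XXt = X @ X.T
U = np.abs(rng.random((20, 2)))                     # nonnegative projection basis

for _ in range(50):                                 # alternate the three steps
    U = pnmf_update(U, XXt)
    W = knn_weights(U.T @ X)                        # weights on the part-based representation
    F = propagate_labels(W, Y)

print("predicted labels:", F.argmax(1))
```

Note that in this sketch the neighborhood weights are recomputed from the reduced representation U^T X at every iteration, which mirrors the idea of assigning weights and propagating labels over the part-based representation rather than over the raw, possibly noisy data.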
