Abstract

Incremental and decremental learning are challenging tasks in semi-supervised learning. The incremental semi-supervised discriminant analysis (ISSDA) method proposed by Dhamecha et al. is an efficient method for incremental semi-supervised learning. However, one deficiency of ISSDA is that the total scatter matrix is kept fixed during incremental learning, which is unrealistic in practice. Moreover, public data sets may contain incorrectly labeled samples, so it is natural to consider the decremental problem in semi-supervised learning, i.e., removing such samples and their influence from a trained model. To the best of our knowledge, however, there are few decremental algorithms for semi-supervised discriminant analysis. The contributions of this work are as follows. First, a new incremental semi-supervised discriminant analysis method is proposed, in which the total scatter matrix and the between-class scatter matrix are updated simultaneously as new samples arrive. Second, we show how to solve the large eigenproblem of the updated total scatter matrix efficiently. Third, we propose two decremental algorithms for semi-supervised discriminant analysis. Numerical experiments demonstrate the superiority of the proposed algorithms over many state-of-the-art algorithms for semi-supervised discriminant analysis.
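The abstract does not give the update formulas themselves. As a minimal sketch of the kind of bookkeeping involved, the standard pooled-scatter identity lets a total scatter matrix be updated when a batch of new samples arrives, without revisiting the old data. This is a generic linear-algebra identity used for illustration only, not the authors' specific ISSDA update:

```python
import numpy as np

def scatter(X):
    """Total scatter matrix of the rows of X (sum of outer products of centered samples)."""
    Xc = X - X.mean(axis=0)
    return Xc.T @ Xc

def update_scatter(S_old, m_old, n_old, X_new):
    """Merge an existing (scatter, mean, count) summary with a new batch of samples.

    Uses the pooled-scatter identity:
        S = S_old + S_new + (n_old * n_new / n) * (m_old - m_new)(m_old - m_new)^T
    """
    n_new = X_new.shape[0]
    m_new = X_new.mean(axis=0)
    n = n_old + n_new
    d = (m_old - m_new).reshape(-1, 1)          # mean-shift correction term
    S = S_old + scatter(X_new) + (n_old * n_new / n) * (d @ d.T)
    m = (n_old * m_old + n_new * m_new) / n     # updated global mean
    return S, m, n

rng = np.random.default_rng(0)
X_old = rng.normal(size=(50, 5))
X_add = rng.normal(size=(10, 5))

S, m, n = update_scatter(scatter(X_old), X_old.mean(axis=0), len(X_old), X_add)
assert np.allclose(S, scatter(np.vstack([X_old, X_add])))  # matches batch recomputation
```

Running the same identity in reverse (subtracting a batch's contribution and its mean-shift term) is the analogous starting point for a decremental update that removes mislabeled samples.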
