Abstract

Incremental learning has attracted increasing attention in the past decade. Because many real-world tasks are high-dimensional, dimensionality reduction is an important preprocessing step. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are two of the most widely used dimensionality reduction algorithms. However, PCA is unsupervised and is known to be ill-suited to classification tasks; LDA generally outperforms PCA when a classification problem is involved. The major shortcoming of LDA is that its performance degrades when the within-class scatter matrix is singular (the singularity problem). Recently, a modification of LDA, the Maximum Margin Criterion (MMC), was proposed to overcome the shortcomings of both PCA and LDA. Nevertheless, MMC is not suited to incremental data. This paper proposes an incremental extension of MMC, called Incremental Maximum Margin Criterion (IMMC), which updates the projection matrix as each new observation arrives, without retraining on the accumulated data. Because an approximate intermediate eigenvalue decomposition is introduced, the algorithm has low computational complexity.
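
To make the criterion concrete, below is a minimal batch sketch of the MMC projection, assuming labeled data X (n_samples x n_features) and integer labels y; the function name mmc_projection and the n_components parameter are illustrative, not from the paper. MMC seeks a projection W maximizing tr(W^T (S_b - S_w) W), so the top eigenvectors of S_b - S_w give W; the paper's IMMC would update these eigenvectors online as observations arrive rather than recomputing them from scratch as done here.

```python
import numpy as np

def mmc_projection(X, y, n_components):
    """Batch MMC: top eigenvectors of S_b - S_w (illustrative sketch)."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))  # between-class scatter
    S_w = np.zeros((d, d))  # within-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_all).reshape(-1, 1)
        S_b += Xc.shape[0] * diff @ diff.T
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
    # Eigendecomposition of the symmetric criterion matrix S_b - S_w.
    # No matrix inversion is required, so LDA's singularity problem
    # does not arise.
    eigvals, eigvecs = np.linalg.eigh(S_b - S_w)
    # Keep the eigenvectors with the largest eigenvalues.
    W = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return W  # reduce dimensionality with X @ W
```

A usage example: with X of shape (n, d), W = mmc_projection(X, y, 2) yields a d x 2 matrix, and X @ W gives the two-dimensional embedding. The incremental setting replaces the full eigendecomposition with an approximate update per new sample, which is the source of IMMC's low computational complexity.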
