Abstract

Matrix-based features can provide valid and interpretable information for matrix-based data such as images. Matrix-based kernel principal component analysis (MKPCA) is a method for extracting such matrix-based features, which are useful for both dimensionality reduction and spatial statistical analysis of an image. However, the efficiency of MKPCA is severely limited by the dimension of the given matrix data and the size of the training set. In this paper, an incremental method for extracting features from a matrix-based dataset is proposed. The method is methodologically consistent with MKPCA and improves efficiency by incrementally updating the MKPCA projection matrix through rotation of the current subspace. The performance of the proposed method is evaluated in several experiments on both point and image datasets.

Highlights

  • Subspace analysis is helpful in computer vision [1, 2], data modeling problems [3], and social network analysis [4]

  • In order to measure the accuracy of the proposed method in approximating matrix-based kernel principal component analysis (MKPCA) and to assess the quality of the solution objectively, a distance measure based on the angles between principal kernel components is employed: θ_w = arccos(|v_w · v_w*|) (17)

  • We investigate how the number of original data points affects the effectiveness of incremental matrix-based KPCA (IMKPCA). The measure in equation (17) is used as the distance between the subspace computed by IMKPCA and the ground truth computed by MKPCA (see the sketch after this list)
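
For concreteness, the following is a minimal sketch of how the angle measure in equation (17) could be computed with NumPy. The function and variable names (subspace_angle, v_incremental, v_exact) are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def subspace_angle(v_incremental, v_exact):
    """Angle theta_w = arccos(|v_w . v_w*|) between a kernel principal
    component estimated incrementally and its batch (ground-truth) counterpart.
    Taking the absolute value makes the measure invariant to the sign
    ambiguity of eigenvectors."""
    v1 = v_incremental / np.linalg.norm(v_incremental)
    v2 = v_exact / np.linalg.norm(v_exact)
    cos_theta = np.clip(abs(np.dot(v1, v2)), 0.0, 1.0)  # guard against round-off
    return np.arccos(cos_theta)
```

A smaller angle indicates that the incremental solution is closer to the exact MKPCA solution.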


Summary

Introduction

Subspace analysis is helpful in computer vision [1, 2], data modeling problems [3], and social network analysis [4]. To enhance the modeling capability of PCA, vector-based kernel PCA (KPCA) [10, 11] was proposed. It improves the performance of PCA in modeling by nonlinearly mapping the data from the original space to a very high-dimensional feature space, the so-called reproducing kernel Hilbert space (RKHS). This paper proposes an incremental matrix-based KPCA (IMKPCA) method for approximating the traditional MKPCA with less computation time and memory usage while extracting kernel principal vectors to a specified accuracy.
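
As background, here is a minimal sketch of standard batch, vector-based KPCA with an RBF kernel, computed by eigendecomposing the centered kernel matrix. It illustrates the baseline notion of kernel principal components, not the authors' incremental matrix-based algorithm, and the parameter choices (gamma, n_components) are assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kpca(X, n_components=2, gamma=1.0):
    """Batch kernel PCA: eigendecompose the centered kernel matrix and
    project the training data onto the leading kernel principal components."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n  # center in feature space
    eigvals, eigvecs = np.linalg.eigh(Kc)                # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]       # pick the largest ones
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas  # projections of the training points
```

Because the kernel matrix is n-by-n, both its storage and eigendecomposition grow quickly with the training-set size, which is the bottleneck the incremental approach is designed to alleviate.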

Preliminaries
Incremental Matrix-Based Kernel Principal Component Analysis
Experiments
Conclusion