Abstract

Principal component analysis (PCA) is a popular dimension-reduction technique for vector data. Factored PCA (FPCA) is a probabilistic extension of PCA to matrix data that can substantially reduce the number of parameters in PCA while yielding satisfactory performance. However, FPCA is based on the Gaussian assumption and is thereby susceptible to outliers. Although the multivariate t distribution has a long history as a robust modeling tool for vector data, its application to matrix data remains very limited. The main reason is that the dimension of the vectorized matrix data is often very high, and the higher the dimension, the lower the breakdown point that measures robustness. To solve the robustness problem suffered by FPCA and make it applicable to matrix data, this paper proposes a robust extension of FPCA (RFPCA), built upon a t-type distribution called the matrix-variate t distribution. Like the multivariate t distribution, the matrix-variate t distribution can adaptively down-weight outliers and yield robust estimates. A fast EM-type algorithm for parameter estimation is developed. Experiments on synthetic and real-world datasets reveal that (i) RFPCA compares favorably with several closely related methods and, importantly, has a significantly higher breakdown point than its vector-based cousin, multivariate t PCA (tPCA), which makes RFPCA more suitable for matrix data; and (ii) the expected latent weights of RFPCA can be readily used for outlier detection and are much more reliable than those of tPCA. Such detection is rarely available with existing matrix-based methods, especially for gross matrix-valued outliers.
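The adaptive down-weighting mentioned above can be illustrated with the standard EM weights of a multivariate t model, w_i = (nu + p) / (nu + delta_i), where delta_i is the squared Mahalanobis distance of sample i: observations far from the bulk of the data receive small weights, which is also what makes the weights usable for outlier detection. The sketch below is illustrative only; the function name, data, and parameter values are assumptions, not taken from the paper, and it shows the vector-data (multivariate t) case rather than the matrix-variate t model RFPCA actually uses.

```python
import numpy as np

def t_weights(X, mu, Sigma, nu):
    """EM-style weights under a multivariate t model:
    w_i = (nu + p) / (nu + delta_i), where delta_i is the squared
    Mahalanobis distance of sample i from mu under covariance Sigma.
    Large delta_i (outliers) -> small weight."""
    p = X.shape[1]
    diff = X - mu
    delta = np.sum(diff @ np.linalg.inv(Sigma) * diff, axis=1)
    return (nu + p) / (nu + delta)

# Illustrative data (assumed): inliers near the origin plus one gross outlier.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(50, 2))
X[0] = [25.0, 25.0]  # gross outlier

w = t_weights(X, mu=np.zeros(2), Sigma=np.eye(2), nu=3.0)
# The outlier's weight is far below the average inlier weight,
# so it contributes little to the robust parameter estimates.
print(w[0], w[1:].mean())
```

Thresholding such weights (e.g., flagging samples whose weight falls well below the rest) is one simple way the expected latent weights could support outlier detection; the matrix-variate case replaces the Mahalanobis distance with its matrix analogue.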
