Abstract

Bilinear probabilistic principal component analysis (BPPCA) was recently introduced as a model-based dimension reduction technique for matrix data. However, BPPCA relies on the Gaussian assumption and is therefore vulnerable to outlying matrix-valued observations. In this article, we present a robust extension of BPPCA built upon a matrix variate t distribution, called tBPPCA. Like the multivariate t, this distribution offers an additional robustness tuning parameter that can downweight outliers. By introducing a Gamma-distributed latent weight variable, the matrix variate t distribution admits a hierarchical representation, from which two efficient accelerated expectation-maximization (EM)-like algorithms for parameter estimation are developed. Experiments on a number of synthetic and real datasets are conducted to study tBPPCA and compare it with several closely related competitors, including its vector-based counterpart. The results show that tBPPCA is generally more robust and accurate in the presence of outliers. Moreover, the expected latent weights under tBPPCA can be used effectively for outlier detection and are considerably more reliable than those of its vector-based counterpart, owing to tBPPCA's better robustness.
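
For intuition, the following is a minimal sketch of the Gamma scale-mixture (hierarchical) representation commonly used for a matrix variate t distribution, consistent with the latent-weight construction described above; the notation (mean M, row and column scale matrices Sigma and Psi, degrees of freedom nu, latent weight tau) is illustrative and not taken from the article.

% Hierarchical representation via a Gamma-distributed latent weight:
% conditionally on tau, the n x p matrix X is matrix normal; marginally,
% vec(X) follows a multivariate t with Kronecker-structured scale.
\begin{align*}
  X \mid \tau &\sim \mathcal{MN}_{n \times p}\!\left(M,\ \tau^{-1}\Sigma,\ \Psi\right), \\
  \tau &\sim \mathrm{Gamma}\!\left(\tfrac{\nu}{2},\ \tfrac{\nu}{2}\right),
\end{align*}
% As nu -> infinity, tau concentrates at 1 and the Gaussian (matrix normal)
% model is recovered; small conditional expectations E[tau | X] correspond to
% observations far from M and are the usual basis for weight-based outlier
% detection of the kind mentioned in the abstract.

This representation is what makes EM-type estimation convenient: treating tau as missing data yields closed-form conditional expectations of the weights in the E-step, which in turn downweight outlying matrices in the M-step updates.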
