Abstract
Principal Component Analysis (PCA) is a technique for transforming the original set of variables into a smaller set of linear combinations that account for most of the variance of the original set. Data reduction based on classical PCA is fruitless if outliers are present in the data, because the decomposed classical covariance matrix is very sensitive to outlying observations. ROBPCA is an effective PCA method combining the advantages of both projection pursuit and robust covariance estimation; its estimate is computed using the idea of the minimum covariance determinant (MCD) of the covariance matrix. The limitation of MCD arises when the covariance determinant is nearly equal to zero. This paper proposes a PCA method using the minimum vector variance (MVV) as a new robust measure to enhance the result. MVV is defined as the minimization of the sum of the squared lengths of the diagonals of a parallelotope, used to determine the location estimator and covariance matrix. The usefulness of MVV is not limited to small or low-dimensional data sets, and it applies to both non-singular and singular covariance matrices. The MVV algorithm has a lower computational complexity than the FMCD algorithm; the complexity of VV is of order O(p²).
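The contrast between the two criteria can be illustrated with a minimal sketch. In the MVV literature the vector variance of a covariance matrix S is commonly formulated as Tr(S²), which for a symmetric S equals the sum of its squared entries and is therefore computable in O(p²) operations, whereas MCD minimizes the determinant (generalized variance) of S. The example below is an illustrative assumption, not code from the paper; it shows that on a singular covariance matrix the determinant collapses to zero while the vector variance remains informative:

```python
import numpy as np

def vector_variance(S):
    # Vector variance VV = Tr(S^2); for symmetric S this is the
    # sum of squared entries, computable in O(p^2).
    return float(np.sum(S * S))

def generalized_variance(S):
    # Generalized variance (the quantity MCD minimizes): det(S).
    return float(np.linalg.det(S))

# A rank-deficient (singular) 2x2 covariance matrix.
S_singular = np.array([[2.0, 2.0],
                       [2.0, 2.0]])

print(generalized_variance(S_singular))  # 0.0: the MCD criterion degenerates
print(vector_variance(S_singular))       # 16.0: VV still measures scatter
```

This degeneracy of the determinant on singular matrices is exactly the MCD limitation the abstract mentions, and the positive vector variance in the same case reflects why MVV extends to singular covariance matrices.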