Abstract

The calculation of a low-rank approximation to a matrix is fundamental to many algorithms in computer vision and other fields. One of the primary tools used for calculating such low-rank approximations is the Singular Value Decomposition, but this method is not applicable when there are outliers or missing elements in the data. Unfortunately, this is often the case in practice. We present a method for low-rank matrix approximation which is a generalization of the Wiberg algorithm. Our method calculates the rank-constrained factorization that minimizes the L1 norm of the residual, and does so in the presence of missing data. This is achieved by exploiting the differentiability of linear programs, and results in an algorithm that can be efficiently implemented using existing optimization software. We show the results of experiments on synthetic and real data.
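
The paper's contribution is a Wiberg-style algorithm that exploits the differentiability of linear programs; as a rough illustration of the objective it targets, the sketch below factorizes a matrix under the L1 norm with missing entries by simple alternating L1 regressions, each posed as a linear program via SciPy's `linprog`. This is a baseline sketch of the problem setting, not the authors' algorithm; the function names, the mask `W`, and all parameters are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog


def l1_regression(A, b):
    """Solve min_x ||A @ x - b||_1 as a linear program.

    Variables are [x, t]: slack t bounds |A x - b| elementwise,
    and the sum of the slacks is minimized.
    """
    k, r = A.shape
    c = np.concatenate([np.zeros(r), np.ones(k)])
    A_ub = np.block([[A, -np.eye(k)],
                     [-A, -np.eye(k)]])
    b_ub = np.concatenate([b, -b])
    bounds = [(None, None)] * r + [(0, None)] * k
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:r]


def l1_factorize(M, W, rank, iters=20, seed=0):
    """Alternate L1 regressions over the observed entries of M (boolean mask W)."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((rank, n))
    for _ in range(iters):
        for j in range(n):                      # update columns of V with U fixed
            obs = W[:, j]
            if obs.sum() >= rank:
                V[:, j] = l1_regression(U[obs], M[obs, j])
        for i in range(m):                      # update rows of U with V fixed
            obs = W[i, :]
            if obs.sum() >= rank:
                U[i] = l1_regression(V[:, obs].T, M[i, obs])
    return U, V


if __name__ == "__main__":
    # Toy data: rank-2 matrix with sparse gross outliers and ~20% missing entries.
    rng = np.random.default_rng(1)
    M_true = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 15))
    M = M_true.copy()
    M[rng.random(M.shape) < 0.05] += 10.0
    W = rng.random(M.shape) > 0.2
    U, V = l1_factorize(M, W, rank=2)
    err = np.abs((M_true - U @ V)[W]).mean()
    print(f"mean abs error on observed entries: {err:.3f}")
```

Unlike the paper's approach, this naive alternation has no guarantee of fast convergence; it only serves to make the L1, missing-data factorization objective concrete.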
