Abstract

Most matrix reconstruction methods assume that the missing entries are randomly distributed in the incomplete matrix, and use the low-rank prior or its variants to make the problem well posed. In practical applications, however, missing entries are often structurally rather than randomly distributed, and cannot be handled by the rank-minimization prior alone. To remedy this, this paper introduces new matrix reconstruction models that place double priors on the latent matrix, named Reweighted Low-rank and Sparsity Priors (ReLaSP). In the proposed ReLaSP models, the matrix is regularized by a low-rank prior to exploit inter-column and inter-row correlations, and its columns (rows) are regularized by a sparsity prior under a dictionary to exploit intra-column (intra-row) correlations. Both the low-rank and sparsity priors are reweighted on the fly to further promote low-rankness and sparsity, respectively. Numerical algorithms to solve the ReLaSP models are derived via the alternating direction method under the augmented Lagrangian multiplier framework. Results on synthetic data, image restoration tasks, and seismic data interpolation show that the proposed ReLaSP models are effective in recovering matrices degraded by highly structured missing entries and various types of noise, complementing classic matrix reconstruction models that handle only randomly missing entries.
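To make the combination of priors concrete, below is a minimal illustrative sketch (not the paper's algorithm) of how a reweighted low-rank step, a reweighted sparsity step under a fixed dictionary, and a data-consistency step can be alternated to fill in structurally missing entries. The function name `relasp_sketch`, the use of a column-wise DCT as the dictionary, the proximal-style alternation, and all parameter values are assumptions for illustration; the paper's actual ADMM/augmented-Lagrangian updates and weighting rules may differ.

```python
# Illustrative sketch only: alternates a reweighted singular-value shrinkage
# (low-rank prior), a reweighted soft-threshold of column-wise DCT coefficients
# (sparsity prior under a fixed dictionary), and re-imposition of the observed
# entries. This is a simplified stand-in, not the ReLaSP solver from the paper.
import numpy as np
from scipy.fft import dct, idct

def relasp_sketch(M, mask, lam_lr=1.0, lam_sp=0.1, eps=1e-3, n_iters=100):
    """M: observed matrix (zeros at missing entries); mask: boolean array of observed entries."""
    X = M.copy()
    for _ in range(n_iters):
        # Reweighted low-rank step: shrink each singular value with a weight
        # inversely proportional to its current magnitude (promotes low rank).
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        w = 1.0 / (s + eps)
        s = np.maximum(s - lam_lr * w, 0.0)
        X = (U * s) @ Vt
        # Reweighted sparsity step: soft-threshold column-wise DCT coefficients,
        # with weights that penalize small coefficients more strongly.
        C = dct(X, axis=0, norm='ortho')
        wc = 1.0 / (np.abs(C) + eps)
        C = np.sign(C) * np.maximum(np.abs(C) - lam_sp * wc, 0.0)
        X = idct(C, axis=0, norm='ortho')
        # Data consistency: keep the observed entries fixed.
        X[mask] = M[mask]
    return X
```

The point of the sketch is the interplay of the two reweighted priors: the low-rank step couples columns and rows globally, while the transform-domain thresholding regularizes each column internally, which is what allows structured (e.g., whole-column or banded) missing patterns to be recovered where rank minimization alone fails.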
