Abstract

In recent years, quaternion matrix completion (QMC) based on low-rank regularization has gradually been applied to image processing. Unlike low-rank matrix completion (LRMC), which handles RGB images by recovering each color channel separately, QMC models retain the coupling among the three channels and process them as a whole. Most existing quaternion-based methods formulate low-rank QMC (LRQMC) as a quaternion nuclear norm (a convex relaxation of the rank) minimization problem. The main limitation of these approaches is that they minimize all singular values simultaneously, so they cannot approximate low-rank structure efficiently. To achieve a more accurate low-rank approximation, in this paper we introduce a quaternion truncated nuclear norm (QTNN) for LRQMC and use the alternating direction method of multipliers (ADMM) to solve the resulting optimization problem. Furthermore, we propose weighting the residual error quaternion matrix during the update process to accelerate the convergence of the QTNN method while maintaining admissible performance. The weighted method uses a concise gradient descent strategy with a theoretical guarantee of convergence. The effectiveness of our method is illustrated by experiments on real visual data sets.

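To make the central quantity concrete, the sketch below computes a quaternion truncated nuclear norm: the sum of all but the r largest singular values of a quaternion matrix, obtained here through the standard complex-adjoint representation. This is an illustrative example only, not the authors' implementation; the function name, the parameter r, and the random test matrix are assumptions for demonstration.

```python
# Minimal sketch (assumed, not the authors' code) of a quaternion truncated
# nuclear norm via the complex-adjoint representation of a quaternion matrix.
import numpy as np

def quaternion_truncated_nuclear_norm(Q0, Q1, Q2, Q3, r):
    """QTNN of Q = Q0 + Q1*i + Q2*j + Q3*k: the sum of all but the
    r largest singular values of the quaternion matrix Q."""
    Qa = Q0 + 1j * Q1            # "simplex" complex part of Q
    Qb = Q2 + 1j * Q3            # "perplex" complex part of Q
    # Complex adjoint matrix: its singular values are those of Q, each
    # appearing twice, so we keep every other value of the sorted spectrum.
    chi = np.block([[Qa, Qb],
                    [-np.conj(Qb), np.conj(Qa)]])
    sv = np.linalg.svd(chi, compute_uv=False)   # descending order
    sv_q = sv[::2]                               # quaternion singular values
    return float(np.sum(sv_q[r:]))               # drop the r largest

# Usage example: a random 50x40 quaternion matrix, truncating the 5 largest
# singular values (purely illustrative data).
rng = np.random.default_rng(0)
Q0, Q1, Q2, Q3 = (rng.standard_normal((50, 40)) for _ in range(4))
print(quaternion_truncated_nuclear_norm(Q0, Q1, Q2, Q3, r=5))
```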