Abstract

The low n-rank tensor recovery problem is a natural extension of compressed sensing. It consists of finding a tensor of minimum n-rank subject to linear equality constraints, and it arises in many areas such as data mining, machine learning, and computer vision. In this paper, operator-splitting and convex-relaxation techniques are used to transform the low n-rank tensor recovery problem into a convex, unconstrained optimization problem whose objective is the sum of a convex smooth function with Lipschitz continuous gradient and a convex function on a set of matrices. An accelerated proximal gradient algorithm is then proposed to solve this nonsmooth convex problem, and several computational techniques are introduced to improve its performance. Preliminary numerical results demonstrate the practical value of the tensor model as well as the efficiency of the proposed algorithm.
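The composite structure described above (a smooth convex term with Lipschitz continuous gradient plus a nonsmooth convex term) is exactly the setting in which accelerated proximal gradient methods apply. The following is a minimal, generic sketch of such a scheme, not the paper's specific algorithm: for illustration it minimizes a least-squares term plus an l1 penalty via soft-thresholding, whereas the paper's nonsmooth term acts on matrix unfoldings of a tensor (its proximal step would be a singular-value thresholding). The function names `apg`, `grad_f`, and `soft` are illustrative, not taken from the paper.

```python
import numpy as np

def apg(grad_f, prox_g, L, x0, n_iter=500):
    """Accelerated proximal gradient (FISTA-style) for min_x f(x) + g(x).

    grad_f : gradient of the smooth part f (Lipschitz constant L)
    prox_g : proximal operator of the nonsmooth part g, called as prox_g(v, step)
    """
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        # Proximal gradient step from the extrapolated point y
        x_new = prox_g(y - grad_f(y) / L, 1.0 / L)
        # Nesterov momentum update
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Illustrative instance: f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
soft = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
x_hat = apg(grad_f, soft, L, np.zeros(10))
```

Swapping the soft-thresholding operator for a singular-value thresholding of the matrix unfoldings would move this sketch toward the low n-rank setting the abstract describes.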
