Abstract

The low n-rank tensor recovery problem is an interesting extension of compressed sensing. It consists of finding a tensor of minimum n-rank subject to linear equality constraints and arises in many areas, such as data mining, machine learning, and computer vision. In this paper, the operator splitting technique and the convex relaxation technique are adapted to transform the low n-rank tensor recovery problem into a convex, unconstrained optimization problem whose objective is the sum of a convex smooth function with Lipschitz continuous gradient and a convex function on a set of matrices. To solve this unconstrained nonsmooth convex optimization problem, an accelerated proximal gradient algorithm is proposed, and several computational techniques are then used to improve it. Finally, preliminary numerical results demonstrate the potential value and applications of the tensor model as well as the efficiency of the proposed algorithm.
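For reference, a minimal sketch of a standard accelerated proximal gradient (FISTA-type) iteration for a problem of the form min_X f(X) + g(X), where f is convex and smooth with L-Lipschitz continuous gradient and g is convex; the exact step sizes, extrapolation rule, and additional computational techniques used in the paper may differ.

% Illustrative FISTA-type accelerated proximal gradient step (assumed form;
% not necessarily the paper's exact update rules), with t_1 = 1.
\begin{align*}
Y^{k}   &= X^{k} + \frac{t_{k-1}-1}{t_{k}}\bigl(X^{k}-X^{k-1}\bigr),\\
X^{k+1} &= \operatorname{prox}_{g/L}\!\Bigl(Y^{k} - \tfrac{1}{L}\nabla f(Y^{k})\Bigr),\\
t_{k+1} &= \frac{1+\sqrt{1+4t_{k}^{2}}}{2},
\qquad\text{where }\operatorname{prox}_{g/L}(Z) = \arg\min_{X}\; g(X) + \tfrac{L}{2}\lVert X-Z\rVert_F^{2}.
\end{align*}

In an operator-splitting formulation where each mode unfolding of the tensor is assigned its own matrix variable and g is a (weighted) sum of nuclear norms, this proximal map typically reduces to singular value thresholding applied to each unfolding separately.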
