Abstract

Symmetric tensor decomposition arises in a wide range of applications. In this paper, we design a synchronized alternating minimization method for multi-rank symmetric tensor decomposition. The algorithm starts from a careful initialization of the non-convex symmetric tensor decomposition problem and then performs alternating minimization. Our contributions are as follows: (1) Our method is synchronized, so no greedy procedure is needed to obtain the multi-rank tensor decomposition. (2) Initialization is a key ingredient of the proposed method: with a careful initialization, the algorithm can converge to the global minimizer of the non-convex objective function. (3) The designed alternating minimization algorithm yields highly accurate results. In numerical experiments, the proposed algorithm substantially outperforms plain gradient descent started from the same initialization. Moreover, our results show that with eigenvectors of a random projection as the initialization, a simple alternating minimization algorithm quickly reaches the global solution, even though finding the global minimum of this non-convex minimization problem is NP-hard in general.
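
The following is a minimal sketch of the two-stage recipe described above, assuming a real third-order symmetric tensor T of shape (n, n, n) and a target rank r. The initialization uses eigenvectors of a random projection of T, as stated in the abstract; the concrete update rules (exact least squares for the weights and a normalized fixed-point step for each factor vector) are illustrative stand-ins, not the authors' exact alternating minimization.

```python
# Sketch: random-projection eigenvector initialization + joint alternating refinement.
import numpy as np


def random_projection_init(T, r, rng):
    """Initial factors: top-r eigenvectors of M = T(I, I, g) for a random vector g."""
    n = T.shape[0]
    g = rng.standard_normal(n)
    M = np.tensordot(T, g, axes=([2], [0]))   # contract the last mode of T with g
    M = (M + M.T) / 2                         # symmetrize against round-off
    vals, vecs = np.linalg.eigh(M)
    idx = np.argsort(np.abs(vals))[::-1][:r]  # keep the r largest |eigenvalues|
    return vecs[:, idx]                       # columns are the initial a_i


def symmetric_alt_min(T, r, iters=200, seed=0):
    """Fit T ~ sum_i lam[i] * a_i (x) a_i (x) a_i, refining all terms jointly."""
    rng = np.random.default_rng(seed)
    A = random_projection_init(T, r, rng)
    lam = np.zeros(r)
    for _ in range(iters):
        # Weight step: exact least squares for lam with the factors A fixed.
        B = np.stack([np.einsum('a,b,c->abc', A[:, i], A[:, i], A[:, i]).ravel()
                      for i in range(r)], axis=1)
        lam, *_ = np.linalg.lstsq(B, T.ravel(), rcond=None)
        # Factor step: for each column, a normalized power-iteration-style
        # update against the residual that excludes that column's own term.
        for i in range(r):
            others = sum(lam[j] * np.einsum('a,b,c->abc', A[:, j], A[:, j], A[:, j])
                         for j in range(r) if j != i)
            v = np.einsum('abc,b,c->a', T - others, A[:, i], A[:, i])
            nrm = np.linalg.norm(v)
            if nrm > 0:
                A[:, i] = v / nrm
    return lam, A


if __name__ == "__main__":
    # Sanity check on a synthetic rank-3 symmetric tensor with orthogonal factors.
    rng = np.random.default_rng(1)
    n, r = 8, 3
    A_true = np.linalg.qr(rng.standard_normal((n, r)))[0]
    lam_true = np.array([3.0, 2.0, 1.0])
    T = sum(lam_true[i] * np.einsum('a,b,c->abc', A_true[:, i], A_true[:, i], A_true[:, i])
            for i in range(r))
    lam, A = symmetric_alt_min(T, r)
    approx = sum(lam[i] * np.einsum('a,b,c->abc', A[:, i], A[:, i], A[:, i])
                 for i in range(r))
    print("relative fit error:", np.linalg.norm(T - approx) / np.linalg.norm(T))
```

Because every rank-one term is kept and refined in each sweep, this sketch mirrors the "synchronized" flavor of the abstract, in contrast to greedy schemes that extract one component at a time and deflate.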
