Abstract

Symmetric nonnegative tensor factorization (SNTF) is an important tool for clustering analysis. To date, most algorithms for SNTF are based on multiplicative update rules, which have many attractive properties, e.g., they are often simple to implement and enforce nonnegativity without extra projection steps. However, the existing multiplicative algorithms often converge slowly due to their conservative multiplicative learning steps or the use of low-level BLAS (basic linear algebra subprograms) in their implementation. In this paper, three new multiplicative algorithms are proposed for SNTF to overcome the drawback of slow convergence. First, a parallel multiplicative algorithm, which can be implemented with high-level BLAS, is derived by auxiliary function optimization. To further accelerate convergence, two new parallel multiplicative algorithms, which allow larger learning steps for improved efficiency, are developed based on the weighted geometric mean and the weighted arithmetic mean, respectively. Finally, we apply the proposed algorithms to multiway probabilistic clustering, where a new hyper-stochastic normalization scheme based on Euclidean distance is developed for better data preprocessing. Experimental results on both synthetic and real-world data show that the proposed SNTF algorithms converge faster than state-of-the-art algorithms.
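
For readers unfamiliar with this class of methods, the sketch below illustrates a plain, unaccelerated multiplicative update for a rank-R symmetric nonnegative CP model of a third-order tensor, T ≈ sum_r u_r ∘ u_r ∘ u_r. It is a minimal illustration of the general approach the abstract refers to, not the paper's proposed algorithms; the function names, the specific update rule, and all parameters are assumptions made here for illustration only.

    import numpy as np

    def khatri_rao(A, B):
        # Column-wise Khatri-Rao (Kronecker) product of two matrices
        # with the same number of columns.
        n, r = A.shape
        m, _ = B.shape
        return (A[:, None, :] * B[None, :, :]).reshape(n * m, r)

    def sntf_multiplicative(T, rank, n_iter=200, eps=1e-12, seed=0):
        # Illustrative baseline: fit T ≈ sum_r u_r ∘ u_r ∘ u_r with a
        # conservative multiplicative update; not the paper's accelerated rules.
        n = T.shape[0]
        rng = np.random.default_rng(seed)
        U = rng.random((n, rank)) + eps        # nonnegative initialization
        T1 = T.reshape(n, n * n)               # mode-1 unfolding of the symmetric tensor
        for _ in range(n_iter):
            G = U.T @ U                        # R x R Gram matrix
            numer = T1 @ khatri_rao(U, U)      # "positive" part of the gradient (MTTKRP)
            denom = U @ (G * G) + eps          # "negative" part of the gradient
            U *= numer / denom                 # element-wise ratio keeps U nonnegative
        return U

    # Usage on a small synthetic symmetric tensor:
    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        U_true = rng.random((10, 3))
        T = np.einsum("ir,jr,kr->ijk", U_true, U_true, U_true)
        U_hat = sntf_multiplicative(T, rank=3)
        print(np.linalg.norm(T - np.einsum("ir,jr,kr->ijk", U_hat, U_hat, U_hat)))

Updates of this element-wise-ratio form preserve nonnegativity automatically, but their effective step size is fixed by the ratio itself; this is the conservative behavior that the weighted geometric-mean and weighted arithmetic-mean variants described in the abstract are designed to relax.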
