Abstract

High-Dimensional and Incomplete (HDI) tensors are frequently encountered in big data-related applications concerning complex dynamic interactions among numerous entities. Traditional tensor factorization-based models cannot handle an HDI tensor efficiently, while existing latent factorization of tensors models are all linear and therefore unable to capture an HDI tensor's nonlinearity. Motivated by this observation, this paper proposes a Neural Latent Factorization of Tensors model, which provides a novel approach to nonlinear Canonical Polyadic decomposition on an HDI tensor. It is implemented with three interesting ideas: a) adopting the density-oriented modeling principle to build a rank-one tensor series with high computational efficiency and affordable storage cost; b) treating each rank-one tensor as a hidden neuron to achieve an efficient neural network structure; and c) developing an adaptive backward propagation (ABP) learning scheme for efficient model training. Experimental results on six HDI tensors from a real system demonstrate that, compared with state-of-the-art models, the proposed model achieves significant gains in both convergence rate and accuracy. Hence, it is of great significance in performing challenging HDI tensor analysis.
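To make the density-oriented modeling principle concrete, the sketch below trains a plain (linear) rank-R Canonical Polyadic factorization of a third-order HDI tensor using only its observed entries, never the full tensor. This is a minimal illustration of the baseline the abstract builds on, not the paper's neural model; all names and hyper-parameters (`R`, `lr`, `reg`, `epochs`) are illustrative assumptions.

```python
import numpy as np

def cp_factorize(observed, shape, R=4, lr=0.05, reg=0.005, epochs=300, seed=0):
    """Density-oriented CP factorization sketch (assumed hyper-parameters).

    observed: list of ((i, j, k), value) pairs for the KNOWN entries only.
    shape:    (I, J, K) dimensions of the full (mostly missing) tensor.
    Returns factor matrices A (I x R), B (J x R), C (K x R); each of the R
    columns corresponds to one rank-one tensor in the series.
    """
    rng = np.random.default_rng(seed)
    A = rng.uniform(0.0, 0.1, (shape[0], R))  # mode-1 latent factors
    B = rng.uniform(0.0, 0.1, (shape[1], R))  # mode-2 latent factors
    C = rng.uniform(0.0, 0.1, (shape[2], R))  # mode-3 latent factors
    for _ in range(epochs):
        for (i, j, k), y in observed:
            # Prediction is the sum of R rank-one terms at entry (i, j, k).
            pred = np.sum(A[i] * B[j] * C[k])
            err = y - pred
            # SGD step on this observed entry only (density-oriented:
            # cost and storage scale with known entries, not the tensor).
            gA = err * (B[j] * C[k]) - reg * A[i]
            gB = err * (A[i] * C[k]) - reg * B[j]
            gC = err * (A[i] * B[j]) - reg * C[k]
            A[i] += lr * gA
            B[j] += lr * gB
            C[k] += lr * gC
    return A, B, C

def predict(A, B, C, idx):
    """Estimate a (possibly missing) entry from the learned factors."""
    i, j, k = idx
    return float(np.sum(A[i] * B[j] * C[k]))
```

The nonlinear model in the abstract departs from this baseline by treating each of the R rank-one tensors as a hidden neuron (i.e., passing such terms through a nonlinear activation) and training with the ABP scheme instead of fixed-rate SGD.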
