Abstract

Large-scale undirected weighted networks are frequently encountered in big-data applications concerning interactions among a large set of unique entities. Such a network can be described by a Symmetric, High-Dimensional, and Incomplete (SHDI) matrix, whose symmetry and incompleteness should be addressed with care. However, existing models fail to either correctly represent its symmetry or efficiently handle its incomplete data. To address this issue, this study proposes an Alternating-Direction-Method-of-Multipliers (ADMM)-based Symmetric Non-negative Latent factor analysis (ASNL) model. It adopts four key ideas: 1) implementing data-density-oriented modeling to efficiently represent an SHDI matrix's incomplete and imbalanced data; 2) separating the non-negativity constraints from the decision parameters to avoid truncation during training; 3) incorporating the ADMM principle into the learning scheme for fast model convergence; and 4) parallelizing the training process with load balancing for high efficiency. Empirical studies on four SHDI matrices demonstrate that ASNL significantly outperforms several state-of-the-art models in both prediction accuracy for an SHDI matrix's missing data and computational efficiency. It is thus a promising model for handling large-scale undirected networks arising in real applications.
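The first three ideas above can be illustrated with a minimal sketch: train only on the observed cells of a symmetric matrix (data-density-oriented modeling), keep the decision parameters `X` unconstrained while an auxiliary copy `Z` carries the non-negativity constraint, and couple them with an ADMM dual variable `U` so `X` is never truncated. The function name, hyperparameters, and gradient-step inner update below are illustrative assumptions, not the paper's exact algorithm, and the parallelized load-balanced training is omitted.

```python
import numpy as np

def asnl_sketch(entries, n, rank=4, rho=1.0, lr=0.01, iters=300, seed=0):
    """Hedged sketch of ADMM-based symmetric non-negative latent factor
    analysis on an incomplete symmetric matrix.

    entries: list of (i, j, v) observed cells with i <= j; the target
    matrix is approximated as X @ X.T on those cells only.
    """
    rng = np.random.default_rng(seed)
    X = rng.random((n, rank))          # unconstrained decision parameters
    Z = X.copy()                       # non-negative auxiliary copy (Z >= 0)
    U = np.zeros_like(X)               # scaled dual variable
    for _ in range(iters):
        # gradient of the augmented Lagrangian in X, summed over
        # observed entries only (data-density-oriented modeling)
        G = rho * (X - Z + U)
        for i, j, v in entries:
            e = X[i] @ X[j] - v        # residual on one observed cell
            G[i] += e * X[j]
            G[j] += e * X[i]
        X -= lr * G                    # X is updated without truncation
        # non-negativity is enforced on Z via projection, not on X
        Z = np.maximum(X + U, 0.0)
        U += X - Z                     # dual ascent step
    return Z                           # non-negative latent factors
```

A quick way to exercise the sketch is to sample observed cells from a known non-negative product `X0 @ X0.T`, fit on them, and check the factors stay non-negative while the residual shrinks.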
