Abstract

A large-scale undirected weighted network (LUWN) is described by a symmetric high-dimensional and sparse (SHiDS) matrix. To represent its symmetry correctly, existing models adopt a strong symmetry assumption, i.e., they shrink the solution space by using a single latent factor matrix to describe an SHiDS matrix's symmetry. Yet this may impair the resultant model's ability to represent the matrix's numerical features. To address this issue, this work proposes a Relaxed Symmetric Non-negative latent factor analysis (RSN) model built on three ideas: a) introducing a triple-equation constraint into its learning objective to relax the strong symmetry assumption, thereby greatly improving its representativeness of an SHiDS matrix's numerical features; b) adopting the Alternating Direction Method of Multipliers (ADMM) framework to efficiently optimize its learning objective under multiple constraints; and c) applying a data-density-oriented principle during modeling and optimization to precisely represent an SHiDS matrix's imbalanced data. Empirical studies on four industrial SHiDS matrices describing real LUWNs demonstrate that RSN outperforms state-of-the-art models in representing an SHiDS matrix precisely, while achieving highly competitive computational efficiency. Hence, this work greatly advances the area of LUWN analysis.
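To make the core idea concrete, the sketch below contrasts the strong symmetry assumption (X ≈ UUᵀ with one factor matrix) with a relaxed formulation that uses two non-negative factor matrices U and V, pulled toward each other by a soft symmetry penalty, and trained only on observed entries of the sparse matrix. This is an illustrative stochastic-gradient stand-in under assumed hyperparameters (`lam`, `mu`, `lr`), not the paper's actual ADMM-based RSN algorithm or its triple-equation constraint.

```python
import numpy as np

def relaxed_symmetric_nlf(rows, cols, vals, n, rank=5, lam=0.1, mu=1.0,
                          lr=0.01, epochs=500, seed=0):
    """Illustrative sketch: approximate a symmetric sparse matrix X by U @ V.T
    with non-negative factors. Strict symmetry (U == V) is relaxed into a
    soft penalty mu * ||U - V||^2, and training touches only the observed
    entries (the data density-oriented aspect). Hypothetical stand-in for
    the paper's ADMM scheme, not a reproduction of it."""
    rng = np.random.default_rng(seed)
    U = rng.random((n, rank))
    V = rng.random((n, rank))
    for _ in range(epochs):
        for i, j, x in zip(rows, cols, vals):
            e = x - U[i] @ V[j]                   # error on observed entry (i, j)
            gU = -e * V[j] + lam * U[i] + mu * (U[i] - V[i])
            gV = -e * U[i] + lam * V[j] - mu * (U[j] - V[j])
            U[i] = np.maximum(U[i] - lr * gU, 0)  # projection keeps non-negativity
            V[j] = np.maximum(V[j] - lr * gV, 0)
    return U, V
```

With mu = 0 this degenerates to an ordinary asymmetric factorization; letting mu grow recovers the single-matrix (strongly symmetric) special case, which is the restriction the abstract argues against.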
