Abstract

Many big-data applications require effective analysis of large-scale undirected weighted networks. Owing to its sparsity and symmetry, such a network can be quantified as a Symmetric, High-Dimensional and Sparse (SHiDS) matrix. To model these characteristics of an SHiDS matrix carefully, a symmetric non-negative latent factor (SNLF) model has been proposed. However, the representation learning ability of an SNLF model is limited by its commonly adopted learning objective, i.e., Euclidean distance. To address this issue, this study proposes a generalized symmetric nonnegative latent factor analysis (GSNL) model. Its main idea is two-fold: a) adopting the $\alpha$-$\beta$-divergence to generalize SNLF's learning objective, thereby achieving accurate representation of an SHiDS matrix; and b) applying a self-adaptive scheme to all hyperparameters introduced by the resultant model for strong practicability. Empirical studies on four SHiDS matrices demonstrate that the GSNL model outperforms its peers in terms of accuracy and computational efficiency.
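For context, a standard formulation of the $\alpha$-$\beta$ (AB) divergence, following Cichocki et al., between an SHiDS matrix $\mathbf{X}$ and a symmetric nonnegative low-rank approximation $\hat{\mathbf{X}} = \mathbf{U}\mathbf{U}^{\mathsf{T}}$ with $\mathbf{U} \ge 0$ is sketched below; the paper's exact instantiation, and the restriction of the sum to the observed entry set $\Lambda$, are assumptions here rather than details taken from the abstract:

$$
D_{AB}^{(\alpha,\beta)}\big(\mathbf{X} \,\|\, \hat{\mathbf{X}}\big)
= -\frac{1}{\alpha\beta} \sum_{(i,j)\in\Lambda}
\left( x_{ij}^{\alpha}\,\hat{x}_{ij}^{\beta}
- \frac{\alpha}{\alpha+\beta}\, x_{ij}^{\alpha+\beta}
- \frac{\beta}{\alpha+\beta}\, \hat{x}_{ij}^{\alpha+\beta} \right),
\quad \alpha,\ \beta,\ \alpha+\beta \neq 0,
$$

with the excluded cases defined by continuity. This family recovers familiar objectives as special cases, e.g., $\alpha=\beta=1$ yields (half of) the squared Euclidean distance that SNLF commonly adopts, and $\alpha=1,\ \beta\to 0$ yields the KL-divergence, which illustrates how tuning $(\alpha,\beta)$ generalizes the learning objective.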
