Abstract

High-dimensional and incomplete (HDI) data subject to nonnegativity constraints are commonly encountered in big data-related applications concerning interactions among numerous nodes. A nonnegative latent factor analysis (NLFA) model can efficiently perform representation learning on such HDI data. However, existing NLFA models suffer from either a slow convergence rate or a loss of representation accuracy. To address these issues, this paper proposes a proximal alternating-direction-method-of-multipliers-based nonnegative latent factor analysis (PAN) model with two-fold ideas: 1) adopting the principle of the alternating direction method of multipliers (ADMM) to implement an efficient learning scheme with fast convergence and high computational efficiency; and 2) incorporating proximal regularization into this learning scheme to suppress optimization fluctuation, thereby achieving high representation accuracy on HDI data. Theoretical analyses verify that PAN's learning scheme converges to a Karush-Kuhn-Tucker (KKT) stationary point of its nonnegativity-constrained learning objective. Experimental results on eight HDI matrices from real applications demonstrate that the proposed PAN model outperforms several state-of-the-art models in both estimation accuracy for the missing data of an HDI matrix and computational efficiency.
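The abstract's two-fold idea can be sketched in code: factorize only the observed entries of a sparse (HDI) matrix, enforce nonnegativity through an ADMM variable split with dual updates, and damp each factor update with a proximal term anchored at the previous iterate. This is a minimal illustrative sketch, not the paper's exact PAN updates; the function name, hyperparameters, and the inner gradient loop are assumptions.

```python
import numpy as np

def pan_sketch(entries, shape, rank=2, mu=1.0, rho=0.1,
               step=0.01, outer=300, inner=3, seed=0):
    """Hedged sketch of a proximal-ADMM-style nonnegative latent factor
    model for an HDI matrix given as a list of (row, col, value) triples.
    Illustrative only -- not the paper's exact PAN learning scheme."""
    rng = np.random.default_rng(seed)
    m, n = shape
    rows = np.array([e[0] for e in entries])
    cols = np.array([e[1] for e in entries])
    vals = np.array([e[2] for e in entries], dtype=float)
    X = rng.random((m, rank)); Y = rng.random((n, rank))   # latent factors
    Zx = X.copy(); Zy = Y.copy()                           # nonnegative splits
    Lx = np.zeros_like(X); Ly = np.zeros_like(Y)           # dual variables

    def grads(A, B):
        # Gradient of the squared loss, taken over observed entries only.
        err = np.sum(A[rows] * B[cols], axis=1) - vals
        Ga = np.zeros_like(A); Gb = np.zeros_like(B)
        np.add.at(Ga, rows, err[:, None] * B[cols])
        np.add.at(Gb, cols, err[:, None] * A[rows])
        return Ga, Gb, err

    for _ in range(outer):
        Xa, Ya = X.copy(), Y.copy()          # proximal anchors (previous iterate)
        for _ in range(inner):               # primal updates on augmented Lagrangian
            Gx, Gy, _ = grads(X, Y)
            X -= step * (Gx + Lx + mu * (X - Zx) + rho * (X - Xa))
            Y -= step * (Gy + Ly + mu * (Y - Zy) + rho * (Y - Ya))
        Zx = np.maximum(0.0, X + Lx / mu)    # projection enforces nonnegativity
        Zy = np.maximum(0.0, Y + Ly / mu)
        Lx += mu * (X - Zx)                  # dual ascent
        Ly += mu * (Y - Zy)
    _, _, err = grads(Zx, Zy)
    return Zx, Zy, float(np.mean(err ** 2))
```

The proximal term `rho * (X - Xa)` is the fluctuation-suppression idea from the abstract: it penalizes large jumps between consecutive iterates, while the `Zx`/`Zy` projection and dual updates realize the nonnegativity constraint in ADMM fashion.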
