Abstract

High-dimensional and sparse (HiDS) matrices are commonly encountered in big-data-related industrial applications such as recommender systems. Latent factor (LF) analysis via stochastic gradient descent (SGD) is highly effective at discovering latent patterns in them. However, as a sequential algorithm, SGD incurs considerable time cost and scales poorly on large-scale problems. To address these issues, this study proposes a parallelized, momentum-incorporated stochastic gradient descent (PMSGD) scheme, which incorporates momentum effects into SGD and parallelizes it via careful data splitting. Building on PMSGD, we develop a PMSGD-based LF (PLF) model for fast LF analysis of HiDS matrices from recommender systems. Experimental results on two HiDS matrices arising from industrial applications indicate that, owing to the careful design of PMSGD, the PLF model significantly outperforms state-of-the-art parallel LF models in computational efficiency.
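To illustrate the core idea, the sketch below shows a momentum-incorporated SGD update for a latent factor model over a sparse rating matrix. The abstract does not give the exact update rule, regularization, or parallel data-splitting scheme of PMSGD, so the hyperparameters and formulation here are assumptions for illustration only, and the loop is sequential rather than parallel.

```python
# Minimal sketch of momentum-incorporated SGD for latent factor (LF) analysis
# of a sparse rating matrix. Formulation and hyperparameters are assumptions,
# not the paper's PMSGD; parallel data splitting is omitted for clarity.
import numpy as np

def momentum_sgd_lf(ratings, num_users, num_items, k=16, lr=0.01,
                    beta=0.9, reg=0.02, epochs=20, seed=0):
    """Factorize known entries [(u, i, r), ...] into P (users x k) and Q (items x k)."""
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((num_users, k))
    Q = 0.1 * rng.standard_normal((num_items, k))
    vP = np.zeros_like(P)   # per-user momentum (velocity) terms
    vQ = np.zeros_like(Q)   # per-item momentum (velocity) terms

    for _ in range(epochs):
        for u, i, r in ratings:            # PMSGD would split this pass across workers
            err = r - P[u] @ Q[i]          # prediction error on one observed entry
            # Momentum accumulates past gradients to damp oscillation and speed convergence.
            vP[u] = beta * vP[u] + lr * (err * Q[i] - reg * P[u])
            vQ[i] = beta * vQ[i] + lr * (err * P[u] - reg * Q[i])
            P[u] += vP[u]
            Q[i] += vQ[i]
    return P, Q

# Toy usage: a 3x3 matrix with four observed entries.
if __name__ == "__main__":
    obs = [(0, 0, 5.0), (0, 2, 3.0), (1, 1, 4.0), (2, 0, 1.0)]
    P, Q = momentum_sgd_lf(obs, num_users=3, num_items=3)
    print("predicted (0,0):", P[0] @ Q[0])
```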
