Abstract

Fast and effective recommender systems are fundamental to meeting the growing demands of the e-commerce industry. Matrix factorization based on the stochastic gradient descent (SGD) algorithm is widely exploited to solve the recommender-system problem. Modern computing paradigms incorporate the concept of the fractional gradient into standard SGD and outperform their standard counterpart. The performance of fractional SGD improves considerably when the learning-rate parameter is tuned adaptively. In this paper, a nonlinear computing paradigm based on a normalized version of fractional SGD is developed to investigate the adaptive behavior of the learning rate, with a novel application to recommender systems. The accuracy of the proposed approach is verified through the root-mean-square-error (RMSE) metric using different numbers of latent features, learning rates, fractional orders, and datasets. The superiority of the designed method is validated through comparison with state-of-the-art counterparts.
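To make the baseline concrete, the sketch below shows standard matrix factorization trained with plain SGD on the observed entries of a rating matrix and evaluated with RMSE. It is a minimal illustration of the framework the abstract describes, not the paper's method: the proposed approach replaces these integer-order gradient updates with a normalized fractional-order variant and an adaptively tuned learning rate, and the toy rating matrix, hyperparameters (`k`, `lr`, `reg`, `epochs`), and function names here are assumptions for demonstration only.

```python
import numpy as np

def sgd_mf(R, mask, k=2, lr=0.01, reg=0.02, epochs=1000, seed=0):
    """Factor R ≈ P @ Q.T via plain SGD over observed entries (mask)."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    P = 0.1 * rng.standard_normal((n_users, k))  # user latent features
    Q = 0.1 * rng.standard_normal((n_items, k))  # item latent features
    users, items = np.nonzero(mask)
    for _ in range(epochs):
        for u, i in zip(users, items):
            err = R[u, i] - P[u] @ Q[i]          # prediction error on one rating
            pu = P[u].copy()                     # snapshot before updating
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * pu - reg * Q[i])
    return P, Q

def rmse(R, mask, P, Q):
    """Root mean square error over the observed entries only."""
    pred = P @ Q.T
    return np.sqrt(np.mean((R[mask] - pred[mask]) ** 2))

# Toy rating matrix; zeros denote unobserved entries (illustrative data).
R = np.array([[5., 3., 0., 1.],
              [4., 0., 0., 1.],
              [1., 1., 0., 5.],
              [0., 1., 5., 4.]])
mask = R > 0
P, Q = sgd_mf(R, mask)
print(f"training RMSE: {rmse(R, mask, P, Q):.3f}")
```

A fractional variant would modify only the inner update step, scaling the gradient by a term derived from a fractional-order derivative; the surrounding training loop and RMSE evaluation stay the same.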
