Abstract

Recommender systems have become popular in the e-commerce industry for retrieving and recommending the most relevant items to users from large amounts of data. Various adaptive strategies based on stochastic gradient descent (SGD) have been proposed to make recommendations more precise and efficient. In this paper, we propose a fractional variant of standard SGD, named fractional stochastic gradient descent (FSGD), for recommender systems. The contribution of fractional calculus to the recommender systems problem has not yet been explored; we therefore exploit FSGD to solve it. We compare its convergence and estimation accuracy with standard SGD across different numbers of latent features, learning rates, and fractional orders, using the root mean square error (RMSE) as the quantitative evaluation measure. The proposed strategy is more accurate in terms of RMSE than standard SGD for all tested fractional orders and numbers of features, and the results show that it performs significantly better than standard SGD in both estimation accuracy and convergence.
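
The sketch below illustrates the general idea behind an FSGD-trained matrix-factorization recommender and the RMSE evaluation mentioned above. It is not the paper's exact update rule, which the abstract does not specify: it assumes a commonly used Caputo-style fractional scaling of the gradient, in which each parameter's gradient term is multiplied by |w|^(1-alpha)/Gamma(2-alpha) for a fractional order alpha in (0, 1). All names and hyperparameter values (ratings, n_factors, lr, alpha, lam, n_epochs) are hypothetical placeholders.

# Illustrative sketch only: matrix factorization trained with a fractional-order
# SGD update (assumed Caputo-style scaling), evaluated with RMSE on observed entries.
import numpy as np
from math import gamma

def fsgd_matrix_factorization(ratings, n_factors=10, lr=0.01, alpha=0.9,
                              lam=0.02, n_epochs=20, seed=0):
    """ratings: dense user-item array, with 0 marking unobserved entries (assumption)."""
    rng = np.random.default_rng(seed)
    n_users, n_items = ratings.shape
    P = 0.1 * rng.standard_normal((n_users, n_factors))   # user latent factors
    Q = 0.1 * rng.standard_normal((n_items, n_factors))   # item latent factors
    obs = np.argwhere(ratings > 0)                         # observed (user, item) pairs
    for _ in range(n_epochs):
        rng.shuffle(obs)
        for u, i in obs:
            pu = P[u].copy()                               # keep old user factors for item update
            e = ratings[u, i] - pu @ Q[i]                  # prediction error
            # Assumed fractional scaling of the gradient direction:
            # elementwise |w|^(1 - alpha) / Gamma(2 - alpha).
            fp = np.abs(pu) ** (1.0 - alpha) / gamma(2.0 - alpha)
            fq = np.abs(Q[i]) ** (1.0 - alpha) / gamma(2.0 - alpha)
            P[u] += lr * (e * Q[i] * fp - lam * pu)
            Q[i] += lr * (e * pu * fq - lam * Q[i])
    return P, Q

def rmse(ratings, P, Q):
    """Root mean square error over observed entries only."""
    mask = ratings > 0
    pred = P @ Q.T
    return np.sqrt(np.mean((ratings[mask] - pred[mask]) ** 2))

Setting alpha = 1 recovers the standard SGD update, since |w|^0 / Gamma(1) = 1; varying alpha changes how strongly each factor's current magnitude modulates its own update, which is the knob the abstract refers to when comparing fractional orders.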
