Abstract
Recently, collaborative filtering combined with various deep learning models has attracted considerable attention in recommender systems, yielding strong improvements in prediction accuracy. However, many deep-learning-based studies rely heavily on abundant side information beyond the raw rating data to improve prediction accuracy, which imposes stringent data requirements. Furthermore, most of them ignore the interaction effect between users and items when building the recommendation model. To address these issues, we propose DCCR, a deep collaborative conjunctive recommender, for rating prediction tasks based solely on raw ratings. DCCR is a hybrid architecture that consists of two kinds of neural network models: an autoencoder and a multilayer perceptron. The autoencoder extracts latent features from the user and item perspectives in parallel, while the multilayer perceptron models the interaction between users and items by fusing the user and item latent features. To further improve the performance of DCCR, we propose an advanced activation function that can be specified with input vectors. We conduct extensive experiments on two well-known real-world datasets and analyze the performance of DCCR under varying settings. The results demonstrate that DCCR outperforms other state-of-the-art methods. We also discuss the performance of DCCR with additional layers to show the extensibility of our model.
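The data flow the abstract describes (parallel autoencoders producing user- and item-side latent features from the raw rating matrix, then a multilayer perceptron modeling their interaction) can be sketched as follows. This is a minimal illustration with untrained random weights, not the authors' implementation: the dimensions, the use of ReLU, the concatenation-based fusion, and all variable names are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, latent_dim = 6, 8, 4

# Raw rating matrix (0 = unobserved) -- the only input DCCR is said to need.
R = rng.integers(0, 6, size=(n_users, n_items)).astype(float)

def relu(x):
    return np.maximum(x, 0.0)

# Encoder half of one autoencoder per view, with random (untrained) weights;
# enough to show the shapes and the parallel user/item feature extraction.
W_user = rng.normal(scale=0.1, size=(n_items, latent_dim))  # encodes a user's rating row
W_item = rng.normal(scale=0.1, size=(n_users, latent_dim))  # encodes an item's rating column

user_latent = relu(R @ W_user)    # shape (n_users, latent_dim)
item_latent = relu(R.T @ W_item)  # shape (n_items, latent_dim)

# MLP models the user-item interaction on the fused (here: concatenated) latents.
W1 = rng.normal(scale=0.1, size=(2 * latent_dim, latent_dim))
w2 = rng.normal(scale=0.1, size=latent_dim)

def predict(u, i):
    """Predicted rating score for user u and item i (hypothetical helper)."""
    fused = np.concatenate([user_latent[u], item_latent[i]])  # (2 * latent_dim,)
    return float(relu(fused @ W1) @ w2)                       # scalar score
```

In a real model the two encoders and the MLP would be trained jointly on the observed ratings; the sketch only shows how the two latent views are produced in parallel and fused before the interaction network.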