Abstract
Existing book recommendation methods often overlook the rich information contained in review text, which limits their effectiveness. To address this, a cross-domain recommender system for literary books is proposed that leverages multi-head self-attention interaction and knowledge transfer learning. First, the BERT model is employed to obtain word vectors, and a CNN is used to extract user and item features. Then, higher-level features are captured through the fusion of multi-head self-attention and additive pooling. Finally, knowledge transfer learning is introduced to jointly model different domains, simultaneously extracting domain-specific features and features shared across domains. On the Amazon dataset, the proposed model achieves an MAE and MSE of 0.801 and 1.058 on the "movie-book" recommendation task, and 0.787 and 0.805 on the "music-book" task, significantly outperforming other state-of-the-art recommendation models. The proposed model also generalizes well to a Chinese dataset.
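To make the review-side pipeline concrete, below is a minimal PyTorch sketch of the feature extractor the abstract describes: word vectors, a CNN over the token sequence, multi-head self-attention interaction, and additive pooling. This is not the authors' code; the dimensions, the plain embedding standing in for BERT word vectors, and the residual-plus-sum reading of "fusion of multi-head self-attention and addition pooling" are all assumptions.

```python
import torch
import torch.nn as nn

class ReviewFeatureExtractor(nn.Module):
    """Sketch of the per-review pipeline from the abstract:
    word vectors -> CNN -> multi-head self-attention -> additive pooling.
    The paper uses BERT embeddings; a trainable nn.Embedding stands in here
    so the sketch stays self-contained."""

    def __init__(self, vocab_size=30522, emb_dim=768,
                 n_filters=100, kernel_size=3, n_heads=4):
        super().__init__()
        # Stand-in for BERT word vectors (assumption: the paper feeds
        # contextual BERT embeddings instead of a static embedding table).
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # CNN extracts local n-gram features from the token sequence.
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size,
                              padding=kernel_size // 2)
        # Multi-head self-attention models interactions among CNN features.
        self.attn = nn.MultiheadAttention(n_filters, n_heads, batch_first=True)

    def forward(self, token_ids):                     # (batch, seq_len)
        x = self.embed(token_ids)                     # (batch, seq_len, emb_dim)
        x = torch.relu(self.conv(x.transpose(1, 2)))  # (batch, n_filters, seq_len)
        x = x.transpose(1, 2)                         # (batch, seq_len, n_filters)
        h, _ = self.attn(x, x, x)                     # self-attention interaction
        return (x + h).sum(dim=1)                     # additive pooling over tokens


# Usage: encode two dummy reviews of 64 tokens each into 100-d features.
extractor = ReviewFeatureExtractor()
ids = torch.randint(0, 30522, (2, 64))
print(extractor(ids).shape)  # torch.Size([2, 100])
```

The residual sum before pooling is one plausible reading of the fusion step; concatenating the attention output with the CNN features would be an equally reasonable alternative. The cross-domain part of the model would add a shared extractor alongside per-domain extractors of this kind.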