Abstract

Deep reinforcement learning (DRL)-based recommender systems have recently come into the limelight due to their ability to optimize long-term user engagement. A significant challenge in DRL-based recommender systems is the large action space required to represent a variety of items. The large action space weakens the sampling efficiency and thereby degrades the recommendation accuracy. In this article, we propose a DRL-based method called deep hierarchical category-based recommender system (DHCRS) to handle the large action space problem. In DHCRS, categories of items are used to reconstruct the original flat action space into a two-level category-item hierarchy. DHCRS uses two deep Q-networks (DQNs): 1) a high-level DQN for selecting a category and 2) a low-level DQN for choosing an item within that category to recommend. Hence, the action space of each DQN is significantly reduced. Furthermore, the categorization of items helps capture the users' preferences more effectively. We also propose a bidirectional category selection (BCS) technique, which explicitly considers the category-item relationships. The experiments show that DHCRS can significantly outperform state-of-the-art methods in terms of hit rate and normalized discounted cumulative gain for long-term recommendations.
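
The two-level category-item selection described above can be illustrated with a minimal sketch in PyTorch. This is an assumption-based illustration, not the paper's implementation: the abstract does not specify the state encoder, the BCS mechanism, or whether a single category-conditioned low-level DQN or one low-level DQN per category is used; the sketch assumes one per category, and names such as `DQN`, `recommend`, and `items_per_category` are hypothetical.

```python
# Minimal sketch of hierarchical category-then-item action selection,
# assuming a PyTorch setup; details beyond the abstract are illustrative.
import torch
import torch.nn as nn


class DQN(nn.Module):
    """Simple MLP Q-network mapping a state vector to Q-values over actions."""

    def __init__(self, state_dim: int, num_actions: int, hidden_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)


def recommend(state, high_dqn, low_dqns, items_per_category):
    """Greedy two-level selection: the high-level DQN picks a category,
    then that category's low-level DQN picks an item within it."""
    with torch.no_grad():
        category = high_dqn(state).argmax(dim=-1).item()            # level 1: category
        item_idx = low_dqns[category](state).argmax(dim=-1).item()  # level 2: item
    return items_per_category[category][item_idx]


# Toy usage: 20 categories of 500 items each, so each DQN's action space is
# far smaller than the flat space of 10,000 items.
state_dim, num_categories, items_per_cat = 64, 20, 500
high_dqn = DQN(state_dim, num_categories)
low_dqns = [DQN(state_dim, items_per_cat) for _ in range(num_categories)]
items_per_category = [
    list(range(c * items_per_cat, (c + 1) * items_per_cat))
    for c in range(num_categories)
]
state = torch.randn(1, state_dim)
print(recommend(state, high_dqn, low_dqns, items_per_category))
```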
