Abstract

Generative adversarial networks (GANs) have been widely adopted in recommender systems (RSs) to improve recommendation accuracy. However, existing GAN-based models often suffer from mode collapse in sparse environments and fail to adequately capture the complexity of user preferences and behaviors, which degrades recommendation performance. To address these issues, we introduce a diffusion model (DM) into the GAN framework, proposing an efficient Diffusion-GAN recommendation model (DGRM) that achieves mutual enhancement between the two generative models. The model first uses the forward process of the DM to generate conditional vectors that guide the training of the GAN generator. The backward process of the DM then assists the GAN discriminator during adversarial training, with the Wasserstein distance adopted in place of the asymmetric Kullback-Leibler (KL) divergence used as the loss function in traditional GANs. Experiments on multiple datasets demonstrate that the proposed model effectively alleviates mode collapse and surpasses other state-of-the-art (SOTA) methods on various evaluation metrics.
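The two ingredients the abstract combines can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: `forward_diffusion` shows the standard closed-form forward process q(x_t | x_0) that could produce a noisy conditional vector from a user embedding, and `wasserstein_critic_loss` shows the usual Wasserstein critic objective that replaces the KL-based GAN loss. The function names, the linear noise schedule, and the toy embedding are all hypothetical choices for illustration.

```python
import numpy as np

def forward_diffusion(x0, t, betas, rng):
    """Closed-form forward diffusion q(x_t | x_0): scale the input and add
    Gaussian noise according to the cumulative schedule alpha_bar. The result
    could serve as a conditional vector guiding the GAN generator."""
    alphas = 1.0 - betas
    alpha_bar = np.prod(alphas[: t + 1])
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * noise

def wasserstein_critic_loss(real_scores, fake_scores):
    """Wasserstein critic objective (to minimize): E[f(fake)] - E[f(real)].
    Unlike the asymmetric KL divergence, this distance provides meaningful
    gradients even when the real and generated distributions barely overlap."""
    return np.mean(fake_scores) - np.mean(real_scores)

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 10)   # toy linear noise schedule (hypothetical)
x0 = rng.standard_normal(8)           # e.g. a user's interaction embedding
cond = forward_diffusion(x0, t=5, betas=betas, rng=rng)

loss = wasserstein_critic_loss(real_scores=np.array([0.9, 0.8]),
                               fake_scores=np.array([0.1, 0.2]))
```

A well-trained critic assigns higher scores to real interactions than to generated ones, so during critic training this loss is driven negative, while the generator is updated to push it back up.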
