Abstract

Deep generative models, especially the variational auto-encoder (VAE), have been successfully employed by an increasing number of recommendation systems, because they combine the flexibility of probabilistic generative models with the powerful non-linear representation ability of deep neural networks. Existing VAE-based recommendation models are usually built on a global assumption, incorporating a simple prior, e.g., a single Gaussian, to regularize the latent variables. This strategy, however, is ineffective when a user is simultaneously interested in different kinds of items, i.e., when the user’s preferences are highly diverse. In this paper, we therefore propose a Deep Global and Local Generative Model for recommendation (DGLGM), built under the Wasserstein auto-encoder framework, which considers both the global and the local structure among users. Besides preserving the global structure as existing models do, DGLGM adopts a non-parametric mixture-of-Gaussians distribution with several components to capture the diversity of users’ preferences. Each component corresponds to one local structure, and the optimal number of components can be determined via the automatic relevance determination technique. The two parts are seamlessly integrated and enhance each other. DGLGM can be efficiently inferred by minimizing its penalized upper bound with the aid of the local variational optimization technique. We also theoretically analyze its generalization error bounds to guarantee its performance on sparse feedback data with diverse preferences. Experimental comparisons with state-of-the-art methods demonstrate that DGLGM consistently benefits the recommendation system in the top-N recommendation task.
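To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' DGLGM implementation): a recommendation auto-encoder whose latent space is regularized toward a learnable mixture-of-Gaussians prior, in the spirit of the Wasserstein auto-encoder framework described above. All names and hyperparameters (MixturePriorAE, n_components, the MMD weight, etc.) are illustrative assumptions; the paper's actual model, inference procedure, and automatic relevance determination step are more involved.

# Hypothetical sketch: mixture-of-Gaussians prior for an auto-encoder
# over implicit feedback, regularized in a WAE-style way with MMD.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixturePriorAE(nn.Module):
    def __init__(self, n_items, latent_dim=64, n_components=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_items, 256), nn.Tanh(), nn.Linear(256, latent_dim)
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.Tanh(), nn.Linear(256, n_items)
        )
        # Learnable mixture prior: each component plays the role of one
        # "local" preference cluster; the weights give the global mixing.
        self.means = nn.Parameter(torch.randn(n_components, latent_dim))
        self.log_weights = nn.Parameter(torch.zeros(n_components))

    def sample_prior(self, n):
        # Draw n samples from the mixture of unit-variance Gaussians.
        weights = F.softmax(self.log_weights, dim=0)
        comp = torch.multinomial(weights, n, replacement=True)
        return self.means[comp] + torch.randn(n, self.means.size(1))

    def forward(self, x):
        z = self.encoder(x)
        logits = self.decoder(z)
        return z, logits

def mmd(x, y, scale=1.0):
    # Gaussian-kernel maximum mean discrepancy, a common WAE-style
    # divergence between the aggregate posterior and the prior.
    def k(a, b):
        d = torch.cdist(a, b).pow(2)
        return torch.exp(-d / (2 * scale ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

# Toy usage on a batch of binary user-item interaction vectors.
model = MixturePriorAE(n_items=1000)
x = (torch.rand(32, 1000) < 0.05).float()
z, logits = model(x)
recon = F.binary_cross_entropy_with_logits(logits, x)  # reconstruction term
reg = mmd(z, model.sample_prior(z.size(0)))            # prior-matching term
loss = recon + 10.0 * reg
loss.backward()

In this sketch the single-Gaussian prior of a standard VAE/WAE is simply replaced by a learnable mixture, so that users whose interests fall into different clusters can be mapped near different components; selecting the number of active components (as DGLGM does via automatic relevance determination) is not shown here.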
