Abstract
In this article, we study the problem of dynamic personalized product search. Because real-world search data are sparse, existing methods suffer from data inefficiency. We address this challenge by proposing a Dynamic Bayesian Contrastive Predictive Coding model (DBCPC), which captures the rich structured information behind search records to improve data efficiency. Our proposed DBCPC uses contrastive predictive learning to jointly learn dynamic embeddings together with the structural information of entities (i.e., users, products, and words). Specifically, DBCPC employs structured prediction to tackle the intractability caused by the non-linear output space, and uses a time-embedding technique to avoid designing a separate encoder for each time step, as dynamic Bayesian models would otherwise require. In this way, our model jointly learns the underlying embeddings of entities (i.e., users, products, and words) via prediction tasks, which encourages the embeddings to focus on general attributes and to capture how preferences evolve over time. To infer the dynamic embeddings, we propose an inference algorithm that combines a variational objective with contrastive objectives. Experiments on an Amazon dataset show that our proposed DBCPC learns higher-quality embeddings and outperforms state-of-the-art non-dynamic and dynamic models for product search.
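The abstract names two building blocks: a time embedding that replaces per-time-step encoders, and a contrastive predictive objective. The sketch below illustrates both in a minimal form; the sinusoidal time features and the InfoNCE-style loss are common choices and are assumptions here, not the paper's exact formulation.

```python
import numpy as np

def time_embedding(t, dim=8):
    """Map continuous timestamps to fixed-size vectors.

    A single encoder can then condition on these features instead of
    needing a separate encoder per time step. Sinusoidal features are
    a hypothetical choice; the paper's exact form is not specified.
    """
    freqs = np.exp(-np.log(10000.0) * np.arange(0, dim, 2) / dim)
    angles = np.outer(np.asarray(t, dtype=float), freqs)
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

def contrastive_loss(queries, positives, temperature=0.1):
    """InfoNCE-style contrastive objective.

    Row i of `positives` is the positive for row i of `queries`;
    the remaining rows in the batch act as negatives.
    """
    logits = queries @ positives.T / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

In a model like the one described, entity embeddings concatenated with such time features would feed the predictive encoder, and the contrastive term would be added to the variational objective during inference.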