Abstract
Click-Through Rate (CTR) prediction is one of the most challenging tasks in advertising systems; it aims to estimate the probability that a candidate user clicks on a candidate item. CTR prediction has recently seen remarkable progress, thanks to advances in Deep Neural Networks (DNNs) for representation learning. However, most existing DNN-based methods model implicit information only from the user's perspective, ignoring the correlations between the user and other objects in the historical interaction behaviors (e.g., user→items or user→users). Moreover, large-scale historical interaction behaviors make it difficult for a model to find precise relations in redundant and noisy data. To address these issues, this paper introduces a simple yet effective model that exploits both the user view and the item view (i.e., a dual view) to extract various interactive relationships for CTR prediction; we therefore name it the Dual-View Attention Network (DVAN). Specifically, we propose a cross-domain attention module (i.e., user→items and item→users) to extract coarse-level correlations, and a local homogeneous-domain attention module (i.e., user→users and item→items) to extract fine-level correlations adaptively. To extract accurate relationships from large-scale historical behaviors, a novel selection mechanism connects the cross-domain attention module and the local homogeneous-domain attention module, building the representation gradually from the coarse level to the fine level. Experimental results on four public datasets demonstrate the effectiveness of our method, which outperforms existing state-of-the-art methods.
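The abstract does not include an implementation, but the cross-domain attention idea (a user attending over historical items, and an item attending over historical users, with the two views fused) can be sketched roughly as follows. All names, shapes, and the scaled dot-product form are illustrative assumptions, not the authors' actual architecture:

```python
import numpy as np

def attend(query, keys):
    """Scaled dot-product attention: one query vector over a set of key vectors.
    Returns the attention-weighted sum of the keys (illustrative sketch)."""
    scores = keys @ query / np.sqrt(query.shape[0])
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ keys

rng = np.random.default_rng(0)
d = 8
user = rng.normal(size=d)             # candidate user embedding (hypothetical)
item = rng.normal(size=d)             # candidate item embedding (hypothetical)
hist_items = rng.normal(size=(5, d))  # items the user interacted with
hist_users = rng.normal(size=(5, d))  # users who interacted with the item

# Cross-domain attention in both views, then fuse the dual-view representation.
user_view = attend(user, hist_items)  # user -> items
item_view = attend(item, hist_users)  # item -> users
fused = np.concatenate([user_view, item_view])
print(fused.shape)  # (16,)
```

In the paper's full model, this coarse-level cross-domain step would feed a selection mechanism and the fine-level homogeneous-domain attention (user→users, item→items); the sketch only illustrates the first stage.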