Abstract
Click-through rate (CTR) prediction is essential for targeted advertising and recommendation. At present, machine learning models are widely used to build CTR estimators, including logistic regression (LR), factorization machines (FM), and deep neural networks (DNN). Unfortunately, these models adopt a single structure that captures either low-order feature interactions (e.g., LR and FM) or high-order feature interactions (e.g., DNN), which is not sufficient for CTR prediction. Therefore, joint learning models such as Wide & Deep and DeepFM have been proposed; by combining two different models, they can exploit both low- and high-order feature interactions to predict CTR. In this paper, we first analyze the structures and performance of typical CTR prediction models, and then summarize the general form and design rules of CTR estimators. Based on this general form, we design a new joint learning model that combines two different residual networks to explore feature interactions automatically. Compared with the widely adopted feed-forward neural network, the residual network is more capable of capturing complex feature interactions at different layers. In addition, we introduce a neural attention network to learn the importance of each second-order interaction between features from different fields. Finally, we evaluate the prediction performance of the proposed model on two real-world datasets (Criteo and Avazu) in terms of LogLoss and AUC. The extensive experimental results demonstrate that our model outperforms the state-of-the-art baselines.
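To make the described architecture concrete, the following is a minimal sketch, not the authors' released implementation, of a joint model of this kind: field embeddings are fed through two residual networks, and a small attention network weights every second-order (pairwise) field interaction. The class names, layer widths, embedding size, and field counts are illustrative assumptions, and PyTorch is assumed as the framework.

```python
# Hypothetical sketch of a joint residual-network CTR model with attention over
# second-order field interactions (sizes and names are illustrative assumptions).
import itertools
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    def __init__(self, dim, hidden):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim))

    def forward(self, x):
        # Skip connection lets deeper layers refine, rather than replace, earlier interactions.
        return torch.relu(x + self.net(x))


class JointResidualCTR(nn.Module):
    def __init__(self, field_dims, embed_dim=16, blocks=3):
        super().__init__()
        self.embeddings = nn.ModuleList([nn.Embedding(n, embed_dim) for n in field_dims])
        concat_dim = len(field_dims) * embed_dim
        # Two residual towers playing the roles of the two components of a joint model.
        self.tower_a = nn.Sequential(*[ResidualBlock(concat_dim, concat_dim) for _ in range(blocks)])
        self.tower_b = nn.Sequential(*[ResidualBlock(concat_dim, 2 * concat_dim) for _ in range(blocks)])
        # Attention network scoring each second-order (pairwise) field interaction.
        self.attention = nn.Sequential(nn.Linear(embed_dim, 32), nn.ReLU(), nn.Linear(32, 1))
        self.output = nn.Linear(2 * concat_dim + embed_dim, 1)

    def forward(self, x):  # x: LongTensor of shape (batch, num_fields)
        embeds = [emb(x[:, i]) for i, emb in enumerate(self.embeddings)]
        flat = torch.cat(embeds, dim=1)
        # Element-wise products of every field pair, weighted by learned attention scores.
        pairs = torch.stack([embeds[i] * embeds[j]
                             for i, j in itertools.combinations(range(len(embeds)), 2)], dim=1)
        weights = torch.softmax(self.attention(pairs), dim=1)
        second_order = (weights * pairs).sum(dim=1)
        joint = torch.cat([self.tower_a(flat), self.tower_b(flat), second_order], dim=1)
        return torch.sigmoid(self.output(joint)).squeeze(-1)


# Toy usage: three categorical fields, trained with binary cross-entropy (LogLoss).
model = JointResidualCTR(field_dims=[10, 20, 30])
clicks = model(torch.randint(0, 10, (4, 3)))
loss = nn.BCELoss()(clicks, torch.tensor([1.0, 0.0, 1.0, 0.0]))
```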