Abstract

Deep network recommendation is a cutting-edge topic in current recommendation system research: by combining recommendation systems with deep learning theory, it can effectively improve recommendation accuracy. In a real recommendation scenario, all the effective information in a dataset, both explicit and implicit, should be extracted, because the more complete the extracted information, the better the recommendation performance. This article proposes SI-MKR, an enhanced multi-modal recommendation model based on alternate training with knowledge graph representation, built on the MKR deep learning recommendation model. Our framework enhances recommendation with knowledge graph representation, using valuable external knowledge as multi-modal information. SI-MKR addresses the tendency of multi-modal knowledge-based recommendation systems to ignore the diversity of data types: it adds user and item attribute information from a knowledge graph to enhance the recommender's multi-task training. By analysing item and user attributes, SI-MKR classifies them by type, processes text-type and multi-value-type attributes separately for feature extraction, and feeds the remaining attribute types into the knowledge graph embedding unit. In addition, the knowledge graph data are organised as triplets, so the knowledge graph training process can continue alongside recommendation. The feature extraction unit of the knowledge graph and the recommendation unit are connected through a cross-compression unit for alternate training. During training, each item in the recommendation system has a potential correlation with a head entity in the knowledge graph, which embodies the idea of multi-task learning. Through extensive experiments on real-world datasets, we demonstrate that SI-MKR achieves substantial gains in movie recommendation over strong baseline models. Even when user-item interactions are sparse, SI-MKR maintains better performance than the MKR model.
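To make the cross-compression step concrete, the following is a minimal NumPy sketch of one cross&compress-style layer in the spirit of MKR: it forms the pairwise interaction matrix between an item vector and an entity vector, then compresses it back into two d-dimensional vectors. The weight names and shapes are illustrative assumptions, not taken from the paper's implementation.

```python
import numpy as np

def cross_compress(v, e, weights):
    """One cross&compress-style layer (illustrative weight names).

    C = v e^T is the d x d cross-feature matrix between the item vector v
    and the entity vector e; C and C^T are then projected back to length d.
    """
    C = np.outer(v, e)                                                  # d x d
    v_next = C @ weights["w_vv"] + C.T @ weights["w_ev"] + weights["b_v"]
    e_next = C @ weights["w_ve"] + C.T @ weights["w_ee"] + weights["b_e"]
    return v_next, e_next

d = 4
rng = np.random.default_rng(0)
weights = {k: rng.normal(size=d) for k in ("w_vv", "w_ev", "w_ve", "w_ee")}
weights["b_v"] = np.zeros(d)
weights["b_e"] = np.zeros(d)

v, e = rng.normal(size=d), rng.normal(size=d)   # item and entity embeddings
v1, e1 = cross_compress(v, e, weights)          # both remain d-dimensional
```

Because the layer outputs updated item and entity vectors of the same dimension, the recommendation unit and the knowledge graph unit can each take their own output and be trained alternately.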

Highlights

  • With the development of the digital age, the amount of data has grown explosively

  • All attributes are added to the knowledge graph embedding (KGE) unit, training is conducted in the knowledge graph unit, and only item-user-rating triples are used as the RS unit's training input

  • To investigate the efficacy of the KGE module in sparse scenarios, we vary the ratio of the training set of MovieLens1M from 100% to 20% and report the results of AUC in click-through rate (CTR) prediction for all methods
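The sparsity protocol in the last highlight can be sketched as follows: shrink the training set from 100% down to 20% of the interactions and report AUC for CTR prediction on a fixed held-out set. The rank-sum AUC below is the standard definition; the interaction data and the scorer are random placeholders standing in for the real MovieLens-1M pipeline and model.

```python
import numpy as np

def auc(labels, scores):
    """Rank-sum AUC: probability a random positive is scored above a random negative."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels.astype(bool)
    n_pos, n_neg = pos.sum(), len(labels) - pos.sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(42)
train = rng.permutation(10_000)             # stand-in training interaction indices
test_labels = rng.integers(0, 2, 1_000)     # stand-in held-out click labels

for ratio in (1.0, 0.8, 0.6, 0.4, 0.2):
    subset = train[: int(len(train) * ratio)]   # shrink the training set
    # model.fit(subset) would go here; random scores stand in for predictions
    scores = rng.random(len(test_labels))
    print(f"ratio={ratio:.0%}  n_train={len(subset)}  AUC={auc(test_labels, scores):.3f}")
```

With a real model in place of the random scorer, the interesting quantity is how slowly AUC degrades as the ratio drops, which is where the KGE module is expected to help.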


Summary

INTRODUCTION

With the development of the digital age, the amount of data has grown explosively, and extracting useful information from massive amounts of data has become a popular research topic. A traditional recommendation system can only perform collaborative filtering based on whether users click on news items when they are published, or on the collections of news a user clicks on[3]. This method cannot deeply ascertain the latent content of a news site. Many researchers have therefore integrated other techniques into recommendation systems to extract deeper features of users and items; the auxiliary data mainly include sequences[4], graphs[5], and other constructions. A knowledge graph (KG) is a heterogeneous graph whose nodes are entities and whose edges represent relations between entities. Items and their attributes can be mapped into the KG to capture the mutual relations between items[10]. Knowledge graphs contain rich semantic associations between entities and provide a potential source of multi-modal information for recommendation systems. Moreover, a knowledge graph can connect user history with recommendation results, improving user satisfaction with and acceptance of those results and enhancing user trust in the recommendation system.
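As a minimal illustration of how items and their attributes sit in a KG, the snippet below stores (head, relation, tail) triples and finds the attributes two movies share; the titles, relation names, and genres are illustrative MovieLens-style examples, not the paper's actual data pipeline.

```python
from collections import defaultdict

# Tiny illustrative KG: each fact is a (head, relation, tail) triple.
triples = [
    ("Father of the Bride Part II", "has_genre", "Comedy"),
    ("Toy Story", "has_genre", "Comedy"),
    ("Toy Story", "has_genre", "Animation"),
]

# Adjacency: entity -> set of (relation, neighbour) pairs.
neighbors = defaultdict(set)
for head, relation, tail in triples:
    neighbors[head].add((relation, tail))

def shared_attributes(item_a, item_b):
    """Attributes two items have in common, i.e. an item-item link via the KG."""
    return neighbors[item_a] & neighbors[item_b]

print(shared_attributes("Father of the Bride Part II", "Toy Story"))
# → {('has_genre', 'Comedy')}
```

Such shared-attribute paths are exactly the semantic associations a KG-based recommender can exploit beyond raw click data.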

MULTI-MODAL INFORMATION
DATASETS
RESULTS
CONCLUSIONS