Abstract
As a core issue in knowledge graph research, knowledge graph reasoning and completion have long been hot research topics. Existing knowledge graph reasoning techniques usually require extensive training for each relation, and training each relation requires a large number of samples. Inspired by meta-learning [1], this chapter combines the idea of meta-learning with the attention mechanism [2] for knowledge reasoning. We introduce a few-shot reasoning method based on the attention mechanism that achieves superior performance. First, the method greatly reduces the number of samples required to train each relation, which shrinks the scale of the reasoning problem. Second, with the attention mechanism, the proposed method achieves higher accuracy because historical information can be utilized during reasoning. Third, the method greatly improves the extensibility of the knowledge graph: when a new relation is added, the method can easily learn its pattern, and there is no need to retrain the model.

Keywords: MDATA · Knowledge reasoning · Attention mechanism
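To make the role of the attention mechanism concrete, the following is a minimal sketch (not the chapter's actual model) of how dot-product attention can pool the embeddings of a few-shot reference set so that historical information about a relation is weighted by its relevance to the query entity pair; the function name and the plain-list embedding representation are illustrative assumptions.

```python
import math

def attention_pool(query, memory):
    """Aggregate few-shot reference embeddings with dot-product attention.

    query:  list[float]        -- embedding of the entity pair being scored
    memory: list[list[float]]  -- embeddings of the k reference (historical) pairs
    Returns the attention-weighted combination of the memory rows.
    """
    # similarity of the query to each stored reference embedding
    scores = [sum(q * m for q, m in zip(query, row)) for row in memory]
    # numerically stable softmax over the similarity scores
    peak = max(scores)
    exps = [math.exp(s - peak) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # weighted sum of the reference embeddings
    return [sum(w * row[i] for w, row in zip(weights, memory))
            for i in range(len(query))]

# toy example: two 3-dimensional reference embeddings
memory = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
query = [1.0, 0.0, 0.0]
pooled = attention_pool(query, memory)
print(pooled)  # the first reference, most similar to the query, dominates
```

Because attention reweights rather than retrains, a newly added relation only needs its few reference embeddings placed in the memory, which is consistent with the extensibility claim above.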