Abstract
Knowledge graph question answering is an important technology in intelligent human–robot interaction; it aims to automatically answer natural language questions over a given knowledge graph. For multi-relation questions, which show greater variety and complexity, the tokens of the question carry different priorities for triple selection across the reasoning steps. Most existing models treat the question as a whole and ignore this priority information. To address this problem, we propose a question-aware memory network for multi-hop question answering, named QA2MN, which dynamically updates the attention over the question during the reasoning process. In addition, we incorporate graph context information into the knowledge graph embedding model to strengthen its ability to represent entities and relations. We use the resulting embeddings to initialize QA2MN and fine-tune them during training. We evaluate QA2MN on PathQuestion and WorldCup2014, two representative datasets for complex multi-hop question answering. The results demonstrate that QA2MN achieves state-of-the-art Hits@1 accuracy on both datasets, which validates the effectiveness of our model.
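To make the reasoning step concrete, the following is a minimal sketch of one hop of a question-aware key-value memory network, assuming simple dot-product attention; all names, dimensions, and the exact update rule are illustrative assumptions, not the formulation used in QA2MN.

# Sketch: one reasoning hop with question-aware attention.
# Tensor names and the update rule are assumptions for illustration.
import torch
import torch.nn.functional as F

def question_aware_hop(q_tokens, key_mem, val_mem, query):
    """One reasoning hop.

    q_tokens: (T, d)  embeddings of the question tokens
    key_mem:  (M, d)  key embeddings (e.g. subject + relation of triples)
    val_mem:  (M, d)  value embeddings (e.g. object entities)
    query:    (d,)    current query vector
    Returns the updated query for the next hop.
    """
    # Question-aware attention: re-weight the question tokens against
    # the current query, so token priority can change at every hop.
    token_attn = F.softmax(q_tokens @ query, dim=0)   # (T,)
    q_state = token_attn @ q_tokens                   # (d,)

    # Address memory with the question-aware state and read values.
    mem_attn = F.softmax(key_mem @ q_state, dim=0)    # (M,)
    read = mem_attn @ val_mem                         # (d,)

    # Update the query with the retrieved evidence.
    return q_state + read

# Toy usage with random embeddings (d=16, 5 tokens, 8 memory slots).
d, T, M = 16, 5, 8
q = torch.randn(d)
q = question_aware_hop(torch.randn(T, d), torch.randn(M, d), torch.randn(M, d), q)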
Highlights
Intelligent human–robot interaction provides a convenient way for humans and robots to communicate
Question answering over knowledge base (KBQA) is one of the key technologies of intelligent human–robot interaction
Our contributions are threefold: (i) incorporating graph context information into the knowledge graph (KG) embedding model to enhance the representation of entities and relations; (ii) proposing question-aware attention in the reasoning process to enhance the query update mechanism of the key-value memory neural network; (iii) achieving state-of-the-art Hits@1 accuracy on two representative datasets, with an ablation study demonstrating the interpretability of QA2MN
Summary
Intelligent human–robot interaction provides a convenient way for humans and robots to communicate. The graph context between triples needs to be modeled to improve the representation of entities and relations [17]. Considering these challenges, we enhance the key-value memory neural network with KG embedding and question-aware attention; the resulting model, QA2MN (Question-Aware Memory Network for Question Answering), improves the representation of the question tokens as well as of the entities and relations in the knowledge base. Our contributions are threefold: (i) incorporating graph context information into the KG embedding model to enhance the representation of entities and relations; (ii) proposing question-aware attention in the reasoning process to enhance the query update mechanism of the key-value memory neural network; (iii) achieving state-of-the-art Hits@1 accuracy on two representative datasets, with an ablation study demonstrating the interpretability of QA2MN. We end the paper with conclusions and future work in "Conclusion"
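As an illustration of the graph-context idea, the sketch below blends an entity's embedding with the mean of its one-hop neighbours before applying a TransE-style score. The mixing weight alpha and the helper names are hypothetical; they stand in for whichever context-aware KG embedding model is actually used to initialize QA2MN.

# Sketch: graph-context smoothing on top of a TransE-style score.
# alpha and all function names are assumptions for illustration.
import torch

def context_embedding(ent_emb, entity, neighbours, alpha=0.5):
    """Blend an entity's embedding with its one-hop neighbourhood."""
    if not neighbours:
        return ent_emb[entity]
    ctx = ent_emb[torch.tensor(neighbours)].mean(dim=0)
    return alpha * ent_emb[entity] + (1 - alpha) * ctx

def transe_score(h, r, t):
    """TransE: a triple (h, r, t) is plausible when h + r is close to t."""
    return -torch.norm(h + r - t, p=1)

# Toy usage: score triple (entity 0, relation 0, entity 1),
# with entity 0 contextualised by its neighbours 2 and 3.
num_ent, num_rel, d = 4, 2, 16
ent_emb = torch.randn(num_ent, d)
rel_emb = torch.randn(num_rel, d)
h = context_embedding(ent_emb, 0, [2, 3])
score = transe_score(h, rel_emb[0], ent_emb[1])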