Abstract

With the growing availability of knowledge graphs across a variety of domains, question answering over knowledge graphs (KG-QA) has become a prevalent information retrieval approach. Current KG-QA methods usually resort to semantic parsing, search, or neural matching models; however, they struggle with increasingly long input questions and complex information needs. In this work, we propose a new KG-QA approach that leverages the rich domain context in the knowledge graph, incorporating domain context descriptions on both the question and answer sides. Specifically, for questions, we enrich them with the user's subsequent input questions within a session, expanding the input question representation. For candidate answers, we equip them with surrounding context structures, i.e., meta-paths within the target knowledge graph. On top of these, we design a cross-attention mechanism to improve question–answer matching performance. An experimental study on real datasets verifies these improvements. The new approach is especially beneficial for specific-domain knowledge graphs with complex questions.
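The question–answer cross-attention described above can be sketched as follows. This is an illustrative NumPy implementation, not the authors' exact architecture: `Q` stands for question token vectors and `A` for answer-aspect vectors (both hypothetical names), and attention is computed in both directions over their similarity matrix, mirroring the answer-towards-question and question-towards-answer attention in the paper's outline.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(Q, A):
    """Bidirectional attention between question tokens and answer aspects.

    Q: (n_q, d) question token vectors; A: (n_a, d) answer-aspect vectors.
    Returns a question summary per answer aspect and an answer summary
    per question token. Illustrative sketch only.
    """
    scores = Q @ A.T                  # (n_q, n_a) similarity matrix
    # answer-towards-question: weight question tokens for each answer aspect
    a2q = softmax(scores, axis=0)     # normalize over question tokens
    q_ctx = a2q.T @ Q                 # (n_a, d)
    # question-towards-answer: weight answer aspects for each question token
    q2a = softmax(scores, axis=1)     # normalize over answer aspects
    a_ctx = q2a @ A                   # (n_q, d)
    return q_ctx, a_ctx

# Toy example: 5 question tokens, 3 answer aspects, dimension 8.
Q = np.random.default_rng(1).normal(size=(5, 8))
A = np.random.default_rng(2).normal(size=(3, 8))
q_ctx, a_ctx = cross_attention(Q, A)
```

Each attended summary can then feed a matching score between the question and a candidate answer, which is the role the cross-attention plays in the proposed model.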

Highlights

  • Recent years have witnessed an information access paradigm shift, from proactive search to voice/question-oriented automatic answering

  • When the training ratio is 60% in Table 3, the improvement is most significant: the proposed method ranks first at 76.8%, followed by the common Tree-LSTM at 72.5%. This demonstrates the usefulness of meta-paths in capturing contextual structure information in the knowledge graph, as well as the value of domain information in modeling answer vectors

  • Compared to the general Tree-LSTM model, the proposed approach effectively improves performance on QA tasks over specific-domain knowledge graphs. These quantitative QA experiments reveal that domain context features are valuable for KG-QA, and that the proposed domain-context KG-QA is better suited to specific-domain knowledge graph usage

Summary

Introduction

Recent years have witnessed an information access paradigm shift, from proactive search to voice/question-oriented automatic answering. Question answering over knowledge graphs (KG-QA) has attracted much attention [2, 6, 25, 26]. We build an insurance product knowledge graph (described later in this paper) and set up an online question–answer service on top of it. This insurance product knowledge graph has more than 200k triples, consisting of insurance companies, categories, diseases, attributes, terms, etc. To represent answers, existing methods over open-domain knowledge graphs usually adopt a translation model, i.e., TransE [5]. These methods embed the entire knowledge graph to obtain vector representations for its entities and relations.
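As a minimal sketch of the TransE idea referenced above: each entity and relation gets a d-dimensional vector, and a triple (h, r, t) is scored by the negative distance ||h + r − t||, so that plausible triples score higher. The entity and relation names below are hypothetical, chosen only to echo the insurance domain; this is not the paper's code.

```python
import numpy as np

# Toy embeddings (random here; in practice learned by minimizing
# ||h + r - t|| for observed triples against corrupted negatives).
rng = np.random.default_rng(0)
dim = 50
entities = {name: rng.normal(size=dim)
            for name in ["company_A", "product_X", "disease_Y"]}
relations = {name: rng.normal(size=dim) for name in ["offers", "covers"]}

def transe_score(head, relation, tail):
    """TransE plausibility score: negative L2 distance ||h + r - t||.
    Less negative (closer to 0) means a more plausible triple."""
    h, r, t = entities[head], relations[relation], entities[tail]
    return -np.linalg.norm(h + r - t)

# Rank candidate tails for the query (company_A, offers, ?).
candidates = ["product_X", "disease_Y"]
ranked = sorted(candidates,
                key=lambda t: transe_score("company_A", "offers", t),
                reverse=True)
```

In the KG-QA setting, such embeddings give candidate answers a vector representation that can be matched against the question representation.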

Related Work
Proposed Approach
Framework
Input Question Representation
Question Embedding
Question Output
Answer Aspects
Answer Extraction
Answer Output
Answer‐Towards‐Question Attention
Question‐Towards‐Answer Attention
Offline Training
Online Response
Baselines
Datasets
Quantitative Study of KG‐QA
Answer Context Contribution
Methods
Question Context Contribution
Cross‐attention Models
Conclusion
