Abstract
Document-level relation extraction is an essential task in natural language understanding that requires learning rich entity representations and performing multi-hop reasoning across sentences. Most existing methods use graph neural networks to extract multi-hop relations, but when computing node representations they consider only edge types and ignore node types. Moreover, when converting a document into a graph, they struggle to adequately capture the complex interactions within the document. These shortcomings limit relation extraction performance. To address them, this paper proposes a document-level relation extraction model based on a multi-layer heterogeneous graph neural network (MHGNN). Unlike previous methods that attend only to the edge types between two nodes, a node-type and edge-type oriented heterogeneous graph attention network (NEHGAN) is proposed to learn richer node representations. In addition, the model constructs three graphs, namely a word-level graph, a mention-level graph and an entity-level graph, to capture the interactions among words, among mentions and among entities across the whole document. Experimental results demonstrate that MHGNN effectively improves document-level relation extraction, outperforming the state-of-the-art model by 1.04%, 1.03%, 1.69% and 0.6% in F1 on the development and test sets of two public datasets, respectively.
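To make the core idea of NEHGAN concrete, the sketch below shows one way a graph attention layer could condition on both node types and edge types when scoring neighbours. This is a minimal illustration under my own assumptions, not the authors' implementation: the class name, dimensions, and scoring function are hypothetical, and the per-node loop stands in for the scatter-based aggregation a real system would use.

# Minimal sketch (assumed, not the paper's code) of attention that uses both
# node-type and edge-type embeddings when aggregating neighbour messages.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TypeAwareGraphAttention(nn.Module):
    def __init__(self, dim, num_node_types, num_edge_types):
        super().__init__()
        self.node_type_emb = nn.Embedding(num_node_types, dim)  # node-type embeddings
        self.edge_type_emb = nn.Embedding(num_edge_types, dim)  # edge-type embeddings
        self.proj = nn.Linear(dim, dim)                          # neighbour message projection
        self.score = nn.Linear(3 * dim, 1)                       # scores [message; node type; edge type]

    def forward(self, h, node_types, edges, edge_types):
        # h: (N, dim) node features; node_types: (N,); edges: (E, 2) src -> dst; edge_types: (E,)
        src, dst = edges[:, 0], edges[:, 1]
        msg = self.proj(h[src])                                  # messages from source nodes
        feat = torch.cat([msg,
                          self.node_type_emb(node_types[src]),
                          self.edge_type_emb(edge_types)], dim=-1)
        logits = self.score(feat).squeeze(-1)                    # unnormalised attention per edge
        out = torch.zeros_like(h)
        for v in dst.unique():                                   # normalise per destination node
            mask = dst == v
            alpha = F.softmax(logits[mask], dim=0)
            out[v] = (alpha.unsqueeze(-1) * msg[mask]).sum(dim=0)
        return out

The key design point is that the attention logits depend on the type embedding of the source node and of the connecting edge, so two neighbours with identical features but different types (e.g. a word node versus a mention node) can receive different weights; how the word-, mention- and entity-level graphs are built and combined is described in the paper itself.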