Abstract
In joint entity and relation extraction, the input document is divided into multiple potential entity regions and context regions, and the characteristics of entities and their relations are often reflected in the context. An effective joint modeling method tailored to the features of these different regions can therefore yield superior performance in joint entity and relation extraction. Previous works tend to model only the potential entity regions in depth, ignoring the importance of contextual information for joint entity and relation extraction. In this paper, we propose a Region-based Hypergraph Network (RHGN) for joint entity and relation extraction. The RHGN introduces the concept of regional hypernodes for the first time and proposes a cooperative method combining a GCN and a BiLSTM to generate a hypernode for each region. A region-based relation hypergraph is then constructed to fairly and efficiently aggregate the features of all regions in the sentence. To initialize and update the features of the edges and hypernodes in the hypergraph, a Sequence-Enhanced Graph (SEG) unit is designed. Finally, we compare against existing competitive models on three public datasets: CoNLL04, SciERC and ADE. Experimental results demonstrate that our model achieves a significant improvement over previous models on both entity recognition and relation extraction, and it also shows superior performance on datasets with nested entities. Extensive additional experiments further confirm the effectiveness of our approach.
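The core idea described above can be illustrated with a minimal sketch. This is not the authors' implementation: the pooling scheme, the fully connected region adjacency, and all function names below are assumptions for illustration only. It shows one plausible reading of the mechanism: token features are pooled within each region to form a "hypernode", and a single GCN-style propagation step then exchanges information between entity and context regions.

```python
import numpy as np

# Hypothetical sketch (assumed names and design, not the paper's code):
# pool token features per region into hypernodes, then run one
# GCN-style propagation step over a region-level adjacency matrix.

def region_hypernodes(token_feats, regions):
    """token_feats: (num_tokens, dim); regions: list of (start, end) spans.
    Mean-pooling is an assumption; the paper uses a GCN/BiLSTM combination."""
    return np.stack([token_feats[s:e].mean(axis=0) for s, e in regions])

def gcn_step(H, A, W):
    """Symmetrically normalized propagation: ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

rng = np.random.default_rng(0)
tokens = rng.normal(size=(10, 8))             # 10 tokens, 8-dim features
regions = [(0, 3), (3, 6), (6, 10)]           # e.g. two entity spans + context
H = region_hypernodes(tokens, regions)        # (3, 8) hypernode features
A = np.ones((3, 3)) - np.eye(3)               # fully connected region graph
H_new = gcn_step(H, A, rng.normal(size=(8, 8)))
print(H_new.shape)  # (3, 8): one updated feature vector per region
```

In the actual model, edge and hypernode features would additionally be initialized and updated by the SEG unit rather than by this single fixed propagation step.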