Abstract

Relation extraction has become a crucial step in the automatic construction of Knowledge Graphs (KGs). Recently, researchers have leveraged Sequence-to-Sequence (Seq2Seq) models for Joint Entity and Relation Extraction (JERE). However, traditional decoding generates the target sequence incrementally from left to right, with no ability to revise earlier predictions once errors occur; any decoding error made before the current step therefore propagates unchecked. Furthermore, the relations among triplets extracted from the same sentence are strongly correlated, a property that previous work has overlooked. In this paper, we propose Bidirectional Decoding with Co-graph representation (BDCore) to address these issues. Specifically, we first introduce a backward decoder that decodes the target sequence in reverse order. The forward decoder then applies two attention mechanisms to simultaneously consider the hidden states of the encoder and of the backward decoder, so the backward decoding information helps to alleviate the negative impact of forward decoding errors. In addition, we construct a relation co-occurrence graph (Co-graph) and apply a Graph Convolutional Network (GCN) to capture relation correlations. Extensive experiments demonstrate the benefits of the proposed bidirectional decoding and co-graph representation for relation extraction, and our approach significantly outperforms the baselines on the NYT benchmark.
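
To make the two components concrete, below is a minimal PyTorch sketch of one forward-decoder step with the two attention mechanisms described above, plus a single GCN layer over the Co-graph adjacency. This is an illustrative sketch, not the authors' released implementation: the GRU cell and dot-product attention are assumed design choices, and all names (`DualAttentionDecoderStep`, `CoGraphGCNLayer`, `hidden_dim`, etc.) are hypothetical.

```python
import torch
import torch.nn as nn


class DualAttentionDecoderStep(nn.Module):
    """One forward-decoder step attending to both the encoder states and the
    backward decoder's states (illustrative sketch; names are hypothetical)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Input = [token embedding; encoder context; backward-decoder context]
        self.cell = nn.GRUCell(hidden_dim * 3, hidden_dim)
        self.enc_proj = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.bwd_proj = nn.Linear(hidden_dim, hidden_dim, bias=False)

    @staticmethod
    def attend(query: torch.Tensor, keys: torch.Tensor) -> torch.Tensor:
        # Dot-product attention: (B, D) query vs. (B, T, D) keys -> (B, D) context.
        scores = torch.einsum('bd,btd->bt', query, keys)
        weights = torch.softmax(scores, dim=-1)
        return torch.einsum('bt,btd->bd', weights, keys)

    def forward(self, y_emb, h_prev, enc_states, bwd_states):
        c_enc = self.attend(self.enc_proj(h_prev), enc_states)  # attention over encoder
        c_bwd = self.attend(self.bwd_proj(h_prev), bwd_states)  # attention over backward decoder
        return self.cell(torch.cat([y_emb, c_enc, c_bwd], dim=-1), h_prev)


class CoGraphGCNLayer(nn.Module):
    """One GCN layer over the relation co-occurrence graph; the symmetric
    normalization follows the standard Kipf & Welling formulation."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, rel_emb: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Add self-loops, then symmetrically normalize: D^{-1/2} (A + I) D^{-1/2}.
        a = adj + torch.eye(adj.size(0), device=adj.device)
        d = a.sum(dim=-1).rsqrt()
        a_norm = d.unsqueeze(1) * a * d.unsqueeze(0)
        return torch.relu(self.linear(a_norm @ rel_emb))
```

In this sketch, the extra context vector computed from `bwd_states` is what lets right-to-left information temper left-to-right decoding errors, while stacking `CoGraphGCNLayer` over co-occurrence statistics yields relation embeddings that encode which relations tend to appear together in a sentence.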
