Abstract

Joint extraction of entities and overlapping relations has attracted considerable attention in recent research. Existing relation extraction methods rely on training sets labeled by distant supervision for supervised relation extraction. However, these methods cannot exploit large-scale unlabeled data, and the quality of the labeled data cannot be guaranteed. Moreover, because overlapping relations are relatively complex, accurate joint entity-relation extraction is difficult. In this study, we propose an end-to-end neural network model (BERT-JEORE) for the joint extraction of entities and overlapping relations. First, we use a BERT-based parameter-sharing layer to capture the joint features of entities and overlapping relations. Then, we implement a source-target BERT model that assigns an entity label to each token in a sentence, thereby expanding the amount of labeled data and improving its quality. Finally, we design a three-step overlapping relation extraction model to predict the relations between all entity pairs. Experiments conducted on two public datasets show that BERT-JEORE achieves state-of-the-art performance and outperforms the baseline models by a significant margin. Further analysis shows that our model effectively captures different types of overlapping relational triplets in a sentence.
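
The paper's architecture is not reproduced here, but as a rough illustration of the setup the abstract describes (a shared BERT encoder feeding a token-level entity tagging head and a pairwise relation scorer over candidate entity pairs), the following minimal PyTorch sketch may be helpful. The class name `JointEntityRelationSketch`, its heads, and all hyperparameters are hypothetical; the sketch assumes the HuggingFace `transformers` and `torch` libraries and does not implement the paper's source-target BERT model or its three-step relation extraction procedure.

```python
# Minimal, illustrative sketch of joint entity/relation extraction on a
# shared BERT encoder. This is NOT the BERT-JEORE implementation; the label
# sets and the simple bilinear pairwise relation scorer are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel


class JointEntityRelationSketch(nn.Module):
    def __init__(self, num_entity_labels: int, num_relations: int,
                 bert_name: str = "bert-base-cased"):
        super().__init__()
        # Shared (parameter-sharing) BERT encoder used by both subtasks.
        self.encoder = BertModel.from_pretrained(bert_name)
        hidden = self.encoder.config.hidden_size
        # Token-level entity tagging head (e.g., BIO-style labels).
        self.entity_head = nn.Linear(hidden, num_entity_labels)
        # Bilinear scorer applied to every (head, tail) token pair; because a
        # token can appear in many pairs, one entity can take part in several
        # predicted triplets, which accommodates overlapping relations.
        self.rel_head = nn.Bilinear(hidden, hidden, num_relations)

    def forward(self, input_ids, attention_mask):
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        entity_logits = self.entity_head(h)          # (B, T, num_entity_labels)
        B, T, H = h.shape
        # Enumerate all token pairs and score each for every relation type.
        head = h.unsqueeze(2).expand(B, T, T, H).reshape(-1, H)
        tail = h.unsqueeze(1).expand(B, T, T, H).reshape(-1, H)
        relation_logits = self.rel_head(head, tail).view(B, T, T, -1)
        return entity_logits, relation_logits        # (B, T, T, num_relations)
```

Scoring every (head, tail) pair from one shared encoder is a simple way to let the entity and relation subtasks share features while still allowing a single entity to participate in multiple triplets, which is the overlapping-relation setting the abstract targets.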
