Abstract

In relation extraction with distant supervision, noisy labels are a bottleneck that hinders the performance of trained models. Existing neural models address this problem with attention mechanisms or multi-instance/multi-label learning to select sentences that are likely to express the target relations. However, they focus mainly on the structural information of entity pairs and ignore the semantic information carried by relation-clue words. Furthermore, these models do not consider the semantic scenario of an entity pair, even though the same entity pair may express different relations in different semantic scenarios. To bridge this gap, this paper proposes a novel model for relation extraction based on frame semantics, namely Context-aware based on Frame-semantics for Distantly Supervised Relation Extraction (CFSRE). The model combines a hierarchical neural network architecture with the FrameNet semantic knowledge base to cope with noisy labels and to enrich the semantic representation of entity pairs. More specifically, a simple and effective instance selection method is used to select informative positive instances for model training. In addition, we propose a novel sentence representation method that combines the sentence context representation with a frame-semantic representation of the entities. We find that this joint representation leads to better performance because it yields a more comprehensive semantic representation of text instances. A hierarchical attention mechanism is then designed to select the most informative features for relation extraction, and finally a softmax classifier is used to predict the relation. Experiments were conducted on two widely used benchmark datasets, and the results demonstrate that the proposed method significantly outperforms state-of-the-art methods for distantly supervised relation extraction.
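The abstract does not give implementation details, so the following PyTorch sketch is only an illustrative guess at how the described pieces might fit together: a joint sentence representation (a sentence context vector concatenated with a frame-semantic vector for the entity pair) followed by hierarchical attention over the instances in a bag and a softmax classifier. All class names, dimensions, and tensor layouts here are hypothetical assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CFSRESketch(nn.Module):
    """Illustrative sketch (not the authors' implementation): joint
    sentence/frame representation with word- and sentence-level attention."""
    def __init__(self, sent_dim=230, frame_dim=50, num_relations=53):
        super().__init__()
        joint_dim = sent_dim + frame_dim           # concatenated representation
        self.word_att = nn.Linear(sent_dim, 1)     # word-level attention (assumed form)
        self.sent_att = nn.Linear(joint_dim, 1)    # sentence-level attention (assumed form)
        self.classifier = nn.Linear(joint_dim, num_relations)

    def forward(self, word_states, frame_vecs):
        # word_states: (bag_size, seq_len, sent_dim) contextual word states
        # frame_vecs:  (bag_size, frame_dim) frame-semantic vectors for the entity pair
        # 1) word-level attention -> one context vector per sentence
        w_scores = self.word_att(word_states).squeeze(-1)       # (bag, seq_len)
        w_weights = F.softmax(w_scores, dim=-1).unsqueeze(-1)   # (bag, seq_len, 1)
        sent_vecs = (w_weights * word_states).sum(dim=1)        # (bag, sent_dim)
        # 2) joint representation: sentence context + frame semantics
        joint = torch.cat([sent_vecs, frame_vecs], dim=-1)      # (bag, joint_dim)
        # 3) sentence-level attention over the bag of instances
        s_scores = self.sent_att(joint).squeeze(-1)             # (bag,)
        s_weights = F.softmax(s_scores, dim=0).unsqueeze(-1)    # (bag, 1)
        bag_vec = (s_weights * joint).sum(dim=0)                # (joint_dim,)
        # 4) softmax classifier over relation labels
        return F.log_softmax(self.classifier(bag_vec), dim=-1)

# Toy usage: a bag of 4 sentences, each 20 tokens long.
model = CFSRESketch()
word_states = torch.randn(4, 20, 230)
frame_vecs = torch.randn(4, 50)
print(model(word_states, frame_vecs).shape)  # torch.Size([53])
```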
