Abstract

Distant supervision (DS) has become an efficient approach for relation extraction (RE) to alleviate the lack of labeled examples in supervised learning. In this paper, we propose a novel neural RE model that combines a bidirectional gated recurrent unit model with a form of hierarchical attention that is better suited to RE. We demonstrate that an additional attention mechanism called piecewise attention, which builds upon segment-level representations, significantly enhances the performance of the distantly supervised relation extraction task. Our piecewise attention mechanism not only captures crucial segments in each sentence but also reflects the direction of relations between two entities. Furthermore, we propose a contextual inference method that can infer the most likely positive examples of an entity pair in bags with very limited contextual information. In addition, we provide an annotated dataset without false positive examples based on the Riedel testing dataset, and report on the actual performance of several RE models. The experimental results show that our proposed methods outperform the previous state-of-the-art baselines on both the original and annotated datasets for the distantly supervised RE task.

Highlights

  • Distant supervision (DS) is a class of weakly supervised methods [1] and has become a popular approach for relation extraction (RE) to alleviate the lack of labeled examples in supervised learning

  • We propose a novel neural RE model that combines a bidirectional gated recurrent unit (BiGRU) sequence model with a form of hierarchical attention that is better suited to RE

  • Our contributions can be summarized as follows: (a) a novel BiGRU model combined with an additional attention mechanism called piecewise attention for distantly supervised RE; (b) a contextual inference method for improving bag label prediction; (c) an annotated dataset of 5,863 sentences, which is checked by annotators for false positive examples; and (d) experimental results showing that the proposed models outperform various state-of-the-art baselines


Introduction

Distant supervision (DS) is a class of weakly supervised methods [1] and has become a popular approach for relation extraction (RE) to alleviate the lack of labeled examples in supervised learning. However, DS often introduces noise into the generated training data. This approach can produce false positives, as not every mention of an entity pair in a sentence means that a relation is expressed. The original assumption of DS [2] held that all sentences containing a known relation (e.g., in Freebase) might be potential true positive relation mentions. This assumption is too strong and causes the issue of incorrect labels. Riedel et al. [5], Hoffmann et al. [6], and Surdeanu et al. [7] introduced a series of models casting DS as a multiple-instance learning problem [8]. In this multi-instance setting, the training set contains many entity-pair bags, and each bag consists of many relation mentions. Each relation mention is an occurrence of a pair of entities with the source sentence. The labels of the bags are known; the labels of the relation mentions in these bags are unknown.
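The bag construction described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper `build_bags`, the mention format `(head, tail, sentence)`, and the `kb_relations` mapping are all assumed names for exposition.

```python
from collections import defaultdict

def build_bags(mentions, kb_relations):
    """Group distantly supervised relation mentions into entity-pair bags.

    Hypothetical helper for illustration: `mentions` is a list of
    (head, tail, sentence) tuples; `kb_relations` maps an entity pair
    (head, tail) to its relation label from a knowledge base such as
    Freebase. Under the DS assumption, every sentence mentioning a known
    pair is aligned with that pair's relation, so bag-level labels are
    known while mention-level labels remain latent (and some mentions
    may be false positives).
    """
    bags = defaultdict(list)
    for head, tail, sentence in mentions:
        if (head, tail) in kb_relations:
            bags[(head, tail)].append(sentence)
    # Attach the KB-derived label to each bag of sentences.
    return {pair: (kb_relations[pair], sents) for pair, sents in bags.items()}
```

For example, both "Obama was born in Hawaii" and "Obama visited Hawaii" would land in the same bag labeled `place_of_birth`, even though only the first sentence actually expresses that relation; this is exactly the label noise that multi-instance learning and attention-based models aim to handle.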

