Abstract

In the field of distantly supervised relation extraction, the piecewise convolutional neural network (PCNN) is commonly employed to capture local features of sentences and has achieved good results. However, existing PCNN-based methods cannot capture long-distance dependencies within sentences, nor can they distinguish the influence of the three PCNN segments on relation classification, which may cause vital information to be missed. To address these two issues, we propose a new model with multi-head self-attention and a gate mechanism for distantly supervised relation extraction. First, to capture long-distance dependencies within sentences, we employ an internal multi-head self-attention layer in the PCNN model, which can gather information from different representation subspaces. Second, to distinguish the influence of the three segments of the piecewise max-pooling output on relation classification, a gate mechanism is introduced to assign different weights to the three segments and highlight the important ones. A series of experiments shows that the proposed model outperforms previous approaches on every evaluation criterion.

Keywords: Multi-head self-attention, Gate mechanism, Relation extraction, Distantly supervised
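The sketch below illustrates the two ideas the abstract describes: multi-head self-attention applied to the convolutional feature map, and a gate that weights the three piecewise max-pooled segments before classification. It is a minimal illustration only; the layer sizes, the placement of the attention block, and names such as GatedAttentivePCNN are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class GatedAttentivePCNN(nn.Module):
    """Minimal sketch: PCNN encoder + internal multi-head self-attention + segment gate."""
    def __init__(self, emb_dim=60, num_filters=230, num_heads=4, num_relations=53):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size=3, padding=1)
        # multi-head self-attention over the convolved token features
        # (captures long-distance interactions in different representation subspaces)
        self.attn = nn.MultiheadAttention(num_filters, num_heads, batch_first=True)
        # gate producing one weight per segment (head / middle / tail)
        self.gate = nn.Linear(3 * num_filters, 3)
        self.classifier = nn.Linear(3 * num_filters, num_relations)

    def forward(self, x, seg_mask):
        # x: (batch, seq_len, emb_dim); seg_mask: (batch, 3, seq_len) one-hot segment masks
        h = torch.tanh(self.conv(x.transpose(1, 2))).transpose(1, 2)   # (B, L, F)
        h, _ = self.attn(h, h, h)                                      # long-range dependencies
        # piecewise max-pooling: pool each of the three segments separately
        segs = []
        for i in range(3):
            mask = seg_mask[:, i, :].unsqueeze(-1)                     # (B, L, 1)
            segs.append(h.masked_fill(mask == 0, float('-inf')).max(dim=1).values)
        pooled = torch.stack(segs, dim=1)                              # (B, 3, F)
        # gate: weight the three segments instead of treating them equally
        g = torch.sigmoid(self.gate(pooled.flatten(1))).unsqueeze(-1)  # (B, 3, 1)
        return self.classifier((g * pooled).flatten(1))

# Toy usage with a 40-token sentence split into three entity-delimited segments.
model = GatedAttentivePCNN()
x = torch.randn(2, 40, 60)
seg = torch.zeros(2, 3, 40)
seg[:, 0, :10] = 1; seg[:, 1, 10:25] = 1; seg[:, 2, 25:] = 1
logits = model(x, seg)   # (2, num_relations)
```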
