Abstract
Current mainstream distantly supervised relation extraction methods suffer from several problems: coarse-grained encoding of contextual feature information, difficulty in capturing long-term dependencies within a sentence, and difficulty in encoding prior structural knowledge. To address these problems, we propose a feature-level distantly supervised relation extraction model, DiSAN-2CNN, in which a multi-dimensional self-attention mechanism encodes the features of individual words and the DiSAN-2CNN architecture encodes the sentence, capturing long-term dependencies, prior structural knowledge, temporal ordering, and entity dependencies within the sentence. Experiments on the NYT-Freebase benchmark dataset demonstrate that the proposed feature-level DiSAN-2CNN model outperforms two current state-of-the-art distantly supervised relation extraction models, PCNN+ATT and ResCNN-9, and that it generalizes well while requiring minimal hand-crafted feature engineering.
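The abstract does not give the exact layer configuration, so the following is only a minimal sketch of the general idea: a DiSAN-style multi-dimensional (feature-wise) self-attention layer refining word features, followed by parallel convolutional encoders over the sentence. The kernel sizes, filter count, and class names here are hypothetical choices for illustration, not the paper's reported settings.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiDimSelfAttention(nn.Module):
    """Feature-wise (multi-dimensional) self-attention in the style of DiSAN:
    each token pair gets a vector of alignment scores, one per feature
    dimension, instead of a single scalar weight."""
    def __init__(self, dim):
        super().__init__()
        self.w1 = nn.Linear(dim, dim, bias=False)
        self.w2 = nn.Linear(dim, dim, bias=False)
        self.b = nn.Parameter(torch.zeros(dim))

    def forward(self, x):                           # x: (batch, seq_len, dim)
        # Pairwise scores: (batch, seq_len, seq_len, dim) via broadcasting.
        f = torch.tanh(self.w1(x).unsqueeze(2)       # target tokens
                       + self.w2(x).unsqueeze(1)     # source tokens
                       + self.b)
        a = F.softmax(f, dim=2)                      # normalize over sources
        # Dimension-wise weighted sum of source token features.
        return (a * x.unsqueeze(1)).sum(dim=2)       # (batch, seq_len, dim)

class DiSAN2CNNSketch(nn.Module):
    """Attention-refined word features fed to parallel 1-D convolutions,
    max-pooled into a fixed-size sentence encoding."""
    def __init__(self, dim, n_filters=230, kernel_sizes=(2, 3)):
        super().__init__()
        self.attn = MultiDimSelfAttention(dim)
        self.convs = nn.ModuleList(
            nn.Conv1d(dim, n_filters, k, padding=k // 2) for k in kernel_sizes)

    def forward(self, x):                            # x: (batch, seq_len, dim)
        h = self.attn(x).transpose(1, 2)             # (batch, dim, seq_len)
        pooled = [c(h).max(dim=2).values for c in self.convs]
        return torch.cat(pooled, dim=1)              # sentence representation
```

The key difference from ordinary self-attention is visible in the softmax: because the scores keep a full feature dimension, each dimension of a word vector can attend to a different mix of context words, which is what gives the encoding its finer granularity.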