Abstract
Dependency analysis helps neural networks capture semantic features in sentences and thus extract entity relations. Hard and soft pruning strategies based on dependency-tree encoding have been proposed to balance the beneficial additional information against the adverse interference it introduces into extraction tasks. This paper proposes a new model based on graph convolutional networks that uses a variety of representations describing the dependency tree from different perspectives and combines these representations to obtain a better sentence representation for relation classification. A newly defined module uses an attention mechanism to capture deeper semantic features from the context representation as the global semantic features of the input text, helping the model capture deeper sentence-level semantic information for relation extraction. To obtain more information about a given entity pair from the input sentence, the authors also model implicit co-references (mentions) of the entities. The model can thereby extract, to the greatest extent, the semantic features relevant to the relationship between entities. Experimental results show that the model achieves good results on the SemEval-2010 Task 8 and KBP37 datasets.
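The two building blocks the abstract names, graph convolution over a dependency tree and attention-based pooling of the context representation, can be sketched as below. The function names, dimensions, and toy dependency arcs are illustrative assumptions for exposition, not the paper's actual implementation:

```python
import numpy as np

def gcn_layer(adj, h, w):
    """One graph-convolution layer over a dependency graph.

    adj: (n, n) adjacency built from dependency arcs plus self-loops
    h:   (n, d_in) token representations
    w:   (d_in, d_out) learned weight matrix
    """
    deg = adj.sum(axis=1, keepdims=True)           # node degrees
    a_hat = adj / np.maximum(deg, 1.0)             # row-normalized adjacency
    return np.maximum(a_hat @ h @ w, 0.0)          # ReLU(A_hat · H · W)

def attention_pool(h, q):
    """Attention pooling: weight tokens by similarity to a query vector q
    and return a single sentence-level representation."""
    scores = h @ q                                 # (n,) raw attention scores
    scores = scores - scores.max()                 # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # softmax weights
    return alpha @ h                               # weighted sum over tokens

# Toy 4-token sentence; arcs come from a hypothetical dependency parse.
rng = np.random.default_rng(0)
n, d = 4, 8
adj = np.eye(n)                                    # self-loops
for head, dep in [(1, 0), (1, 2), (2, 3)]:         # treat arcs as undirected
    adj[head, dep] = adj[dep, head] = 1.0

h0 = rng.standard_normal((n, d))                   # initial token embeddings
w = rng.standard_normal((d, d))
h1 = gcn_layer(adj, h0, w)                         # contextualized tokens
sent = attention_pool(h1, rng.standard_normal(d))  # global sentence vector
print(sent.shape)
```

In the full model, such a sentence vector would be concatenated with entity-span representations and fed to a classifier over relation labels; this sketch only shows the propagation and pooling steps.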
Published in: IET Cyber-Physical Systems: Theory & Applications