Abstract

Relation extraction is an important task in information extraction. Most existing approaches to Chinese relation extraction take words as input; they are therefore highly dependent on the quality of word segmentation and suffer from the ambiguity of polysemous words. To address this, a multi-feature fusion model based on character input is presented, which integrates character-level features, word-level features, and entity-sense features into a deep neural network. Specifically, to alleviate the ambiguity of polysemy, entity senses are introduced as external linguistic knowledge that supplies additional information for interpreting the meaning of an entity in a given sentence. An Attention-Based Bidirectional Long Short-Term Memory network (Att-BLSTM) is used to capture character-level features, and a convolutional layer is built on top of the Att-BLSTM (C-Att-BLSTM) to capture word-level features and obtain more structural information. Experiments on the public SanWen dataset show that the proposed model achieves state-of-the-art results.
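
As a rough illustration of the architecture the abstract describes, the sketch below wires a character-level BiLSTM with additive attention, a convolutional layer over its hidden states, and a classifier that fuses character-level, word-level, and entity-sense features. All hyperparameters, the specific attention form, and the fusion-by-concatenation step are assumptions made for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CAttBLSTM(nn.Module):
    """Illustrative sketch of a C-Att-BLSTM-style relation classifier.

    Dimensions, the attention mechanism, and the feature-fusion strategy
    are assumptions; only the overall arrangement (character BiLSTM +
    attention, convolution on top, entity-sense fusion) follows the abstract.
    """

    def __init__(self, char_vocab_size, num_relations,
                 embed_dim=100, hidden_dim=128,
                 num_filters=128, kernel_size=3, sense_dim=50):
        super().__init__()
        # Character-level input avoids reliance on word segmentation.
        self.char_embed = nn.Embedding(char_vocab_size, embed_dim, padding_idx=0)
        # Bidirectional LSTM over the character sequence (the BLSTM part).
        self.blstm = nn.LSTM(embed_dim, hidden_dim,
                             batch_first=True, bidirectional=True)
        # Additive attention over BLSTM states -> character-level feature.
        self.att_vector = nn.Linear(2 * hidden_dim, 1, bias=False)
        # Convolution over BLSTM states to capture local, word-like
        # n-gram structure (the "C" in C-Att-BLSTM).
        self.conv = nn.Conv1d(2 * hidden_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        # Classifier fuses character, word, and entity-sense features.
        self.classifier = nn.Linear(2 * hidden_dim + num_filters + sense_dim,
                                    num_relations)

    def forward(self, char_ids, sense_feats):
        # char_ids: (batch, seq_len); sense_feats: (batch, sense_dim)
        h, _ = self.blstm(self.char_embed(char_ids))                 # (B, T, 2H)
        att = torch.softmax(self.att_vector(torch.tanh(h)), dim=1)   # (B, T, 1)
        char_feat = (att * h).sum(dim=1)                             # (B, 2H)
        conv_out = F.relu(self.conv(h.transpose(1, 2)))              # (B, F, T)
        word_feat = conv_out.max(dim=2).values                       # (B, F)
        fused = torch.cat([char_feat, word_feat, sense_feats], dim=1)
        return self.classifier(fused)
```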
