Abstract
In recent years, neural networks have been widely applied to natural language processing, and in particular to sentence similarity modeling. Most previous studies focus only on the input sentence itself and ignore the commonsense knowledge related to it, even though such knowledge can be remarkably useful for understanding sentence semantics. This paper proposes CK-Encoder, a model that effectively acquires commonsense knowledge to improve sentence similarity modeling. Specifically, the model first generates a commonsense knowledge graph for the input sentence and then encodes this graph with a graph convolutional network. In addition, CKER, a framework that combines CK-Encoder with a sentence encoder, is introduced. Experiments on two sentence similarity tasks demonstrate that CK-Encoder effectively acquires commonsense knowledge and improves a model's ability to understand sentences.
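To make the described architecture concrete, the following is a minimal sketch (not the authors' implementation) of how a CK-Encoder-style module might apply a graph convolutional network to a commonsense knowledge graph and fuse the result with a sentence encoder for similarity scoring. All class names, dimensions, the GRU sentence encoder, and the fusion-by-concatenation choice are assumptions; the abstract does not specify them.

```python
# Hypothetical sketch of a CKER-style model: a GCN over a commonsense knowledge
# graph (CK-Encoder) combined with a sentence encoder, then cosine similarity.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph-convolution step: H' = ReLU(A_hat H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, node_feats, adj_norm):
        # adj_norm: normalized adjacency matrix of the commonsense knowledge graph
        return F.relu(adj_norm @ self.linear(node_feats))


class CKEncoder(nn.Module):
    """Encodes the sentence's commonsense knowledge graph (assumed layout)."""
    def __init__(self, dim, num_layers=2):
        super().__init__()
        self.layers = nn.ModuleList([GCNLayer(dim, dim) for _ in range(num_layers)])

    def forward(self, node_feats, adj_norm):
        h = node_feats
        for layer in self.layers:
            h = layer(h, adj_norm)
        return h.mean(dim=0)  # pool graph nodes into a single knowledge vector


class CKER(nn.Module):
    """Fuses sentence-encoder output with CK-Encoder output (assumed fusion)."""
    def __init__(self, dim):
        super().__init__()
        self.sentence_encoder = nn.GRU(dim, dim, batch_first=True)
        self.ck_encoder = CKEncoder(dim)
        self.fuse = nn.Linear(2 * dim, dim)

    def encode(self, token_embeds, node_feats, adj_norm):
        # token_embeds: (seq_len, dim); node_feats: (num_nodes, dim)
        _, h_n = self.sentence_encoder(token_embeds.unsqueeze(0))
        sent_vec = h_n.squeeze(0).squeeze(0)
        know_vec = self.ck_encoder(node_feats, adj_norm)
        return self.fuse(torch.cat([sent_vec, know_vec], dim=-1))

    def forward(self, a, b):
        # a, b: (token_embeds, node_feats, adj_norm) triples for the two sentences
        return F.cosine_similarity(self.encode(*a), self.encode(*b), dim=0)
```

As a usage sketch, each input sentence would be paired with node features and a normalized adjacency matrix extracted from a commonsense resource, and the model would return a similarity score between the two fused representations.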