Abstract

Sentence semantic matching requires understanding the semantic relationship between two sentences, and it underlies a variety of natural language tasks such as paraphrase identification, natural language inference, and question answering. The underlying structure of a sentence is usually not strictly sequential but hierarchical. This paper studies how to better model the hierarchical structure of sentences for semantic matching. Recent research suggests that BERT implicitly captures classical, tree-like structures. Based on this observation, this paper proposes to further strengthen the modeling of hierarchical language structure with an advanced LSTM variant, the Ordered Neurons LSTM (ON-LSTM), which introduces a syntax-oriented inductive bias for composing classical tree structures. Experimental results demonstrate that the proposed approach, by enhancing the modeling of hierarchical language structure, significantly improves the performance of sentence semantic matching.
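The syntax-oriented inductive bias of ON-LSTM comes from its "master" forget and input gates, which are activated through a cumulative softmax (cumax) so that neurons are erased or written in contiguous, ordered blocks, mimicking the closing and opening of syntactic constituents. The following is a minimal numpy sketch of one ON-LSTM cell step; the weight layout and function names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cumax(x):
    # Cumulative softmax: a monotonically non-decreasing gate in [0, 1],
    # used to split the hidden state into "keep" and "overwrite" segments.
    return np.cumsum(softmax(x))

def on_lstm_cell(x, h_prev, c_prev, W, b):
    # W projects [x; h_prev] to 6 * hidden units: the standard LSTM gates
    # (forget f, input i, output o, candidate g) plus the two master gates.
    # This layout is an assumption for illustration.
    z = W @ np.concatenate([x, h_prev]) + b
    f, i, o, g, mf, mi = np.split(z, 6)
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)
    g = np.tanh(g)

    f_tilde = cumax(mf)         # master forget: rises 0 -> 1 (high neurons kept)
    i_tilde = 1.0 - cumax(mi)   # master input: falls 1 -> 0 (low neurons written)
    omega = f_tilde * i_tilde   # overlap where both master gates are active

    # Inside the overlap the standard gates operate; outside it, the master
    # gates force whole blocks of neurons to be copied or overwritten.
    f_hat = f * omega + (f_tilde - omega)
    i_hat = i * omega + (i_tilde - omega)
    c = f_hat * c_prev + i_hat * g
    h = o * np.tanh(c)
    return h, c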
