Abstract

Natural language inference (NLI) is a challenging task and a foundation for realizing artificial intelligence. With the availability of large-scale annotated corpora, neural network models can be widely applied to natural language inference. In this paper, we propose a long short-term memory (LSTM) model with a Sentence Fusion architecture for the NLI task. Instead of modifying the internal structure of the LSTM recurrent neural network (RNN), we focus on making full use of the distributed sentence representations generated by the LSTM encoder. We improve the performance of basic LSTM recurrent neural networks on the Stanford Natural Language Inference (SNLI) corpus by adding Sentence Fusion modules that enrich the distributed sentence representations generated by the LSTM. Our results demonstrate that the LSTM with Sentence Fusion, which reads a premise and a hypothesis to produce a final fusion representation from which a three-way classifier predicts the label, outperforms both LSTM RNN encoders and the Lexicalized classifier baseline.
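To make the described pipeline concrete, the following is a minimal PyTorch sketch of the overall architecture: a shared LSTM encoder produces a vector for the premise and the hypothesis, a fusion step combines the two vectors, and a three-way classifier predicts entailment, neutral, or contradiction. The specific fusion operations used here (concatenation plus element-wise difference and product) and all dimensions are assumptions for illustration; the paper's actual Sentence Fusion modules may differ.

```python
import torch
import torch.nn as nn


class LSTMSentenceFusion(nn.Module):
    """Hypothetical sketch of an LSTM encoder with sentence fusion,
    followed by a three-way softmax classifier for NLI."""

    def __init__(self, vocab_size, embed_dim=300, hidden_dim=100, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Classifier input is 4 * hidden_dim because the fusion below
        # concatenates four interaction features.
        self.classifier = nn.Sequential(
            nn.Linear(4 * hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, num_classes),
        )

    def encode(self, tokens):
        # Use the final hidden state as the distributed sentence representation.
        _, (h, _) = self.encoder(self.embed(tokens))
        return h[-1]

    def forward(self, premise, hypothesis):
        p = self.encode(premise)
        h = self.encode(hypothesis)
        # Fusion (assumed form): concatenation enriched with element-wise
        # difference and product interactions between the two sentence vectors.
        fused = torch.cat([p, h, p - h, p * h], dim=-1)
        return self.classifier(fused)  # logits over {entailment, neutral, contradiction}


# Usage sketch with toy token-id batches (batch of 2, sequence length 5).
model = LSTMSentenceFusion(vocab_size=10000)
premise = torch.randint(0, 10000, (2, 5))
hypothesis = torch.randint(0, 10000, (2, 5))
logits = model(premise, hypothesis)  # shape: (2, 3)
```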
