Abstract

Question answering has become an important task in natural language understanding, since many natural language processing problems can be posed as question answering. Recurrent neural networks (RNNs) are a standard baseline for sequence prediction tasks, including question answering. Recurrent networks can retain global information over long spans, but they do not preserve local information well. To address this problem, we propose a model that combines recurrent and convolutional networks and can be trained end to end with backpropagation. Our experiments on the bAbI dataset show that this model achieves a significant improvement over an RNN baseline on the question answering task.
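The abstract does not spell out the architecture, so the sketch below is only a rough illustration of one plausible way to combine a convolutional layer (for local features) with a recurrent layer (for global context) in a bAbI-style single-word-answer setting; the layer sizes, the `ConvRecurrentQA` class, and the toy training step are assumptions, not the authors' model.

```python
# Minimal sketch (assumption, not the paper's exact architecture) of a
# convolutional-recurrent model for bAbI-style question answering.
# The story and question are assumed to be concatenated into one token
# sequence; a 1-D convolution captures local n-gram features and a GRU
# aggregates them into a global state used to pick a single-word answer.
import torch
import torch.nn as nn


class ConvRecurrentQA(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, conv_channels=64,
                 kernel_size=3, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Convolution over the time dimension extracts local features.
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size,
                              padding=kernel_size // 2)
        # GRU aggregates the convolved features into a global representation.
        self.rnn = nn.GRU(conv_channels, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):                        # tokens: (batch, seq_len)
        x = self.embed(tokens)                        # (batch, seq_len, embed_dim)
        x = torch.relu(self.conv(x.transpose(1, 2)))  # (batch, channels, seq_len)
        _, h = self.rnn(x.transpose(1, 2))            # h: (1, batch, hidden_dim)
        return self.out(h.squeeze(0))                 # (batch, vocab_size)


# Toy end-to-end training step; random tensors stand in for real bAbI batches.
if __name__ == "__main__":
    vocab_size, seq_len, batch = 50, 30, 8
    model = ConvRecurrentQA(vocab_size)
    optim = torch.optim.Adam(model.parameters(), lr=1e-3)
    tokens = torch.randint(1, vocab_size, (batch, seq_len))
    answers = torch.randint(1, vocab_size, (batch,))
    loss = nn.functional.cross_entropy(model(tokens), answers)
    loss.backward()  # gradients flow through both conv and recurrent layers
    optim.step()
    print(f"loss: {loss.item():.3f}")
```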
