Abstract

Questions and answers in question answering (QA) datasets are composed of multiple sentences, so a model must identify key words within each sentence and key sentences within each paragraph for QA classification. A hierarchical attention mechanism is well suited to such tasks: it attends to key words at the word level and to key sentences at the sentence level. Inspired by quantum theory, we introduce weak measurement under the two-state vector formalism to model the word-level attention mechanism (WMATT), and a density matrix to model the sentence-level attention mechanism (DMATT). The resulting bidirectional LSTM model, WMATT-DMATT BiLSTM, combines the weak-measurement word-level attention with the density-matrix sentence-level attention. Experiments on multiple QA datasets show that our model is competitive with the benchmark models, and it achieves state-of-the-art performance on the WikiQA dataset compared to the baseline models. The results suggest that attention based on weak measurement suits the micro level, corresponding to words, while attention based on the density matrix suits the macro level, corresponding to sentences.
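The abstract does not give the paper's exact equations, but the hierarchical structure it describes can be sketched. The following numpy example is a minimal, hypothetical illustration: word-level attention is simplified to a plain softmax query (the weak-measurement formulation is not reproduced here), and sentence-level attention builds a density matrix as a mixture of projectors onto normalized sentence states, then scores each sentence by its measurement probability under that matrix. All function names and dimensions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def word_attention(H, u):
    # H: (num_words, d) word hidden states (e.g. BiLSTM outputs).
    # u: (d,) learned query vector. Simplified stand-in for WMATT.
    a = softmax(H @ u)          # attention weight per word
    return a @ H                # weighted sum -> sentence vector (d,)

def density_matrix_attention(S, q):
    # S: (num_sents, d) sentence vectors, q: (d,) query.
    # Normalize each sentence vector to a unit "state" vector.
    states = S / np.linalg.norm(S, axis=1, keepdims=True)
    # Density matrix: probability-weighted mixture of projectors |s><s|.
    p = softmax(states @ q)
    rho = sum(pi * np.outer(s, s) for pi, s in zip(p, states))
    # Measurement probability <s|rho|s> of each sentence state
    # serves as its sentence-level attention score.
    scores = np.array([s @ rho @ s for s in states])
    a = softmax(scores)
    return a @ S                # document representation (d,)

d = 8
H1 = rng.normal(size=(5, d))    # hidden states for a 5-word sentence
H2 = rng.normal(size=(7, d))    # hidden states for a 7-word sentence
u = rng.normal(size=d)
S = np.stack([word_attention(H1, u), word_attention(H2, u)])
doc = density_matrix_attention(S, u)
print(doc.shape)                # (8,)
```

Because the sentence states are unit vectors and the mixture weights sum to one, the constructed density matrix has unit trace, matching the standard quantum-theoretic definition; the actual WMATT/DMATT parameterization in the paper may differ.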
