Abstract
Questions and answers in question answering (QA) datasets are composed of multiple sentences, so a model must identify key words within each sentence and key sentences within each paragraph for QA classification. A hierarchical attention mechanism suits such tasks well: it attends to key words at the word level and to key sentences at the sentence level. Inspired by quantum theory, we model the word-level attention mechanism with weak measurement under the two-state vector formalism (WMATT) and the sentence-level attention mechanism with a density matrix (DMATT). The resulting bidirectional LSTM model, WMATT-DMATT BiLSTM, combines the weak-measurement word-level attention with the density-matrix sentence-level attention. Experiments on multiple QA datasets show that our model is comparable to the baseline models, and it achieves state-of-the-art performance on the WikiQA dataset. The experiments suggest that attention based on weak measurement suits the micro level, corresponding to words, while attention based on the density matrix suits the macro level, corresponding to sentences.
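The two quantum-inspired attention mechanisms can be illustrated with a minimal numpy sketch. This is an assumption-laden toy, not the paper's exact formulation: the function names, the choice of observable, and the pre/post-selected states are all hypothetical, and a softmax is used to normalize the scores into attention weights.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention_weak_value(word_vecs, pre_state, post_state):
    """Word-level attention inspired by weak measurement (a sketch, not the
    paper's exact formula). For each word vector w, take the outer product
    |w><w| as the observable A and compute the weak value under the
    two-state vector formalism:
        A_w = <post| A |pre> / <post|pre>,
    then softmax the magnitudes to obtain attention weights."""
    overlap = post_state @ pre_state  # <post|pre>, assumed nonzero
    scores = np.array([(post_state @ w) * (w @ pre_state) / overlap
                       for w in word_vecs])
    return softmax(np.abs(scores))

def sentence_attention_density_matrix(sent_vecs, probs):
    """Sentence-level attention via a density matrix (sketch). Build the
    mixed state rho = sum_i p_i |s_i><s_i| from normalized sentence
    vectors, score each sentence by <s_j| rho |s_j>, and softmax."""
    units = np.array([s / np.linalg.norm(s) for s in sent_vecs])
    rho = sum(p * np.outer(s, s) for p, s in zip(probs, units))
    scores = np.array([s @ rho @ s for s in units])
    return softmax(scores)
```

In a full model, the word vectors would be BiLSTM hidden states, the pre- and post-selected states would be learned (e.g. derived from the question), and the mixture probabilities would be learned as well; here they are placeholders so the two scoring rules can be run in isolation.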