Abstract

Social media text can be semantically matched in different ways, such as paraphrase identification, answer selection, and community question answering. The performance of these semantic matching tasks depends largely on the underlying language modeling capability. Neural network based language models and probabilistic language models are the two main streams of language modeling approaches. However, little prior work has managed to unify them in a single framework while preserving probabilistic properties during neural network training. Motivated by recent advances in quantum-inspired neural networks for text representation learning, we fill this gap by resorting to density matrices, a key concept that describes both a quantum state and a quantum probability distribution. The state and probability views of density matrices are mapped, respectively, to the neural and probabilistic aspects of language models. Concretizing this state-probability duality for the semantic matching task, we build a unified neural-probabilistic language model through a quantum-inspired neural network. Specifically, we take the state view to construct a density matrix representation of a sentence, and exploit its probabilistic nature by extracting its main semantics, which form the basis of a legitimate quantum measurement. When matching two sentences, each sentence is measured against the main semantics of the other. This process is implemented in a neural structure, enabling end-to-end learning of the parameters. The learned density matrix representation reflects an authentic probability distribution over the semantic space throughout training. Experiments show that our model significantly outperforms a wide range of prominent classical and quantum-inspired baselines.
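To make the measurement-based matching idea in the abstract concrete, the following is a minimal non-neural sketch in NumPy. It assumes each sentence is given as a bag of word embeddings, builds a density matrix as a trace-normalized mixture of rank-1 projectors, takes the top eigenvectors as the "main semantics", and scores a pair by measuring each sentence's density matrix against the other's main semantics. The function names (`density_matrix`, `main_semantics`, `match_score`), the uniform word weighting, and the symmetric averaging of the two measurement directions are illustrative assumptions, not the paper's learned, end-to-end neural formulation.

```python
import numpy as np

def density_matrix(word_vectors, weights=None):
    """Build a sentence density matrix from word embeddings.

    Each word contributes a rank-1 projector |w><w|; their weighted mixture is a
    positive semi-definite matrix with unit trace, i.e. a valid quantum state.
    (Uniform weights are an assumption; the paper learns these end to end.)
    """
    n, _ = word_vectors.shape
    if weights is None:
        weights = np.full(n, 1.0 / n)
    vecs = word_vectors / np.linalg.norm(word_vectors, axis=1, keepdims=True)
    rho = sum(p * np.outer(v, v) for p, v in zip(weights, vecs))
    return rho / np.trace(rho)  # enforce unit trace

def main_semantics(rho, k):
    """Take the top-k eigenvectors of the density matrix as its main semantics."""
    _, eigvecs = np.linalg.eigh(rho)  # eigenvalues returned in ascending order
    return eigvecs[:, -k:]            # columns are the top-k eigenvectors

def match_score(rho_a, rho_b, k=3):
    """Measure each sentence against the other's main semantics.

    The measurement probability of projector |e><e| on state rho is <e|rho|e>;
    averaging both directions gives a symmetric matching score.
    """
    basis_a = main_semantics(rho_a, k)
    basis_b = main_semantics(rho_b, k)
    p_a_on_b = sum(e @ rho_a @ e for e in basis_b.T)
    p_b_on_a = sum(e @ rho_b @ e for e in basis_a.T)
    return 0.5 * (p_a_on_b + p_b_on_a)

# Toy usage with random vectors standing in for real word embeddings.
rng = np.random.default_rng(0)
sent_a = rng.normal(size=(5, 50))   # 5 words, 50-dimensional embeddings
sent_b = rng.normal(size=(7, 50))
score = match_score(density_matrix(sent_a), density_matrix(sent_b))
print(f"matching score: {score:.4f}")
```

In the full model, this pipeline is realized as differentiable neural layers, so the word weights and embeddings are trained jointly while the density matrix remains a legitimate probability distribution over the semantic space.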
