AAAI Publications, Thirty-Second AAAI Conference on Artificial Intelligence

End-to-End Quantum-like Language Models with Application to Question Answering
Peng Zhang, Jiabin Niu, Zhan Su, Benyou Wang, Liqun Ma, Dawei Song


Abstract


Language Modeling (LM) is a fundamental research topic in a range of areas. Recently, inspired by quantum theory, a novel Quantum Language Model (QLM) has been proposed for Information Retrieval (IR). In this paper, we aim to broaden the theoretical and practical basis of QLM. We develop a Neural Network based Quantum-like Language Model (NNQLM) and apply it to Question Answering. Specifically, based on word embeddings, we design a new density matrix, which represents a sentence (e.g., a question or an answer) and encodes a mixture of semantic subspaces. Such a density matrix, together with a joint representation of the question and the answer, can be integrated into neural network architectures (e.g., 2-dimensional convolutional neural networks). Experiments on the TREC-QA and WIKIQA datasets have verified the effectiveness of our proposed models.
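To make the density-matrix idea concrete, the sketch below builds a sentence representation as a mixture of rank-1 projectors onto normalized word-embedding vectors and forms a simple question-answer joint matrix. This is a minimal illustration of the construction the abstract describes, not the authors' exact implementation; the function names, uniform weighting, and the use of a plain matrix product for the joint representation are assumptions made here for clarity.

```python
import numpy as np

def sentence_density_matrix(embeddings, weights=None):
    """Density matrix for a sentence: a weighted mixture of rank-1
    projectors onto its L2-normalized word-embedding vectors."""
    E = np.asarray(embeddings, dtype=float)            # shape: (num_words, dim)
    E = E / np.linalg.norm(E, axis=1, keepdims=True)   # unit vectors |v_i>
    if weights is None:
        weights = np.full(len(E), 1.0 / len(E))        # uniform mixture (assumption)
    # rho = sum_i p_i |v_i><v_i|: symmetric, positive semi-definite, trace 1
    return np.einsum('i,ij,ik->jk', weights, E, E)

def joint_representation(rho_q, rho_a):
    """A simple joint question-answer representation (assumed here to be
    the matrix product of the two density matrices), which a 2-D CNN
    could take as a single-channel input."""
    return rho_q @ rho_a

# Toy usage with random 50-dimensional "embeddings"
rng = np.random.default_rng(0)
rho_q = sentence_density_matrix(rng.normal(size=(6, 50)))   # 6-word question
rho_a = sentence_density_matrix(rng.normal(size=(9, 50)))   # 9-word answer
M = joint_representation(rho_q, rho_a)                      # shape: (50, 50)
print(np.trace(rho_q), M.shape)                             # trace(rho_q) ~ 1.0
```

In this sketch the trace-1, positive semi-definite matrix plays the role of the "mixture of semantic subspaces" mentioned in the abstract, and the 50x50 joint matrix is the kind of 2-dimensional input a convolutional network can consume.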

Keywords


Question Answering; Quantum-like Language Model; Neural Network
