AAAI Publications, Twenty-Eighth AAAI Conference on Artificial Intelligence

Adaptive Multi-Compositionality for Recursive Neural Models with Applications to Sentiment Analysis
Li Dong, Furu Wei, Ming Zhou, Ke Xu

Last modified: 2014-06-21

Abstract


Recursive neural models have achieved promising results in many natural language processing tasks. The main difference among these models lies in the composition function, i.e., how to obtain the vector representation of a phrase or sentence from the representations of the words it contains. This paper introduces a novel Adaptive Multi-Compositionality (AdaMC) layer for recursive neural models. The basic idea is to use more than one composition function and to select among them adaptively depending on the input vectors. We present a general framework that models each semantic composition as a distribution over these composition functions. The composition functions and the parameters used for adaptive selection are learned jointly from data. We integrate AdaMC into existing recursive neural models and conduct extensive experiments on the Stanford Sentiment Treebank. The results show that AdaMC significantly outperforms state-of-the-art sentiment classification methods, pushing the best accuracy of sentence-level negative/positive classification from 85.4% up to 88.5%.
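The gating mechanism described in the abstract can be sketched as follows. This is a hypothetical minimal illustration of the idea, not the authors' implementation: the names (`W`, `S`, `compose`) and dimensions are assumptions. A parent vector is computed as a softmax-weighted mixture of several candidate composition functions, with the gate conditioned on the concatenated child vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
d, C = 4, 3  # embedding size, number of composition functions (illustrative)

# One weight matrix per candidate composition function, plus gating parameters.
W = rng.standard_normal((C, d, 2 * d)) * 0.1  # composition weights (assumed form)
S = rng.standard_normal((C, 2 * d)) * 0.1     # selection (gating) parameters

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def compose(left, right):
    """Combine two child vectors into a parent vector as an adaptive mixture."""
    x = np.concatenate([left, right])  # shared input to gate and functions
    p = softmax(S @ x)                 # distribution over composition functions
    outputs = np.tanh(W @ x)           # each function's composition, shape (C, d)
    return p @ outputs                 # expected composition, shape (d,)

parent = compose(rng.standard_normal(d), rng.standard_normal(d))
print(parent.shape)  # (4,)
```

In training, the gate parameters `S` and the composition weights `W` would be learned jointly by backpropagation through the tree, as the abstract describes; the hard- vs. soft-selection variants are omitted here for brevity.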

Keywords


recursive neural network; semantic composition; deep learning; sentiment classification
