AAAI Publications, Twenty-Eighth AAAI Conference on Artificial Intelligence

Instance-Based Domain Adaptation in NLP via In-Target-Domain Logistic Approximation
Rui Xia, Jianfei Yu, Feng Xu, Shumei Wang

Last modified: 2014-06-21

Abstract


In NLP, most existing domain adaptation studies are feature-based, while research on instance-based adaptation remains scarce. In this work, we propose a new instance-based adaptation model called in-target-domain logistic approximation (ILA). In ILA, we adapt the source-domain data to the target domain via a logistic approximation: the normalized in-target-domain probability is assigned as an instance weight to each source-domain training example, and an instance-weighted classification model is then trained for the cross-domain classification problem. Compared with previous techniques, ILA conducts instance adaptation in a dimensionality-reduced linear feature space to ensure efficiency in high-dimensional NLP tasks. The instance weights in ILA are learned by jointly leveraging the criteria of maximum likelihood and minimum statistical distance. Empirical results on two NLP tasks, text categorization and sentiment classification, show that ILA significantly outperforms state-of-the-art instance adaptation methods in cross-domain classification accuracy, parameter stability, and computational efficiency.
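The instance-weighting idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact ILA objective (which also incorporates a minimum-statistical-distance criterion and a dimensionality-reduced feature space): a logistic discriminator is trained to separate source from target data, its predicted in-target-domain probability is normalized into a weight for each source example, and those weights would then be fed into a weighted classifier. All function names and hyperparameters here are illustrative choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_iter=500):
    """Plain gradient-descent logistic regression (illustrative, not the
    paper's optimizer). y must be in {0, 1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_iter):
        p = sigmoid(X @ w + b)
        g = p - y                      # gradient of the log-loss per example
        w -= lr * (X.T @ g) / n
        b -= lr * g.sum() / n
    return w, b

def in_target_domain_weights(X_src, X_tgt):
    """Approximate normalized in-target-domain probabilities for source
    instances via a source-vs-target logistic discriminator (a sketch of the
    weighting step described in the abstract)."""
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([np.zeros(len(X_src)), np.ones(len(X_tgt))])
    w, b = fit_logistic(X, y)
    p_tgt = sigmoid(X_src @ w + b)     # p(target domain | x) for source data
    # normalize so the weights average to 1 over the source set
    return p_tgt / p_tgt.sum() * len(X_src)

# toy data: half the source set is shifted far from the target distribution
rng = np.random.default_rng(0)
X_tgt = rng.normal(0.0, 1.0, size=(100, 2))
X_src = np.vstack([rng.normal(0.0, 1.0, size=(50, 2)),
                   rng.normal(4.0, 1.0, size=(50, 2))])
weights = in_target_domain_weights(X_src, X_tgt)
```

On this toy data, source points lying inside the target cloud receive larger weights than the shifted ones, so a downstream classifier trained with these weights (e.g., via a `sample_weight` argument) would emphasize target-like source examples.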

Keywords


domain adaptation; instance adaptation; transfer learning
