Efficient Bayesian Task-Level Transfer Learning

Daniel M. Roy, Leslie P. Kaelbling

In this paper, we show how using the Dirichlet Process mixture model as a generative model of data sets provides a simple and effective method for transfer learning. In particular, we present a hierarchical extension of the classic Naive Bayes classifier that couples multiple Naive Bayes classifiers by placing a Dirichlet Process prior over their parameters, and we show how recent advances in approximate inference in the Dirichlet Process mixture model enable efficient inference. We evaluate the resulting model in a meeting domain, in which the system decides, based on a learned model of the user's behavior, whether to accept or reject a meeting request on the user's behalf. The extended model outperforms the standard Naive Bayes model by using data from other users to influence its predictions.
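The abstract describes the coupling only at a high level. As a rough illustration of the idea, the sketch below (not the authors' code) clusters users with a Chinese-restaurant-process prior and pools Naive Bayes counts within each cluster, so a data-poor user borrows statistics from users with similar behavior. It uses a collapsed Gibbs sweep rather than the paper's approximate inference method, assumes binary features and labels, and all names (ALPHA, BETA, gibbs_sweep, predict) are illustrative assumptions.

```python
# Sketch of DP-coupled Naive Bayes classifiers: users in the same DP cluster
# share one set of (integrated-out) Naive Bayes parameters.
import math
import numpy as np

rng = np.random.default_rng(0)
ALPHA, BETA = 1.0, 1.0            # DP concentration; Beta(BETA, BETA) pseudo-counts

def betaln(a, b):
    # log Beta function, used for Beta-Bernoulli marginal likelihoods
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def log_marginal(X, y):
    """Log marginal likelihood of binary data (X: n x d, y: n) under a single
    Naive Bayes model with Beta(BETA, BETA) priors, parameters integrated out."""
    n1 = int(y.sum()); n0 = len(y) - n1
    lp = betaln(n1 + BETA, n0 + BETA) - betaln(BETA, BETA)      # class-prior term
    for c in (0, 1):
        Xc = X[y == c]
        ones = Xc.sum(axis=0); zeros = len(Xc) - ones
        for o, nz in zip(ones, zeros):                          # per-feature terms
            lp += betaln(o + BETA, nz + BETA) - betaln(BETA, BETA)
    return lp

def pooled(users, idx):
    # stack the (X, y) data of the listed users into one data set
    X = np.vstack([users[v][0] for v in idx])
    y = np.concatenate([users[v][1] for v in idx])
    return X, y

def gibbs_sweep(users, z):
    """One collapsed Gibbs pass over user-to-cluster assignments z (CRP prior)."""
    for u in range(len(users)):
        others = {}                       # cluster label -> members, excluding user u
        for v, k in enumerate(z):
            if v != u:
                others.setdefault(int(k), []).append(v)
        options, logp = list(others.items()), []
        for k, vs in options:             # probability of joining an existing cluster
            lc = log_marginal(*pooled(users, vs))
            lcu = log_marginal(*pooled(users, vs + [u]))
            logp.append(math.log(len(vs)) + lcu - lc)
        logp.append(math.log(ALPHA) + log_marginal(*users[u]))  # ...or opening a new one
        p = np.exp(np.array(logp) - max(logp))
        choice = rng.choice(len(p), p=p / p.sum())
        z[u] = options[choice][0] if choice < len(options) else max(z) + 1
    return z

def predict(users, z, u, x):
    """Posterior predictive P(y = 1 | x) for user u from its cluster's pooled counts."""
    X, y = pooled(users, [v for v, k in enumerate(z) if k == z[u]])
    scores = []
    for c in (0, 1):
        nc = int((y == c).sum())
        lp = math.log((nc + BETA) / (len(y) + 2 * BETA))
        ones = X[y == c].sum(axis=0)
        for j, xj in enumerate(x):
            theta = (ones[j] + BETA) / (nc + 2 * BETA)
            lp += math.log(theta if xj else 1.0 - theta)
        scores.append(lp)
    p = np.exp(np.array(scores) - max(scores))
    return p[1] / p.sum()

# toy usage: three users with 8 labeled examples of 4 binary features each
users = [(rng.integers(0, 2, (8, 4)), rng.integers(0, 2, 8)) for _ in range(3)]
z = np.arange(len(users))                 # start with every user in its own cluster
for _ in range(5):
    z = gibbs_sweep(users, z)
print("cluster assignments:", z)
print("P(accept | x) for user 0:", predict(users, z, 0, np.array([1, 0, 1, 1])))
```

Here the transfer effect comes entirely from the pooled counts: a user placed in a populated cluster inherits that cluster's statistics for prediction, while a genuinely atypical user can occupy its own cluster and fall back on its own data.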

Subjects: 12. Machine Learning and Discovery; 3.4 Probabilistic Reasoning

Submitted: Oct 16, 2006

