Graph-Driven Generative Models for Heterogeneous Multi-Task Learning

Authors

  • Wenlin Wang Duke University
  • Hongteng Xu Duke University
  • Zhe Gan Microsoft
  • Bai Li Duke University
  • Guoyin Wang Duke University
  • Liqun Chen Duke University
  • Qian Yang Duke University
  • Wenqi Wang Facebook
  • Lawrence Carin Duke University

DOI:

https://doi.org/10.1609/aaai.v34i01.5446

Abstract

We propose a novel graph-driven generative model that unifies multiple heterogeneous learning tasks within the same framework. The proposed model is based on the observation that heterogeneous learning tasks, which correspond to different generative processes, often rely on data with a shared graph structure. Accordingly, our model combines a graph convolutional network (GCN) with multiple variational autoencoders, embedding the nodes of the graph (i.e., the samples for the tasks) in a uniform manner while specializing their organization and usage to the different tasks. With a focus on healthcare applications (tasks), including clinical topic modeling, procedure recommendation, and admission-type prediction, we demonstrate that our method successfully leverages information across tasks, boosting performance on all of them and outperforming existing state-of-the-art approaches.
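The core architectural idea in the abstract — a shared graph-convolutional encoder whose node embeddings are reused by several task-specific heads — can be sketched as follows. This is a minimal illustrative sketch only, not the authors' implementation: the toy graph, feature sizes, and head names are assumptions, and the variational-autoencoder components are omitted for brevity.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(A_norm, X, W):
    """One graph-convolution layer: ReLU(A_norm @ X @ W)."""
    return np.maximum(A_norm @ X @ W, 0.0)

rng = np.random.default_rng(0)

# Toy graph of 4 nodes (e.g., clinical admissions) with 5 input features each.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 5))

# Shared GCN encoder: one set of weights produces embeddings used by all tasks.
W_shared = rng.normal(size=(5, 3))
Z = gcn_layer(normalize_adjacency(A), X, W_shared)  # shared node embeddings

# Hypothetical task-specific heads reuse the same embeddings Z.
W_task_a = rng.normal(size=(3, 2))  # e.g., admission-type logits (2 classes)
W_task_b = rng.normal(size=(3, 4))  # e.g., procedure-recommendation scores
logits_a = Z @ W_task_a
scores_b = Z @ W_task_b
```

The point of the sketch is that `Z` is computed once from the graph and then specialized per task by the heads, mirroring the paper's uniform embedding of graph nodes with task-specific usage.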

Published

2020-04-03

How to Cite

Wang, W., Xu, H., Gan, Z., Li, B., Wang, G., Chen, L., Yang, Q., Wang, W., & Carin, L. (2020). Graph-Driven Generative Models for Heterogeneous Multi-Task Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 34(01), 979-988. https://doi.org/10.1609/aaai.v34i01.5446

Section

AAAI Technical Track: Applications