Learning Multi-Task Communication with Message Passing for Sequence Learning

Authors

  • Pengfei Liu Fudan University
  • Jie Fu Polytechnique Montreal
  • Yue Dong McGill University
  • Xipeng Qiu Fudan University
  • Jackie Chi Kit Cheung McGill University

DOI:

https://doi.org/10.1609/aaai.v33i01.33014360

Abstract

We present two architectures for multi-task learning with neural sequence models. Our approach allows the relationships between different tasks to be learned dynamically, rather than relying on an ad-hoc, pre-defined structure as in previous work. We adopt the idea of message passing from graph neural networks and propose a general graph multi-task learning framework in which different tasks can communicate with each other in an effective and interpretable way. We conduct extensive experiments in text classification and sequence labelling to evaluate our approach on multi-task learning and transfer learning. The empirical results show that our models not only outperform competitive baselines but also learn interpretable and transferable patterns across tasks.
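To make the message-passing idea concrete, the sketch below shows one round of communication between task nodes: each task aggregates the states of the other tasks, weighted by a communication matrix, and applies a residual update. This is a minimal illustration of the general mechanism, not the paper's actual architecture; the function name, the uniform weights, and the tanh update are illustrative assumptions (in the paper, the communication weights would be learned).

```python
import math

def message_passing_step(task_states, weights):
    """One round of message passing between task nodes (illustrative sketch).

    task_states: list of per-task state vectors, one per task.
    weights[i][j]: how strongly task i listens to task j
                   (assumed learned; uniform here for simplicity).
    """
    n, d = len(task_states), len(task_states[0])
    updated = []
    for i in range(n):
        # Aggregate messages from all tasks, weighted by communication strength.
        msg = [sum(weights[i][j] * task_states[j][k] for j in range(n))
               for k in range(d)]
        # Residual update with a squashing non-linearity.
        updated.append([math.tanh(task_states[i][k] + msg[k]) for k in range(d)])
    return updated

# Toy example: three tasks with 2-dimensional states and uniform communication.
states = [[0.5, -0.2], [0.1, 0.3], [-0.4, 0.0]]
uniform = [[1.0 / 3.0] * 3 for _ in range(3)]
new_states = message_passing_step(states, uniform)
```

Stacking several such rounds lets information propagate between tasks that are not directly connected, which is what makes the learned communication patterns interpretable as a task-relationship graph.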

Published

2019-07-17

How to Cite

Liu, P., Fu, J., Dong, Y., Qiu, X., & Cheung, J. C. K. (2019). Learning Multi-Task Communication with Message Passing for Sequence Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4360-4367. https://doi.org/10.1609/aaai.v33i01.33014360

Section

AAAI Technical Track: Machine Learning