WAIS: Word Attention for Joint Intent Detection and Slot Filling

  • Sixuan Chen Hong Kong University of Science and Technology
  • Shuai Yu Fudan University

Abstract

Attention-based recurrent neural network models for joint intent detection and slot filling have achieved state-of-the-art performance. Most previous works exploited semantic-level information to compute the attention weights; however, few have taken the importance of word-level information into consideration. In this paper, we propose WAIS, word attention for joint intent detection and slot filling. Because intent detection and slot filling are strongly related, we further propose a fusion gate that integrates word-level and semantic-level information for jointly training the two tasks. Extensive experiments show that the proposed model robustly outperforms its competitors and sets a new state of the art.
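The abstract does not give the fusion gate's equations, but a common formulation of such a gate is a learned sigmoid that mixes two representations elementwise. The sketch below is an assumption-laden illustration of that generic pattern (the vectors `h_word`, `h_sem` and the weights `W`, `b` are hypothetical, not taken from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fusion_gate(h_word, h_sem, W, b):
    """Generic fusion-gate sketch (assumed form, not the paper's exact one):
    a sigmoid gate g decides, per dimension, how much of the word-level
    representation vs. the semantic-level representation to keep."""
    g = sigmoid(np.concatenate([h_word, h_sem]) @ W + b)
    return g * h_word + (1.0 - g) * h_sem

rng = np.random.default_rng(0)
d = 4
h_word = rng.standard_normal(d)        # hypothetical word-level features
h_sem = rng.standard_normal(d)         # hypothetical semantic-level features
W = rng.standard_normal((2 * d, d)) * 0.1
b = np.zeros(d)

fused = fusion_gate(h_word, h_sem, W, b)
print(fused.shape)
```

Because the gate values lie in (0, 1), each fused dimension is a convex combination of the two inputs, so the fused vector always stays between them elementwise.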

Published
2019-07-17
Section
Student Abstract Track