Learning to Embed Sentences Using Attentive Recursive Trees

Authors

  • Jiaxin Shi, Tsinghua University
  • Lei Hou, Tsinghua University
  • Juanzi Li, Tsinghua University
  • Zhiyuan Liu, Tsinghua University
  • Hanwang Zhang, Nanyang Technological University

DOI:

https://doi.org/10.1609/aaai.v33i01.33016991

Abstract

Sentence embedding is an effective feature representation for most deep learning-based NLP tasks. One prevailing line of methods is using recursive latent tree-structured networks to embed sentences with task-specific structures. However, existing models have no explicit mechanism to emphasize task-informative words in the tree structure. To this end, we propose an Attentive Recursive Tree model (AR-Tree), where the words are dynamically located according to their importance in the task. Specifically, we construct the latent tree for a sentence with a proposed important-first strategy, placing more attentive words nearer to the root; thus, AR-Tree can inherently emphasize important words during the bottom-up composition of the sentence embedding. We propose an end-to-end reinforced training strategy for AR-Tree, which is demonstrated to consistently outperform, or be at least comparable to, the state-of-the-art sentence embedding methods on three sentence understanding tasks.
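To make the important-first strategy concrete, the following is a minimal sketch of the tree construction the abstract describes: the highest-scoring word becomes the root, and the spans to its left and right are built recursively. All names here (`Node`, `build_ar_tree`, the `scores` list) are illustrative, not from the authors' code; in the actual model the scores are learned end-to-end and the embedding is composed bottom-up over the resulting tree, which is omitted here.

```python
# Hypothetical sketch of important-first latent tree construction.
# Assumes per-word attention scores are already available; the paper
# learns them end-to-end with reinforced training.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Node:
    word: str                       # word stored at this tree node
    left: Optional["Node"] = None   # subtree over the words to its left
    right: Optional["Node"] = None  # subtree over the words to its right


def build_ar_tree(words: List[str], scores: List[float]) -> Optional[Node]:
    """Place the highest-scoring word at the root, then recurse on the
    left and right spans, so more attentive words sit nearer the root."""
    if not words:
        return None
    root_idx = max(range(len(words)), key=lambda i: scores[i])
    return Node(
        word=words[root_idx],
        left=build_ar_tree(words[:root_idx], scores[:root_idx]),
        right=build_ar_tree(words[root_idx + 1:], scores[root_idx + 1:]),
    )


# Example with hand-picked scores: "loved" is most task-informative for
# sentiment, so it becomes the root of the latent tree.
tree = build_ar_tree(["I", "loved", "this", "movie"], [0.1, 0.9, 0.2, 0.4])
print(tree.word)  # -> "loved"
```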

Published

2019-07-17

How to Cite

Shi, J., Hou, L., Li, J., Liu, Z., & Zhang, H. (2019). Learning to Embed Sentences Using Attentive Recursive Trees. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 6991-6998. https://doi.org/10.1609/aaai.v33i01.33016991

Section

AAAI Technical Track: Natural Language Processing