Compressing Recurrent Neural Networks with Tensor Ring for Action Recognition

Authors

  • Yu Pan, University of Electronic Science and Technology of China
  • Jing Xu, University of Electronic Science and Technology of China
  • Maolin Wang, University of Electronic Science and Technology of China
  • Jinmian Ye, SMILE Lab
  • Fei Wang, Cornell University
  • Kun Bai, Tencent Inc.
  • Zenglin Xu, University of Electronic Science and Technology of China

DOI:

https://doi.org/10.1609/aaai.v33i01.33014683

Abstract

Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) networks and Gated Recurrent Unit (GRU) networks, have achieved promising performance in sequential data modeling. The hidden layers in RNNs can be regarded as memory units, which help store information from sequential contexts. However, when dealing with high-dimensional input data, such as video and text, the input-to-hidden linear transformation in RNNs incurs high memory usage and a huge computational cost, which makes training RNNs very difficult. To address this challenge, we propose a novel compact LSTM model, named TR-LSTM, which uses the low-rank tensor ring decomposition (TRD) to reformulate the input-to-hidden transformation. Compared with other tensor decomposition methods, TR-LSTM is more stable. In addition, TR-LSTM supports end-to-end training and provides a fundamental building block for RNNs that handle large input data. Experiments on real-world action recognition datasets demonstrate the promising performance of the proposed TR-LSTM compared with the tensor-train LSTM and other state-of-the-art competitors.
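
The core idea in the abstract, replacing the dense input-to-hidden weight matrix with a low-rank tensor ring factorization, can be illustrated with a short NumPy sketch. The code below is not the authors' implementation: the mode sizes, the TR-rank, and the helper names (tr_to_dense, tr_matvec) are illustrative assumptions. It builds random TR cores, applies the input-to-hidden transform directly from the cores without materializing the dense matrix, checks the result against a dense reconstruction, and prints the parameter savings that motivate the compression.

# A minimal sketch (not the paper's code) of a tensor-ring (TR) factorized
# input-to-hidden transform. Mode sizes and TR-rank are illustrative assumptions.
import numpy as np

in_modes, out_modes = [8, 8, 8], [4, 8, 8]   # input 512 = 8*8*8, hidden 256 = 4*8*8
rank = 5                                     # TR-rank (same on every bond for simplicity)

rng = np.random.default_rng(0)
# Core k has shape (rank, in_modes[k] * out_modes[k], rank); the ring is closed
# by a trace over the first and last rank indices.
cores = [rng.standard_normal((rank, i * o, rank)) * 0.1
         for i, o in zip(in_modes, out_modes)]

def tr_to_dense(cores, in_modes, out_modes):
    """Contract the ring back into a dense (prod(in), prod(out)) matrix (for checking only)."""
    merged = cores[0]
    for core in cores[1:]:
        merged = np.einsum('amb,bnc->amnc', merged, core)
        r1, m, n, r2 = merged.shape
        merged = merged.reshape(r1, m * n, r2)
    full = np.einsum('ama->m', merged)                      # close the ring with a trace
    d = len(in_modes)
    full = full.reshape([x for pair in zip(in_modes, out_modes) for x in pair])
    full = full.transpose(list(range(0, 2 * d, 2)) + list(range(1, 2 * d, 2)))
    return full.reshape(int(np.prod(in_modes)), int(np.prod(out_modes)))

def tr_matvec(cores, x):
    """Input-to-hidden transform computed core-by-core, never forming the dense matrix."""
    g = [c.reshape(rank, i, o, rank) for c, i, o in zip(cores, in_modes, out_modes)]
    return np.einsum('ijk,aipb,bjqc,ckra->pqr',
                     x.reshape(in_modes), g[0], g[1], g[2]).reshape(-1)

x = rng.standard_normal(int(np.prod(in_modes)))   # one high-dimensional input vector
W = tr_to_dense(cores, in_modes, out_modes)
assert np.allclose(tr_matvec(cores, x), x @ W)    # same transform, far fewer parameters

dense_params = W.size
tr_params = sum(c.size for c in cores)
print(f"dense: {dense_params} params, TR cores: {tr_params} params "
      f"({dense_params / tr_params:.1f}x compression)")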

Published

2019-07-17

How to Cite

Pan, Y., Xu, J., Wang, M., Ye, J., Wang, F., Bai, K., & Xu, Z. (2019). Compressing Recurrent Neural Networks with Tensor Ring for Action Recognition. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4683-4690. https://doi.org/10.1609/aaai.v33i01.33014683

Issue

Vol. 33 No. 01 (2019)

Section

AAAI Technical Track: Machine Learning