Regularizing Fully Convolutional Networks for Time Series Classification by Decorrelating Filters

Authors

  • Kaushal Paneri, Northeastern University
  • Vishnu TV, TCS Research
  • Pankaj Malhotra, TCS Research
  • Lovekesh Vig, TCS Research
  • Gautam Shroff, TCS Research

DOI

https://doi.org/10.1609/aaai.v33i01.330110003

Abstract

Deep neural networks are prone to overfitting, especially in small training data regimes. These networks are often overparameterized, and the resulting learned weights tend to be strongly correlated. However, convolutional networks in general, and fully convolutional networks (FCNs) in particular, have been shown to be relatively parameter-efficient, and have recently been successfully applied to time series classification tasks. In this paper, we investigate the correlation between the convolutional filters learned in FCNs that use Batch Normalization (BN) as a regularizer, for time series classification (TSC) tasks. Our results demonstrate that, despite orthogonal initialization of the filters, the average correlation across filters (especially in higher layers) tends to increase as training proceeds, indicating redundant filters. To mitigate this redundancy, we propose a strong regularizer that uses simple yet effective filter decorrelation. Our proposed method yields significant gains in classification accuracy on 44 diverse time series datasets from the UCR TSC benchmark repository.
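One way to realize such a filter-decorrelation penalty is to flatten a layer's filters, normalize them, and penalize their off-diagonal pairwise correlations. The sketch below, written against PyTorch's nn.Conv1d, is only illustrative: the function name, the epsilon constant, the squared off-diagonal form of the penalty, and the scaling weight lam are assumptions for this example, not the paper's exact formulation.

    import torch
    import torch.nn as nn

    def filter_decorrelation_penalty(conv: nn.Conv1d) -> torch.Tensor:
        """Average squared off-diagonal correlation between a layer's filters."""
        # Flatten each filter: (out_channels, in_channels, kernel_size) -> (out_channels, -1)
        w = conv.weight.view(conv.weight.size(0), -1)
        # Center and scale each filter to unit norm so the Gram matrix holds correlations
        w = w - w.mean(dim=1, keepdim=True)
        w = w / (w.norm(dim=1, keepdim=True) + 1e-8)
        # Pairwise correlation matrix; the diagonal is 1 by construction
        corr = w @ w.t()
        off_diag = corr - torch.eye(corr.size(0), device=corr.device)
        return (off_diag ** 2).mean()

    # Usage sketch: add the penalty, weighted by a hypothetical coefficient `lam`,
    # to the classification loss over all Conv1d layers of a model.
    # loss = criterion(logits, labels) + lam * sum(
    #     filter_decorrelation_penalty(m) for m in model.modules()
    #     if isinstance(m, nn.Conv1d))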

Published

2019-07-17

How to Cite

Paneri, K., TV, V., Malhotra, P., Vig, L., & Shroff, G. (2019). Regularizing Fully Convolutional Networks for Time Series Classification by Decorrelating Filters. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 10003-10004. https://doi.org/10.1609/aaai.v33i01.330110003

Section

Student Abstract Track