Guided Dropout

Authors

  • Rohit Keshari, Indian Institute of Technology Delhi
  • Richa Singh, Indian Institute of Technology Delhi
  • Mayank Vatsa, Indian Institute of Technology Delhi

DOI:

https://doi.org/10.1609/aaai.v33i01.33014065

Abstract

Dropout is often used in deep neural networks to prevent over-fitting. Conventionally, dropout training randomly drops nodes from the hidden layers of a neural network. Our hypothesis is that a guided selection of nodes for intelligent dropout can lead to better generalization than traditional dropout. In this research, we propose "guided dropout" for training deep neural networks, which drops nodes by measuring the strength of each node. We also demonstrate that conventional dropout is a specific case of the proposed guided dropout. Experimental evaluation on multiple datasets, including MNIST, CIFAR10, CIFAR100, SVHN, and Tiny ImageNet, demonstrates the efficacy of the proposed guided dropout.
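To make the idea concrete, the sketch below illustrates one plausible reading of the abstract: a dropout layer that selects which nodes to drop using a per-node "strength" score rather than uniform random sampling. This is a minimal illustration, not the authors' exact formulation; the GuidedDropout class, the learnable strength parameter, the drop_rate value, and the top-k selection rule are all assumptions made here for demonstration.

    import torch
    import torch.nn as nn

    class GuidedDropout(nn.Module):
        """Illustrative strength-guided dropout (assumed formulation, not the paper's exact method)."""

        def __init__(self, num_features, drop_rate=0.2):
            super().__init__()
            self.drop_rate = drop_rate
            # Assumed: a per-node strength parameter learned jointly with the network.
            self.strength = nn.Parameter(torch.ones(num_features))

        def forward(self, x):
            # At inference (or with drop_rate = 0) no nodes are dropped;
            # strength simply acts as a per-node scale.
            if not self.training or self.drop_rate <= 0.0:
                return x * self.strength
            k = int(self.drop_rate * x.size(1))
            if k == 0:
                return x * self.strength
            # Guided selection (an assumption): drop the k strongest nodes
            # instead of choosing nodes uniformly at random.
            _, drop_idx = torch.topk(self.strength.abs(), k)
            mask = torch.ones_like(self.strength)
            mask[drop_idx] = 0.0
            return x * self.strength * mask

    # Usage: a small MLP with guided dropout on the hidden layer.
    model = nn.Sequential(
        nn.Linear(784, 256), nn.ReLU(),
        GuidedDropout(256),
        nn.Linear(256, 10),
    )
    logits = model(torch.randn(32, 784))

In this sketch, replacing the strength-based selection with a uniformly random choice of dropped indices recovers ordinary dropout, which is consistent with the abstract's statement that conventional dropout is a special case of guided dropout.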

Published

2019-07-17

How to Cite

Keshari, R., Singh, R., & Vatsa, M. (2019). Guided Dropout. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4065-4072. https://doi.org/10.1609/aaai.v33i01.33014065

Issue

Vol. 33 No. 01 (2019)

Section

AAAI Technical Track: Machine Learning