Weighted Channel Dropout for Regularization of Deep Convolutional Neural Network

Authors

  • Saihui Hou University of Science and Technology of China
  • Zilei Wang University of Science and Technology of China

DOI:

https://doi.org/10.1609/aaai.v33i01.33018425

Abstract

In this work, we propose a novel method named Weighted Channel Dropout (WCD) for the regularization of deep Convolutional Neural Networks (CNNs). Unlike Dropout, which randomly sets neurons to zero in the fully-connected layers, WCD operates on the channels in the stack of convolutional layers. Specifically, WCD consists of two steps, i.e., Rating Channels and Selecting Channels, built from three modules, i.e., Global Average Pooling, Weighted Random Selection, and a Random Number Generator. It filters the channels according to their activation status and can be plugged in between any two consecutive layers, which unifies the original Dropout and Channel-Wise Dropout. WCD is completely parameter-free and is deployed only in the training phase, at very slight computational cost. The network at test time remains unchanged, so no inference cost is added. Moreover, when combined with existing networks, WCD requires no re-pretraining on ImageNet and is thus well-suited to applications on small datasets. Finally, WCD is experimentally evaluated with VGGNet-16, ResNet-101, and Inception-V3 on multiple datasets, and the extensive results demonstrate that it brings consistent improvements over the baselines.
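The abstract names the building blocks but not the exact algorithm, so the PyTorch sketch below is an illustrative reconstruction rather than the authors' precise formulation: channels are rated with Global Average Pooling, and a Weighted Random Selection keyed by those scores (with uniform draws as the Random Number Generator) decides which channels survive during training. The module name `WeightedChannelDropout`, the `keep_ratio` parameter, the Efraimidis-Spirakis keying `u**(1/score)`, and the final rescaling are all assumptions introduced here for illustration.

```python
import torch
import torch.nn as nn


class WeightedChannelDropout(nn.Module):
    """Illustrative sketch of Weighted Channel Dropout (WCD).

    Step 1 (Rating Channels): score each channel with Global Average
    Pooling. Step 2 (Selecting Channels): keep channels via Weighted
    Random Selection keyed by those scores, using uniform random draws
    as the Random Number Generator. Active only during training; at
    test time the input passes through unchanged.
    """

    def __init__(self, keep_ratio: float = 0.8):
        super().__init__()
        self.keep_ratio = keep_ratio  # assumed hyperparameter, not from the paper

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            return x  # test-phase network remains unchanged
        n, c, _, _ = x.shape
        # Rating Channels: GAP yields one score per channel; clamp keeps
        # the WRS weights strictly positive.
        scores = x.mean(dim=(2, 3)).clamp(min=1e-8)  # shape (n, c)
        # Weighted Random Selection (Efraimidis-Spirakis): key_i = u_i^(1/score_i),
        # so channels with larger activations are more likely to rank in the top-k.
        u = torch.rand_like(scores)
        keys = u.pow(1.0 / scores)
        k = max(1, int(self.keep_ratio * c))
        kept = keys.topk(k, dim=1).indices
        mask = torch.zeros_like(scores)
        mask.scatter_(1, kept, 1.0)
        # Rescale (an assumption, mirroring standard Dropout) so the expected
        # channel magnitude matches the unchanged test-time network.
        return x * mask[:, :, None, None] / self.keep_ratio
```

In use, such a module would sit between two consecutive convolutional layers, e.g. `nn.Sequential(conv1, nn.ReLU(), WeightedChannelDropout(0.8), conv2)`; since `forward` is an identity in evaluation mode, the inference cost is unaffected, consistent with the abstract's claim.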

Published

2019-07-17

How to Cite

Hou, S., & Wang, Z. (2019). Weighted Channel Dropout for Regularization of Deep Convolutional Neural Network. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 8425-8432. https://doi.org/10.1609/aaai.v33i01.33018425

Issue

Vol. 33 No. 01 (2019)

Section

AAAI Technical Track: Vision