Calibrated Stochastic Gradient Descent for Convolutional Neural Networks

Authors

  • Li’an Zhuo, Beihang University
  • Baochang Zhang, Beihang University
  • Chen Chen, University of North Carolina at Charlotte
  • Qixiang Ye, University of Chinese Academy of Sciences
  • Jianzhuang Liu, Huawei Technologies Company, Ltd.
  • David Doermann, State University of New York at Buffalo

DOI:

https://doi.org/10.1609/aaai.v33i01.33019348

Abstract

In stochastic gradient descent (SGD) and its variants, the optimized gradient estimators may be as expensive to compute as the true gradient in many scenarios. This paper introduces a calibrated stochastic gradient descent (CSGD) algorithm for deep neural network optimization. A theorem is developed to prove that an unbiased estimator for the network variables can be obtained in a probabilistic way based on the Lipschitz hypothesis. Our work differs significantly from existing gradient optimization methods by providing a theoretical framework for unbiased variable estimation in the deep learning paradigm to optimize the model parameter calculation. In particular, we develop a generic gradient calibration layer which can be easily used to build convolutional neural networks (CNNs). Experimental results demonstrate that CNNs with our CSGD optimization scheme can improve the state-of-the-art performance for natural image classification, digit recognition, ImageNet object classification, and object detection tasks. This work opens new research directions for developing more efficient SGD updates and analyzing the backpropagation algorithm.
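To make the idea of inserting a calibration step into the SGD update concrete, the following is a minimal PyTorch-style sketch of an optimizer with a pluggable gradient-calibration hook. It is not the paper's CSGD algorithm: the calibrate() function is a hypothetical placeholder (here an identity map, i.e., plain SGD), and the paper's actual gradient calibration layer and its probabilistic unbiased-estimation scheme are defined only in the full text.

import torch
from torch.optim import Optimizer


def calibrate(grad: torch.Tensor) -> torch.Tensor:
    """Hypothetical placeholder for a gradient-calibration step.

    The identity map below reduces the optimizer to plain SGD; the paper's
    calibration layer would replace this function.
    """
    return grad


class CalibratedSGDSketch(Optimizer):
    """Generic scaffold: apply a calibration hook to each gradient, then
    take a standard gradient-descent step."""

    def __init__(self, params, lr=0.1):
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()
    def step(self, closure=None):
        loss = closure() if closure is not None else None
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                # Calibrate the raw gradient, then update the parameter.
                p.add_(calibrate(p.grad), alpha=-group["lr"])
        return loss

Used in place of torch.optim.SGD in a training loop, this scaffold only illustrates where a calibration layer would intervene in the update; the form of the calibration itself must be taken from the paper.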

Published

2019-07-17

How to Cite

Zhuo, L., Zhang, B., Chen, C., Ye, Q., Liu, J., & Doermann, D. (2019). Calibrated Stochastic Gradient Descent for Convolutional Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 9348-9355. https://doi.org/10.1609/aaai.v33i01.33019348

Section

AAAI Technical Track: Vision