Multi-Fidelity Automatic Hyper-Parameter Tuning via Transfer Series Expansion

  • Yi-Qi Hu Nanjing University
  • Yang Yu Nanjing University
  • Wei-Wei Tu 4Paradigm Inc.
  • Qiang Yang Hong Kong University of Science and Technology
  • Yuqiang Chen 4Paradigm Inc.
  • Wenyuan Dai 4Paradigm Inc.

Abstract

Automatic machine learning (AutoML) aims at automatically choosing the best configuration for machine learning tasks. However, a configuration evaluation can be very time-consuming, particularly on learning tasks with large datasets. This limitation usually prevents derivative-free optimization from releasing its full power for a fine configuration search using many evaluations. To alleviate this limitation, in this paper, we propose a derivative-free optimization framework for AutoML using multi-fidelity evaluations. It uses many low-fidelity evaluations on small data subsets and very few high-fidelity evaluations on the full dataset. However, the low-fidelity evaluations can be badly biased and need to be corrected at only a very low cost. We thus propose the Transfer Series Expansion (TSE), which learns the low-fidelity correction predictor efficiently by linearly combining a set of base predictors. The base predictors can be obtained cheaply from down-scaled and experienced tasks. Experimental results on real-world AutoML problems verify that the proposed framework can accelerate derivative-free configuration search significantly by making use of the multi-fidelity evaluations.
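The core of TSE as described above is a correction predictor built as a linear combination of base predictors from cheaper, experienced tasks. The following is a minimal illustrative sketch of that idea, not the paper's actual algorithm: it assumes the base predictors are given as callables and fits the combination weights by ordinary least squares on the few configurations that also received high-fidelity evaluations (the function names and the least-squares fitting choice are assumptions for illustration).

```python
import numpy as np

def fit_tse_weights(base_predictors, configs, high_fidelity_scores):
    """Fit linear-combination weights for TSE-style correction.

    Illustrative sketch: each base predictor maps a configuration to a
    predicted score; weights are fit by least squares against the few
    expensive high-fidelity evaluations.
    """
    # Design matrix: one row per configuration, one column per base predictor.
    Phi = np.array([[p(c) for p in base_predictors] for c in configs])
    y = np.array(high_fidelity_scores)
    # Least-squares solve for the combination weights (an assumed choice).
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return weights

def tse_predict(base_predictors, weights, config):
    """Predict high-fidelity performance for a new configuration."""
    return float(np.dot(weights, [p(config) for p in base_predictors]))
```

With such a predictor in hand, the many cheap low-fidelity evaluations can be corrected toward the high-fidelity scale before being consumed by the derivative-free optimizer.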

Published
2019-07-17
Section
AAAI Technical Track: Machine Learning