Frugal Optimization for Cost-related Hyperparameters

4 May 2020 · Qingyun Wu, Chi Wang, Silu Huang

The increasing demand for democratizing machine learning algorithms calls for hyperparameter optimization (HPO) solutions at low cost. Many machine learning algorithms have hyperparameters that can cause a large variation in training cost, but this effect is largely ignored by existing HPO methods, which are incapable of properly controlling cost during the optimization process. To address this problem, we develop a new cost-frugal HPO solution. The core of our solution is a simple but new randomized direct-search method, for which we prove a convergence rate of $O(\frac{\sqrt{d}}{\sqrt{K}})$ and an $O(d\epsilon^{-2})$-approximation guarantee on the total cost. We provide strong empirical results in comparison with state-of-the-art HPO methods on large AutoML benchmarks.
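To make the abstract's core idea concrete, here is a minimal sketch of generic randomized direct search, assuming a standard formulation: sample a random unit direction, probe both ways from the current point, move on improvement, and shrink the step size otherwise. The function name `randomized_direct_search` and the parameters `step` and `shrink` are illustrative choices, not the paper's API, and this sketch omits the paper's cost-control mechanism.

```python
import numpy as np

def randomized_direct_search(f, x0, step=1.0, shrink=0.5, max_iters=100, seed=0):
    """Minimal sketch of randomized direct search (illustrative, not the
    paper's exact method): probe a random direction and its opposite,
    move on improvement, otherwise contract the step size."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iters):
        u = rng.standard_normal(x.shape)
        u /= np.linalg.norm(u)              # random unit direction
        for candidate in (x + step * u, x - step * u):
            fc = f(candidate)
            if fc < fx:                     # accept the first improving probe
                x, fx = candidate, fc
                break
        else:
            step *= shrink                  # no improvement: shrink the step
    return x, fx

# Toy usage: minimize a cheap quadratic standing in for an HPO objective.
x_best, f_best = randomized_direct_search(lambda z: float(np.sum(z ** 2)),
                                          x0=[2.0, -1.5])
print(x_best, f_best)
```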
