Random Search and Reproducibility for Neural Architecture Search

20 Feb 2019 · Liam Li, Ameet Talwalkar

Neural architecture search (NAS) is a promising research direction that has the potential to replace expert-designed networks with learned, task-specific architectures. In this work, in order to help ground the empirical results in this field, we propose new NAS baselines that build on the following observations: (i) NAS is a specialized hyperparameter optimization problem; and (ii) random search is a competitive baseline for hyperparameter optimization...
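The abstract's two observations suggest a simple baseline: treat the architecture as just another hyperparameter and search it at random. A minimal sketch of that baseline over a toy discrete cell space — the operation names, edge count, and the proxy `evaluate` function below are illustrative assumptions, not the paper's actual setup:

```python
import random

# Hypothetical cell search space: each of 4 edges picks one operation.
# (Illustrative op names; a real space would come from a benchmark
# such as NAS-Bench-201.)
OPS = ["none", "skip_connect", "conv_3x3", "conv_1x1", "avg_pool_3x3"]
NUM_EDGES = 4

def sample_architecture(rng):
    """Draw an architecture uniformly at random from the space."""
    return tuple(rng.choice(OPS) for _ in range(NUM_EDGES))

def evaluate(arch, rng):
    """Stand-in for training and validating an architecture.

    A real NAS run would train `arch` (or query a tabular benchmark)
    and return validation accuracy; here we return a noisy toy score
    so the example is runnable.
    """
    score = sum(1.0 for op in arch if "conv" in op)
    return score + rng.random() * 0.1

def random_search(num_samples, seed=0):
    """Random search baseline: sample architectures, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_samples):
        arch = sample_architecture(rng)
        score = evaluate(arch, rng)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best_arch, best_score = random_search(num_samples=50)
print(best_arch, round(best_score, 3))
```

The point of the sketch is that the search strategy itself is trivial; all of the cost and complexity lives in `evaluate`, which is why the paper's search-time numbers below are dominated by architecture evaluation rather than the search logic.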

TASK                         DATASET                          MODEL  METRIC           VALUE  GLOBAL RANK
Neural Architecture Search   NAS-Bench-201, CIFAR-10          RSPS   Accuracy (Test)  87.66  #6
                                                                     Accuracy (Val)   84.16  #6
                                                                     Search time (s)  7587   #5
Neural Architecture Search   NAS-Bench-201, CIFAR-100         RSPS   Accuracy (Test)  58.33  #7
                                                                     Accuracy (Val)   59.00  #6
                                                                     Search time (s)  7587   #5
Neural Architecture Search   NAS-Bench-201, ImageNet-16-120   RSPS   Accuracy (Test)  31.14  #14
                                                                     Accuracy (Val)   31.56  #12
                                                                     Search time (s)  7587   #7

Methods used in the Paper

METHOD          TYPE
Random Search   Hyperparameter Search