NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search

In this paper, we propose an efficient NAS algorithm for generating task-specific models that are competitive under multiple competing objectives. It comprises two surrogates: one at the architecture level, to improve sample efficiency, and one at the weights level, through a supernet, to improve gradient-descent training efficiency. On standard benchmark datasets (CIFAR-10, CIFAR-100, ImageNet), the resulting models, dubbed NSGANetV2, match or outperform models from existing approaches while the search is orders of magnitude more sample-efficient. Furthermore, we demonstrate the effectiveness and versatility of the proposed method on six diverse non-standard datasets, e.g., STL-10, Flowers102, Oxford Pets, and FGVC Aircraft. In all cases, NSGANetV2 improves the state-of-the-art (under the mobile setting), suggesting that NAS can be a viable alternative to conventional transfer learning in handling diverse scenarios such as small-scale or fine-grained datasets. Code is available at https://github.com/mikelzc1990/nsganetv2
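To make the two-surrogate idea concrete, below is a minimal, self-contained sketch of a surrogate-assisted multi-objective loop of the kind the abstract describes: fit a cheap accuracy predictor on a handful of truly evaluated architectures, use it together with nondominated sorting over (predicted error, MACs) to screen many candidates, and only pay the expensive evaluation for a small batch near the predicted Pareto front. Everything here is an illustrative assumption, not the paper's actual API: the integer encoding, the toy macs_proxy cost model, the synthetic true_accuracy stand-in (which in the paper would be accuracy measured with weights inherited from the supernet), and the simple linear surrogate (the paper adaptively selects among richer surrogate models).

import numpy as np

rng = np.random.default_rng(0)

ENC_LEN = 8  # toy encoding: 8 integer choices in {0, 1, 2}

def sample_archs(n):
    # Random integer encodings standing in for architecture genotypes.
    return rng.integers(0, 3, size=(n, ENC_LEN))

def macs_proxy(archs):
    # Cheap objective: larger choices cost more MACs (toy cost model).
    return 100 + 50 * archs.sum(axis=-1)

def true_accuracy(archs):
    # Stand-in for the expensive objective; in the paper this would be
    # accuracy measured with weights inherited from the supernet.
    s = archs.sum(axis=-1)
    return 70.0 + 1.2 * s - 0.02 * s**2 + rng.normal(0.0, 0.3, size=s.shape)

def fit_surrogate(X, y):
    # Linear least-squares predictor acc ~ [X, 1] @ w; a deliberately
    # simple surrogate to keep the sketch short.
    aug = lambda Z: np.hstack([Z, np.ones((len(Z), 1))])
    w, *_ = np.linalg.lstsq(aug(X), y, rcond=None)
    return lambda Z: aug(Z) @ w

def nondominated(err, macs):
    # Indices of points not dominated in (error, MACs); both minimized.
    keep = []
    for i in range(len(err)):
        better_eq = (err <= err[i]) & (macs <= macs[i])
        strictly = (err < err[i]) | (macs < macs[i])
        if not np.any(better_eq & strictly):
            keep.append(i)
    return keep

# Surrogate-assisted loop: evaluate a few architectures "for real", fit the
# accuracy predictor, then screen many candidates cheaply and evaluate only
# a small batch near the predicted Pareto front.
X = sample_archs(16)
y = true_accuracy(X)
for _ in range(3):
    predict = fit_surrogate(X, y)
    cand = sample_archs(500)
    front = nondominated(100.0 - predict(cand), macs_proxy(cand))
    batch = cand[front[:4]]  # small batch to evaluate for real
    X = np.vstack([X, batch])
    y = np.concatenate([y, true_accuracy(batch)])

best = int(np.argmax(y))
print("best arch:", X[best], "acc ~ %.2f" % y[best],
      "MACs proxy:", int(macs_proxy(X[best:best + 1])[0]))

The design point this sketch illustrates: the expensive objective is queried only for the small batches selected each iteration, while the surrogate absorbs the cost of ranking hundreds of candidates, which is where the reported sample-efficiency gains come from.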

Task                        Dataset   Model         Metric                  Value   Global Rank
Neural Architecture Search  ImageNet  NSGANetV2-xl  Top-1 Error Rate (%)    19.6    #17
                                                    Accuracy (%)            80.4    #13
                                                    Params                  8.7M    #10
                                                    MACs                    593M    #127
Neural Architecture Search  ImageNet  NSGANetV2-l   Top-1 Error Rate (%)    20.9    #35
                                                    Accuracy (%)            79.1    #27
                                                    Params                  8.0M    #12
                                                    MACs                    400M    #112
Neural Architecture Search  ImageNet  NSGANetV2-m   Top-1 Error Rate (%)    21.7    #49
                                                    Accuracy (%)            78.3    #38
                                                    Params                  7.7M    #13
                                                    MACs                    312M    #90
Neural Architecture Search  ImageNet  NSGANetV2-s   Top-1 Error Rate (%)    22.6    #64
                                                    Accuracy (%)            77.4    #52
                                                    Params                  6.1M    #22
                                                    MACs                    225M    #75
Image Classification        STL-10    NSGANetV2     Percentage correct (%)  92.0    #32
