AlphaNet: Improved Training of Supernets with Alpha-Divergence

16 Feb 2021 · Dilin Wang, Chengyue Gong, Meng Li, Qiang Liu, Vikas Chandra

Weight-sharing neural architecture search (NAS) is an effective technique for automating efficient neural architecture design. Weight-sharing NAS builds a supernet that assembles all the architectures as its sub-networks and jointly trains the supernet with the sub-networks. The success of weight-sharing NAS heavily relies on distilling the knowledge of the supernet to the sub-networks. However, we find that the widely used distillation divergence, i.e., KL divergence, may cause student sub-networks to over-estimate or under-estimate the uncertainty of the teacher supernet, resulting in inferior sub-network performance. In this work, we propose to improve supernet training with a more generalized alpha-divergence. By adaptively selecting the alpha-divergence, we simultaneously prevent both over-estimation and under-estimation of the teacher model's uncertainty. We apply the proposed alpha-divergence-based supernet training to both slimmable neural networks and weight-sharing NAS, and demonstrate significant improvements. Specifically, our discovered model family, AlphaNet, outperforms prior-art models, including BigNAS, Once-for-All networks, and AttentiveNAS, across a wide range of FLOPs regimes. We achieve ImageNet top-1 accuracy of 80.0% with only 444M FLOPs. Our code and pretrained models are available at https://github.com/facebookresearch/AlphaNet.
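The adaptive alpha-divergence objective described above can be sketched in a few lines of PyTorch. This is a minimal illustration, not the authors' released implementation: the function names (`alpha_divergence`, `adaptive_alpha_kd_loss`) and the defaults α− = −1, α+ = 1 are assumptions for exposition, and the paper additionally clamps the likelihood ratios for gradient stability, which this sketch omits. The idea is that a negative α penalizes the student for under-estimating the teacher's uncertainty, a large positive α penalizes over-estimation, and taking the maximum adaptively targets whichever failure mode is currently worse.

```python
import torch
import torch.nn.functional as F

def alpha_divergence(p, q, alpha, eps=1e-8):
    """Alpha-divergence D_alpha(p || q) between probability tensors.

    D_alpha(p || q) = (sum_i p_i^alpha * q_i^(1 - alpha) - 1) / (alpha * (alpha - 1)).
    It recovers reverse KL(q || p) as alpha -> 0 and forward KL(p || q) as alpha -> 1.
    """
    p = p.clamp_min(eps)
    q = q.clamp_min(eps)
    if abs(alpha) < 1e-4:         # limiting case alpha -> 0: KL(q || p)
        return (q * (q / p).log()).sum(dim=-1)
    if abs(alpha - 1.0) < 1e-4:   # limiting case alpha -> 1: KL(p || q)
        return (p * (p / q).log()).sum(dim=-1)
    integral = (p.pow(alpha) * q.pow(1.0 - alpha)).sum(dim=-1)
    return (integral - 1.0) / (alpha * (alpha - 1.0))

def adaptive_alpha_kd_loss(student_logits, teacher_logits,
                           alpha_minus=-1.0, alpha_plus=1.0):
    """Adaptive alpha-divergence distillation loss (sketch).

    The max over {alpha_minus, alpha_plus} penalizes whichever of
    under- or over-estimation of teacher uncertainty is larger.
    """
    p = F.softmax(teacher_logits.detach(), dim=-1)  # teacher (supernet), no grad
    q = F.softmax(student_logits, dim=-1)           # student (sub-network)
    d = torch.maximum(alpha_divergence(p, q, alpha_minus),
                      alpha_divergence(p, q, alpha_plus))
    return d.mean()
```

In supernet training, this loss would replace the standard KL distillation term: each sampled sub-network is trained against the full supernet's soft predictions using `adaptive_alpha_kd_loss` instead of `F.kl_div`.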

Results

Image Classification on ImageNet

| Model       | Top-1 Accuracy (Rank) | GFLOPs (Rank) |
|-------------|-----------------------|---------------|
| AlphaNet-A0 | 77.8% (# 795)         | 0.203 (# 13)  |
| AlphaNet-A1 | 78.9% (# 736)         | 0.279 (# 24)  |
| AlphaNet-A2 | 79.1% (# 714)         | 0.317 (# 31)  |
| AlphaNet-A3 | 79.4% (# 695)         | 0.357 (# 36)  |
| AlphaNet-A4 | 80.0% (# 664)         | 0.444 (# 51)  |
| AlphaNet-A5 | 80.3% (# 649)         | 0.491 (# 53)  |
| AlphaNet-A6 | 80.8% (# 623)         | 0.709 (# 88)  |

Neural Architecture Search on ImageNet

| Model               | Top-1 Error Rate (Rank) | Accuracy (Rank) | FLOPs (Rank)  |
|---------------------|-------------------------|-----------------|---------------|
| AlphaNet-A0         | 22.1 (# 54)             | 77.9 (# 43)     | 203M (# 110)  |
| AlphaNet-A1         | 21.0 (# 37)             | 79.0 (# 29)     | 279M (# 113)  |
| AlphaNet-A2         | 20.8 (# 34)             | 79.2 (# 26)     | 317M (# 115)  |
| AlphaNet-A3         | 20.6 (# 33)             | 79.4 (# 25)     | 357M (# 117)  |
| AlphaNet-A4         | 20.0 (# 23)             | 80.0 (# 18)     | 444M (# 121)  |
| AlphaNet-A5 (small) | 19.7 (# 19)             | 80.3 (# 15)     | 491M (# 124)  |
| AlphaNet-A5 (base)  | 19.4 (# 14)             | 80.6 (# 10)     | 596M (# 129)  |
| AlphaNet-A6         | 19.2 (# 12)             | 80.8 (# 9)      | 709M (# 132)  |
