MSR-DARTS: Minimum Stable Rank of Differentiable Architecture Search

19 Sep 2020  ·  Kengo Machida, Kuniaki Uto, Koichi Shinoda, Taiji Suzuki

In neural architecture search (NAS), differentiable architecture search (DARTS) has recently attracted much attention due to its high efficiency. It defines an over-parameterized network with mixed edges, each of which represents every candidate operator, and jointly optimizes the network weights and the architecture in an alternating manner. However, this procedure favors the operator whose weights converge faster than the others, and such fast convergence often coincides with overfitting, so the resulting model does not always generalize well. To overcome this problem, we propose minimum stable rank DARTS (MSR-DARTS), a method that searches for a model with good generalization by replacing the architecture-optimization step with selection under a minimum stable rank criterion. Specifically, each convolution operator is represented as a matrix, and MSR-DARTS selects the operator whose matrix has the smallest stable rank, i.e., the ratio of its squared Frobenius norm to its squared spectral norm. We evaluated MSR-DARTS on the CIFAR-10 and ImageNet datasets. It achieves an error rate of 2.54% with 4.0M parameters within 0.3 GPU-days on CIFAR-10, and a top-1 error rate of 23.9% on ImageNet. The official code is available at https://github.com/mtaecchhi/msrdarts.git.
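The selection criterion itself is simple to compute: the stable rank of a matrix A is srank(A) = ||A||_F^2 / ||A||_2^2. The snippet below is a minimal PyTorch sketch of scoring candidate convolution kernels this way; it is not code from the official repository, and the function names and the convention of flattening a 4-D kernel along its output-channel axis are illustrative assumptions.

```python
import torch

def stable_rank(weight: torch.Tensor) -> float:
    """Stable rank srank(A) = ||A||_F^2 / ||A||_2^2 of a conv kernel.

    The 4-D kernel (out_channels, in_channels, kH, kW) is flattened into a
    2-D matrix of shape (out_channels, in_channels * kH * kW); this
    flattening convention is an assumption, not taken from the paper.
    """
    mat = weight.reshape(weight.shape[0], -1)
    fro_sq = torch.linalg.matrix_norm(mat, ord="fro") ** 2  # sum of squared singular values
    spec_sq = torch.linalg.matrix_norm(mat, ord=2) ** 2     # largest squared singular value
    return (fro_sq / spec_sq).item()

def select_min_stable_rank(candidates):
    """Return the index of the candidate kernel with the smallest stable rank."""
    ranks = [stable_rank(w) for w in candidates]
    return min(range(len(ranks)), key=ranks.__getitem__)

# Toy usage: score three random 3x3 kernels (16 -> 32 channels) for one edge.
kernels = [torch.randn(32, 16, 3, 3) for _ in range(3)]
print("selected operator:", select_min_stable_rank(kernels))
```

Since the squared Frobenius norm sums all squared singular values while the squared spectral norm is only the largest one, a small stable rank means the kernel's energy is concentrated in a few directions.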

Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Neural Architecture Search | CIFAR-10 | MSR-DARTS | Top-1 Error Rate | 2.54% | #24 |
| Neural Architecture Search | CIFAR-10 | MSR-DARTS | Search Time (GPU days) | 0.3 | #11 |
| Neural Architecture Search | CIFAR-10 | MSR-DARTS | Parameters | 4.0M | #33 |
| Neural Architecture Search | ImageNet | MSR-DARTS (CIFAR-10) | Top-1 Error Rate | 23.9% | #91 |
| Neural Architecture Search | ImageNet | MSR-DARTS (CIFAR-10) | Accuracy | 76.1% | #74 |
| Neural Architecture Search | ImageNet | MSR-DARTS (CIFAR-10) | Params | 5.6M | #27 |
| Neural Architecture Search | ImageNet | MSR-DARTS (CIFAR-10) | MACs | 632M | #133 |
