Multi-conditioned Graph Diffusion for Neural Architecture Search

Neural architecture search (NAS) automates the design of neural network architectures, usually by exploring a large and therefore complex architecture search space. To advance the search, we present a graph diffusion-based NAS approach that uses discrete conditional graph diffusion processes to generate high-performing neural network architectures. We then propose a multi-conditioned classifier-free guidance approach, applied to graph diffusion networks, to jointly impose constraints such as high accuracy and low hardware latency. Unlike related work, our method is fully differentiable and requires only a single model training. In our evaluations, we show promising results on six standard benchmarks, yielding novel and unique architectures at high speed, i.e. in less than 0.2 seconds per architecture. Furthermore, we demonstrate the generalisability and efficiency of our method through experiments on the ImageNet dataset.
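The abstract's core mechanism, multi-conditioned classifier-free guidance, combines an unconditional denoising distribution with several condition-specific ones (e.g. "high accuracy" and "low latency") at sampling time. The sketch below illustrates only that combination rule on toy categorical logits; the function names, toy values, and guidance weights are illustrative assumptions, not the paper's implementation.

```python
import math

def cfg_logits(uncond, conds, weights):
    """Multi-conditioned classifier-free guidance on discrete logits.

    Combines the rule guided = uncond + sum_i w_i * (cond_i - uncond),
    where each cond_i is the model's logits under one condition
    (e.g. high accuracy, low latency) and w_i its guidance weight.
    All names and values here are illustrative, not the paper's API.
    """
    guided = list(uncond)
    for cond, w in zip(conds, weights):
        for k in range(len(guided)):
            guided[k] += w * (cond[k] - uncond[k])
    return guided

def softmax(logits):
    # Numerically stable softmax to turn guided logits into
    # a sampling distribution over discrete states (e.g. node ops).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Toy example: logits over 5 candidate node operations.
uncond = [0.1, 0.2, 0.3, 0.1, 0.3]       # unconditional model
acc_cond = [0.5, 0.1, 0.2, 0.1, 0.1]     # "high accuracy" condition
lat_cond = [0.2, 0.4, 0.1, 0.2, 0.1]     # "low latency" condition

guided = cfg_logits(uncond, [acc_cond, lat_cond], [1.5, 1.0])
probs = softmax(guided)
```

At each denoising step, one would sample the next discrete graph state from `probs`; setting all weights to zero recovers purely unconditional generation.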

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Neural Architecture Search | NAS-Bench-101 | DiNAS | Accuracy (%) | 94.98% | # 1 |
| Neural Architecture Search | NAS-Bench-201, CIFAR-10 | DiNAS | Accuracy (Test) | 94.37 | # 1 |
| | | | Accuracy (Val) | 91.61 | # 1 |
| | | | Search time (s) | 15.36 | # 2 |
| Neural Architecture Search | NAS-Bench-201, CIFAR-100 | DiNAS | Accuracy (Test) | 73.51 | # 1 |
| | | | Accuracy (Val) | 73.49 | # 1 |
| | | | Search time (s) | 15.36 | # 2 |
| Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | DiNAS | Accuracy (Test) | 45.41 | # 25 |
| | | | Search time (s) | 15.36 | # 3 |
| | | | Accuracy (Val) | 46.66 | # 2 |
| Neural Architecture Search | NAS-Bench-301 | DiNAS | Accuracy (Val) | 94.92 | # 1 |
