Neural Architecture Search
771 papers with code • 26 benchmarks • 27 datasets
Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS takes the process of a human manually tweaking a neural network and learning what works well, and automates it in order to discover more complex architectures.
Image Credit: NAS with Reinforcement Learning
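At its core, NAS combines three ingredients: a search space of candidate architectures, a way to evaluate (or cheaply estimate) each candidate's quality, and a search strategy that proposes new candidates. The sketch below is a minimal random-search loop over a toy MLP search space; the search space, the `build_model` helper, and the short-training proxy score are illustrative assumptions, not any particular paper's method.

```python
# Minimal random-search NAS sketch (illustrative only).
# Assumed search space: small MLPs on a synthetic regression task.
import random
import torch
import torch.nn as nn

SEARCH_SPACE = {
    "depth": [1, 2, 3],
    "width": [16, 32, 64],
    "activation": [nn.ReLU, nn.Tanh],
}

def sample_architecture():
    # A candidate is one choice per dimension of the search space.
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def build_model(arch, in_dim=8, out_dim=1):
    layers, d = [], in_dim
    for _ in range(arch["depth"]):
        layers += [nn.Linear(d, arch["width"]), arch["activation"]()]
        d = arch["width"]
    layers.append(nn.Linear(d, out_dim))
    return nn.Sequential(*layers)

def proxy_score(model, steps=50):
    # Cheap proxy: loss after a few optimization steps on synthetic data (lower is better).
    x, y = torch.randn(256, 8), torch.randn(256, 1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

best = min(
    (sample_architecture() for _ in range(10)),
    key=lambda a: proxy_score(build_model(a)),
)
print("best architecture found:", best)
```

Real NAS systems replace random sampling with reinforcement learning, evolution, or gradient-based search, and replace the toy proxy with full or weight-sharing training, but the loop structure stays the same.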
Libraries
Use these libraries to find Neural Architecture Search models and implementations.
Latest papers
Robustifying and Boosting Training-Free Neural Architecture Search
Nevertheless, the estimation ability of these metrics typically varies across different tasks, making it challenging to achieve robust and consistently good search performance on diverse tasks with only a single training-free metric.
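Training-free (zero-cost) metrics score an architecture without any training, typically from statistics of a single forward/backward pass at initialization. The snippet below is a hedged illustration of one such proxy, the total gradient norm on a random mini-batch; it is a generic example, not the specific metrics studied in this paper.

```python
# Illustrative zero-cost proxy: gradient-norm score at initialization.
# The score definition and "higher is better" assumption are illustrative, not the paper's metric.
import torch
import torch.nn as nn

def grad_norm_score(model, in_dim=8, batch=64):
    x = torch.randn(batch, in_dim)
    y = torch.randn(batch, 1)
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    # Score: sum of L2 norms of parameter gradients at initialization.
    return sum(p.grad.norm().item() for p in model.parameters() if p.grad is not None)

net = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
print("training-free score:", grad_norm_score(net))
```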
Multi-conditioned Graph Diffusion for Neural Architecture Search
To advance the architecture search, we present a graph diffusion-based NAS approach that uses discrete conditional graph diffusion processes to generate high-performing neural network architectures.
ECToNAS: Evolutionary Cross-Topology Neural Architecture Search
We present ECToNAS, a cost-efficient evolutionary cross-topology neural architecture search algorithm that does not require any pre-trained meta controllers.
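Evolutionary NAS maintains a population of architectures and repeatedly mutates and selects them by fitness. The following is a minimal, generic mutation/selection loop; the encoding, the placeholder fitness function, and the tournament selection are assumptions for illustration and do not reproduce ECToNAS's cross-topology operators.

```python
# Generic evolutionary NAS loop (illustrative; not ECToNAS itself).
import random

SPACE = {"cell": ["conv", "dense"], "depth": [2, 4, 6], "width": [16, 32, 64]}

def random_arch():
    return {k: random.choice(v) for k, v in SPACE.items()}

def mutate(arch):
    # Mutate one randomly chosen dimension, possibly switching the topology type.
    child = dict(arch)
    k = random.choice(list(SPACE))
    child[k] = random.choice(SPACE[k])
    return child

def fitness(arch):
    # Placeholder: in practice this would be (partial) training accuracy.
    return -abs(arch["depth"] * arch["width"] - 128) - (arch["cell"] == "dense")

population = [random_arch() for _ in range(8)]
for _ in range(20):
    parent = max(random.sample(population, 3), key=fitness)  # tournament selection
    population.append(mutate(parent))
    population.remove(min(population, key=fitness))           # drop the worst candidate

print("best:", max(population, key=fitness))
```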
On Latency Predictors for Neural Architecture Search
We then design a general latency predictor to comprehensively study (1) the predictor architecture, (2) NN sample selection methods, (3) hardware device representations, and (4) NN operation encoding schemes.
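A latency predictor is typically a small regression model that maps an encoding of a network (its operations and hyperparameters) to latency measured on a target device. Below is a hedged sketch using a simple operation-count encoding and an MLP regressor; the operation set, encoding, and synthetic profiling data are assumptions, not the design studied in the paper.

```python
# Illustrative latency predictor: operation-count encoding -> MLP regression.
import torch
import torch.nn as nn

OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]

def encode(arch_ops):
    # Represent an architecture by how often each operation appears.
    v = torch.zeros(len(OPS))
    for op in arch_ops:
        v[OPS.index(op)] += 1
    return v

predictor = nn.Sequential(nn.Linear(len(OPS), 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(predictor.parameters(), lr=1e-2)

# Synthetic (architecture, measured latency in ms) pairs standing in for device profiling.
train = [
    (["conv3x3", "conv3x3", "maxpool"], 5.1),
    (["conv1x1", "skip", "skip"], 1.3),
    (["conv3x3", "conv1x1", "maxpool", "skip"], 4.2),
]

for _ in range(200):
    for ops, latency in train:
        opt.zero_grad()
        loss = nn.functional.mse_loss(predictor(encode(ops)), torch.tensor([latency]))
        loss.backward()
        opt.step()

print("predicted latency:", predictor(encode(["conv3x3", "skip"])).item())
```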
Encodings for Prediction-based Neural Architecture Search
Building on our study, we present our predictor FLAN: Flow Attention for NAS.
NASH: Neural Architecture Search for Hardware-Optimized Machine Learning Models
We present four versions of the NASH strategy in this paper, all of which show higher accuracy than the original models.
DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions
Addressing this problem, we modularize a large search space into blocks with small search spaces and develop a family of models with the distilling neural architecture (DNA) techniques.
Parallel Hyperparameter Optimization Of Spiking Neural Network
By defining an early stopping criterion detecting silent networks and by designing specific constraints, we were able to instantiate larger and more flexible search spaces.
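A "silent" spiking network is one that emits few or no spikes, so continuing to evaluate it wastes compute; an early stopping criterion can detect and abort such trials. The check below is a hedged sketch using a firing-rate threshold on a binary spike train; the representation and the threshold value are assumptions, not the paper's exact criterion.

```python
# Illustrative early-stopping check for silent spiking networks.
# Spike trains are binary arrays of shape (time steps, neurons); the threshold is assumed.
import numpy as np

def is_silent(spike_train, min_spike_rate=0.01):
    # Abort the trial early if the average firing rate falls below the threshold.
    return spike_train.mean() < min_spike_rate

spikes = (np.random.rand(100, 64) < 0.002).astype(float)  # a nearly silent network
if is_silent(spikes):
    print("silent network detected: stop this trial early")
```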
FlatNAS: optimizing Flatness in Neural Architecture Search for Out-of-Distribution Robustness
FlatNAS achieves a good trade-off between performance, OOD generalization, and the number of parameters, by using only in-distribution data in the NAS exploration.
Multi-objective Differentiable Neural Architecture Search
Pareto front profiling in multi-objective optimization (MOO), i.e., finding a diverse set of Pareto-optimal solutions, is challenging, especially with expensive objectives such as neural network training.
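A solution is Pareto optimal if no other solution is at least as good on every objective and strictly better on at least one; profiling the Pareto front means finding a diverse set of such solutions. The helper below is a minimal dominance filter over illustrative (accuracy, latency) pairs, included only to make the definition concrete; it is not the paper's differentiable method.

```python
# Minimal Pareto-front filter for two objectives: maximize accuracy, minimize latency.
def dominates(a, b):
    # a dominates b if a is no worse on both objectives and strictly better on one.
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

def pareto_front(points):
    return [p for p in points if not any(dominates(q, p) for q in points)]

candidates = [(0.92, 12.0), (0.90, 8.0), (0.85, 5.0), (0.88, 9.0)]  # (accuracy, ms)
print(pareto_front(candidates))  # (0.88, 9.0) is dominated by (0.90, 8.0)
```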