Neural Architecture Search

771 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in machine learning. NAS takes the process by which a human manually tweaks a neural network and learns what works well, and automates that process to discover more complex, better-performing architectures.
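
At its simplest, NAS is a search loop: sample an architecture from a search space, estimate its quality, keep the best. Below is a minimal random-search sketch, where train_and_evaluate is a hypothetical placeholder for actually training and validating each candidate.

import random

# Toy search space: depth, width, and activation are the only choices.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "hidden_units": [64, 128, 256],
    "activation": ["relu", "tanh", "gelu"],
}

def sample_architecture(space):
    """Draw one candidate architecture uniformly at random."""
    return {name: random.choice(options) for name, options in space.items()}

def train_and_evaluate(arch):
    """Placeholder: a real NAS loop trains the candidate and returns
    validation accuracy. Here we return a random score instead."""
    return random.random()

def random_search(space, budget=20):
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture(space)
        score = train_and_evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search(SEARCH_SPACE)
    print("best architecture:", arch, "score:", round(score, 3))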

Image credit: Neural Architecture Search with Reinforcement Learning

Libraries

Use these libraries to find Neural Architecture Search models and implementations
See all 24 libraries.

Robustifying and Boosting Training-Free Neural Architecture Search

hzf1174/robot 12 Mar 2024

Nevertheless, the estimation ability of these metrics typically varies across different tasks, making it challenging to achieve robust and consistently good search performance on diverse tasks with only a single training-free metric.

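The general idea behind combining training-free metrics can be sketched as follows. This is an illustration, not the authors' RoBoT method: both proxies (proxy_param_count, proxy_weight_dispersion) are toy stand-ins for real zero-cost metrics such as SNIP or synflow, and the mixing weights would in practice be tuned per task.

import numpy as np

rng = np.random.default_rng(0)

def proxy_param_count(arch):
    """Toy zero-cost proxy 1: raw parameter count."""
    return arch["num_layers"] * arch["hidden_units"] ** 2

def proxy_weight_dispersion(arch):
    """Toy zero-cost proxy 2: dispersion of weights at random init."""
    w = rng.standard_normal((arch["num_layers"], arch["hidden_units"]))
    return float(np.std(w))

def rank_normalise(values):
    """Map raw proxy values to ranks in [0, 1] so proxies on very
    different scales can be mixed."""
    order = np.argsort(np.argsort(values))
    return order / (len(values) - 1)

candidates = [{"num_layers": l, "hidden_units": h}
              for l in (2, 4, 8) for h in (64, 128, 256)]

p1 = rank_normalise([proxy_param_count(a) for a in candidates])
p2 = rank_normalise([proxy_weight_dispersion(a) for a in candidates])

weights = (0.5, 0.5)   # in practice these would be tuned per task
combined = weights[0] * p1 + weights[1] * p2
print("highest-scoring candidate:", candidates[int(np.argmax(combined))])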

Multi-conditioned Graph Diffusion for Neural Architecture Search

rohanasthana/dinas 9 Mar 2024

To advance the architecture search, we present a graph diffusion-based NAS approach that uses discrete conditional graph diffusion processes to generate high-performing neural network architectures.

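A toy sketch of the forward (noising) half of a discrete graph diffusion over a cell's operation labels; a generative model trained to reverse this corruption could then sample architectures. All names, sizes, and the corruption schedule are illustrative, not taken from the paper's code.

import numpy as np

rng = np.random.default_rng(0)

NUM_OPS = 4          # e.g. {conv3x3, conv1x1, maxpool, skip}
NUM_NODES = 6        # nodes of a NAS cell graph
T = 10               # diffusion steps

def corrupt(ops, t, T):
    """Forward diffusion step: with probability t/T, resample each
    node's operation label uniformly at random."""
    flip = rng.random(ops.shape) < (t / T)
    noise = rng.integers(0, NUM_OPS, size=ops.shape)
    return np.where(flip, noise, ops)

ops = rng.integers(0, NUM_OPS, size=NUM_NODES)   # a clean architecture
print("t=0 :", ops)
for t in (3, 6, 10):
    print(f"t={t:<2}:", corrupt(ops, t, T))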

ECToNAS: Evolutionary Cross-Topology Neural Architecture Search

elisabethjs/ectonas 8 Mar 2024

We present ECToNAS, a cost-efficient evolutionary cross-topology neural architecture search algorithm that does not require any pre-trained meta controllers.

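A generic evolutionary NAS loop with topology-changing mutations, sketched under the assumption of a placeholder fitness function (a real run would train each candidate briefly). This is not the ECToNAS implementation.

import random

random.seed(0)

def random_topology():
    """A candidate is a list of layer widths; its length is the depth."""
    return [random.choice([32, 64, 128]) for _ in range(random.randint(2, 5))]

def mutate(topology):
    """Cross-topology mutation: grow, shrink, or resize a layer."""
    child = list(topology)
    op = random.choice(["grow", "shrink", "resize"])
    if op == "grow":
        child.insert(random.randrange(len(child) + 1), random.choice([32, 64, 128]))
    elif op == "shrink" and len(child) > 2:
        child.pop(random.randrange(len(child)))
    else:
        child[random.randrange(len(child))] = random.choice([32, 64, 128])
    return child

def fitness(topology):
    """Placeholder: a real run would train briefly and return accuracy;
    the size penalty stands in for a compute-cost term."""
    return random.random() - 0.01 * sum(topology) / 128

population = [random_topology() for _ in range(8)]
for generation in range(10):
    population.sort(key=fitness, reverse=True)
    survivors = population[:4]                       # truncation selection
    population = survivors + [mutate(random.choice(survivors)) for _ in range(4)]
print("best topology:", max(population, key=fitness))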

On Latency Predictors for Neural Architecture Search

abdelfattah-lab/nasflat_latency 4 Mar 2024

We then design a general latency predictor to comprehensively study (1) the predictor architecture, (2) NN sample selection methods, (3) hardware device representations, and (4) NN operation encoding schemes.

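A minimal latency-predictor sketch: architectures are mapped to numeric features and a linear model is fitted by least squares. The feature encoding and the synthetic "measurements" are assumptions standing in for real device data.

import numpy as np

rng = np.random.default_rng(0)

def encode(arch):
    """Simple NN encoding: depth, depth-times-width, and a bias term.
    Richer encodings (per-op counts, device embeddings) would go here."""
    return np.array([arch["depth"], arch["width"] * arch["depth"], 1.0])

# Synthetic training set: in practice these would be measured on-device.
archs = [{"depth": d, "width": w} for d in range(2, 10) for w in (64, 128, 256)]
X = np.stack([encode(a) for a in archs])
true_w = np.array([0.8, 0.004, 2.0])                 # hidden "device model"
y = X @ true_w + 0.1 * rng.standard_normal(len(X))   # noisy latencies (ms)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)         # fit the predictor

query = {"depth": 6, "width": 192}
print("predicted latency (ms):", round(float(encode(query) @ coef), 2))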

Encodings for Prediction-based Neural Architecture Search

abdelfattah-lab/flan_nas 4 Mar 2024

Building on our study, we present our predictor FLAN: Flow Attention for NAS.

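FLAN itself learns attention-based encodings; the sketch below shows only the most basic ingredient such predictors consume, a one-hot operation encoding concatenated with a flattened cell adjacency matrix. Cell size and operation set are illustrative.

import numpy as np

OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]

def encode_cell(node_ops, adjacency):
    """Encode a cell as [one-hot ops | flattened adjacency]."""
    one_hot = np.zeros((len(node_ops), len(OPS)))
    for i, op in enumerate(node_ops):
        one_hot[i, OPS.index(op)] = 1.0
    return np.concatenate([one_hot.ravel(), np.asarray(adjacency).ravel()])

# A 3-node cell: node 0 feeds nodes 1 and 2, node 1 feeds node 2.
ops = ["conv3x3", "maxpool", "conv1x1"]
adj = [[0, 1, 1],
       [0, 0, 1],
       [0, 0, 0]]
vec = encode_cell(ops, adj)
print("encoding length:", vec.size)   # 3*4 op bits + 9 adjacency entries = 21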

NASH: Neural Architecture Search for Hardware-Optimized Machine Learning Models

mfji/nash 4 Mar 2024

We present four versions of the NASH strategy in this paper, all of which show higher accuracy than the original models.


DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions

changlin31/DNA 2 Mar 2024

Addressing this problem, we modularize a large search space into blocks with small search spaces and develop a family of models with the distilling neural architecture (DNA) techniques.

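The block-wise supervision idea can be sketched with PyTorch: each student block is trained to reproduce the matching teacher block's output features, so candidate blocks can be rated independently of the rest of the network. The tiny linear blocks below are illustrative, not the DNA models.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Teacher and student are split into aligned blocks; both are tiny here.
teacher_blocks = nn.ModuleList([nn.Linear(16, 16) for _ in range(3)])
student_blocks = nn.ModuleList([nn.Sequential(nn.Linear(16, 16), nn.ReLU(),
                                              nn.Linear(16, 16)) for _ in range(3)])

opt = torch.optim.Adam(student_blocks.parameters(), lr=1e-3)
mse = nn.MSELoss()

for step in range(100):
    x = torch.randn(32, 16)
    loss, h = 0.0, x
    for t_block, s_block in zip(teacher_blocks, student_blocks):
        with torch.no_grad():
            target = t_block(h)           # teacher features for this block
        loss = loss + mse(s_block(h), target)
        h = target                        # feed teacher features forward
    opt.zero_grad()
    loss.backward()
    opt.step()
print("final block-wise distillation loss:", float(loss))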

Parallel Hyperparameter Optimization Of Spiking Neural Network

thomasfirmin/hpo_snn 1 Mar 2024

By defining an early stopping criterion detecting silent networks and by designing specific constraints, we were able to instantiate larger and more flexible search spaces.

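A sketch of the silent-network criterion with a toy leaky integrate-and-fire population: a candidate that emits no spikes during a warm-up window is abandoned without further simulation. All constants are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def simulate_lif(input_gain, steps=200, threshold=1.0, leak=0.9):
    """Toy LIF population of 10 neurons; returns spike count per step."""
    v = np.zeros(10)                       # membrane potentials
    counts = []
    for _ in range(steps):
        v = leak * v + input_gain * rng.random(10)
        spikes = v >= threshold
        v[spikes] = 0.0                    # reset after spiking
        counts.append(int(spikes.sum()))
    return counts

def evaluate_with_early_stop(input_gain, warmup=50):
    if sum(simulate_lif(input_gain, steps=warmup)) == 0:
        return None                        # silent network: stop immediately
    return sum(simulate_lif(input_gain)) / 200  # mean spike rate, toy "score"

print("weak input  :", evaluate_with_early_stop(0.001))  # silent -> None
print("strong input:", evaluate_with_early_stop(0.5))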

FlatNAS: optimizing Flatness in Neural Architecture Search for Out-of-Distribution Robustness

ai-tech-research-lab/nas 29 Feb 2024

FlatNAS achieves a good trade-off between performance, OOD generalization, and the number of parameters, by using only in-distribution data in the NAS exploration.

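One common way to probe flatness, sketched below, is to perturb trained weights at a fixed radius and measure the average loss increase; flatter minima rise less. The quadratic "models" are toys, and this illustrates the flatness signal rather than FlatNAS's exact figure of merit.

import numpy as np

rng = np.random.default_rng(0)

# Toy "trained model": a quadratic loss whose curvature plays the role
# of sharpness. Small curvature = flat minimum.
def make_loss(curvature):
    return lambda w: 0.5 * curvature * float(w @ w)

def flatness_score(loss, w_star, radius=0.1, samples=100):
    """Average loss increase under random weight perturbations of a
    fixed radius; lower = flatter."""
    base = loss(w_star)
    rises = []
    for _ in range(samples):
        d = rng.standard_normal(w_star.shape)
        d = radius * d / np.linalg.norm(d)
        rises.append(loss(w_star + d) - base)
    return float(np.mean(rises))

w_star = np.zeros(50)                      # minimiser of both toy losses
print("flat model :", flatness_score(make_loss(0.1), w_star))
print("sharp model:", flatness_score(make_loss(10.0), w_star))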

Multi-objective Differentiable Neural Architecture Search

automl/modnas 28 Feb 2024

Pareto front profiling in multi-objective optimization (MOO), i.e., finding a diverse set of Pareto-optimal solutions, is challenging, especially with expensive objectives like neural network training.

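The object being profiled is the Pareto front. The sketch below extracts the non-dominated set from a pool of hypothetical (accuracy, latency) pairs, with accuracy maximised and latency minimised; the numbers are made up for illustration.

def pareto_front(candidates):
    """Return candidates not dominated by any other: a dominator is at
    least as accurate AND at least as fast, and strictly better in one."""
    front = []
    for a in candidates:
        dominated = any(
            b["acc"] >= a["acc"] and b["lat"] <= a["lat"]
            and (b["acc"] > a["acc"] or b["lat"] < a["lat"])
            for b in candidates
        )
        if not dominated:
            front.append(a)
    return front

pool = [{"acc": 0.92, "lat": 12.0}, {"acc": 0.90, "lat": 6.0},
        {"acc": 0.95, "lat": 30.0}, {"acc": 0.89, "lat": 9.0}]
print(pareto_front(pool))   # the 0.89/9.0 point is dominated and dropped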