Neural Architecture Search

779 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS takes the process a human would otherwise perform by hand, tweaking a neural network and learning what works well, and automates it to discover more complex architectures.

Image Credit: NAS with Reinforcement Learning
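
To make the search loop concrete, here is a minimal, hedged sketch of the simplest NAS baseline, random search over a toy search space. The space, the evaluate stub, and every name in it are invented for illustration; evaluate stands in for actually training each candidate.

```python
import random

# Toy search space, invented for illustration: each architecture is a
# choice of depth, width, and activation. Real NAS spaces are far larger.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "gelu", "tanh"],
}

def sample_architecture(rng):
    """Draw one candidate uniformly at random from the space."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate(arch, rng):
    """Stand-in for training the candidate and measuring validation
    accuracy; a fake random score keeps the sketch self-contained."""
    return rng.random()

def random_search(n_trials=20, seed=0):
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch, rng)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

print(random_search())
```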

Libraries

Use these libraries to find Neural Architecture Search models and implementations. See all 24 libraries.

DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions

changlin31/DNA • 2 Mar 2024

Addressing this problem, we modularize a large search space into blocks with small search spaces and develop a family of models with the distilling neural architecture (DNA) techniques.

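As a rough illustration of why block-wise decomposition shrinks the search, the sketch below rates each block's candidates independently and combines the per-block winners. The operation list, block count, and scoring stub are hypothetical; the fake score stands in for DNA's block-wise distillation signal.

```python
import random

OPS = ["conv3x3", "conv5x5", "mbconv3", "mbconv6", "skip"]  # hypothetical
NUM_BLOCKS = 4

def score_block(block_idx, op, rng):
    """Stand-in for rating one block candidate (e.g., by distillation
    loss against a teacher block); a fake score for self-containment."""
    return rng.random()

def blockwise_search(seed=0):
    rng = random.Random(seed)
    # Joint search would evaluate len(OPS) ** NUM_BLOCKS architectures;
    # block-wise search needs only NUM_BLOCKS * len(OPS) evaluations.
    return [max(OPS, key=lambda op: score_block(b, op, rng))
            for b in range(NUM_BLOCKS)]

print(blockwise_search())
```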

Parallel Hyperparameter Optimization Of Spiking Neural Network

thomasfirmin/hpo_snn • 1 Mar 2024

By defining an early stopping criterion detecting silent networks and by designing specific constraints, we were able to instantiate larger and more flexible search spaces.

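The "silent network" criterion lends itself to a simple sketch: abort a trial once the candidate spiking network emits too few spikes for several consecutive epochs, so the hyperparameter search can skip dead configurations cheaply. The threshold, patience, and simulation stub below are invented, not the paper's actual values.

```python
import random

SILENCE_THRESHOLD = 10   # hypothetical: min spikes/epoch to count as active
PATIENCE = 3             # hypothetical: silent epochs tolerated before pruning

def simulate_epoch(config, rng):
    """Stand-in for one epoch of SNN simulation; returns a spike count."""
    return rng.randint(0, 100)

def run_trial(config, max_epochs=50, seed=0):
    rng = random.Random(seed)
    silent_epochs = 0
    for epoch in range(max_epochs):
        spikes = simulate_epoch(config, rng)
        silent_epochs = silent_epochs + 1 if spikes < SILENCE_THRESHOLD else 0
        if silent_epochs >= PATIENCE:
            return f"pruned at epoch {epoch}: silent network"
    return "completed"

print(run_trial(config={}))
```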

FlatNAS: optimizing Flatness in Neural Architecture Search for Out-of-Distribution Robustness

ai-tech-research-lab/nas • 29 Feb 2024

FlatNAS achieves a good trade-off between performance, OOD generalization, and the number of parameters, by using only in-distribution data in the NAS exploration.

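FlatNAS's exact criterion is defined in the paper; as a generic stand-in, one common way to score flatness is to perturb the weights with small random noise and measure how much the loss degrades, since flat minima degrade less. The sketch below assumes a PyTorch model, a loss function, and a data batch are given.

```python
import copy
import torch

def flatness_score(model, loss_fn, inputs, targets, sigma=0.01, n_samples=8):
    """Average loss increase under random weight perturbations of scale
    sigma; lower means flatter. A generic proxy, not FlatNAS's metric."""
    base = loss_fn(model(inputs), targets).item()
    increases = []
    for _ in range(n_samples):
        noisy = copy.deepcopy(model)
        with torch.no_grad():
            for p in noisy.parameters():
                p.add_(sigma * torch.randn_like(p))
        increases.append(loss_fn(noisy(inputs), targets).item() - base)
    return sum(increases) / n_samples
```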

Multi-objective Differentiable Neural Architecture Search

automl/modnas • 28 Feb 2024

Pareto front profiling in multi-objective optimization (MOO), i.e., finding a diverse set of Pareto optimal solutions, is challenging, especially with expensive objectives like neural network training.

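For readers unfamiliar with the term, the sketch below computes a Pareto front for two minimized objectives, here validation error and latency; the candidate numbers are invented.

```python
def dominates(a, b):
    """a dominates b if it is no worse on every (minimized) objective
    and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the non-dominated points: the diverse trade-off set that
    multi-objective NAS tries to profile."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical candidates as (validation error, latency in ms).
candidates = [(0.08, 12.0), (0.10, 7.5), (0.07, 20.0),
              (0.12, 6.0), (0.09, 9.0), (0.11, 10.0)]
print(pareto_front(candidates))  # (0.11, 10.0) is dominated by (0.09, 9.0)
```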

Hierarchical Invariance for Robust and Interpretable Vision Tasks at Larger Scales

shurenqi/hir • 23 Feb 2024

Developing robust and interpretable vision systems is a crucial step towards trustworthy artificial intelligence.


G-NAS: Generalizable Neural Architecture Search for Single Domain Generalization Object Detection

wufan-cse/g-nas • 7 Feb 2024

To address this issue, we propose the Generalizable loss (G-loss), an OoD-aware objective that prevents NAS from over-fitting: gradient descent optimizes parameters not only on a subset of easy-to-learn features but also on the remaining predictive features needed for generalization. The overall framework is named G-NAS.


Group Distributionally Robust Dataset Distillation with Risk Minimization

mming11/robustdatasetdistillation • 7 Feb 2024

However, targeting the training dataset must be regarded as auxiliary: the training set is only an approximate substitute for the population distribution, and the latter is the data of interest.


AutoGCN -- Towards Generic Human Activity Recognition with Neural Architecture Search

deepinmotion/autogcn • 2 Feb 2024

This paper introduces AutoGCN, a generic Neural Architecture Search (NAS) algorithm for Human Activity Recognition (HAR) using Graph Convolution Networks (GCNs).


NACHOS: Neural Architecture Search for Hardware Constrained Early Exit Neural Networks

ai-tech-research-lab/cnas • 24 Jan 2024

To this end, this work presents Neural Architecture Search for Hardware Constrained Early Exit Neural Networks (NACHOS), the first NAS framework for the design of optimal EENNs satisfying constraints on the accuracy and the number of Multiply and Accumulate (MAC) operations performed by the EENNs at inference time.

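To ground the MAC constraint, here is a minimal sketch that estimates the multiply-accumulate count of a candidate's convolutional layers and checks it against a budget. The layer shapes and the budget are invented, and a real counter would cover all operation types, not just convolutions.

```python
def conv2d_macs(c_in, c_out, k, h_out, w_out):
    """MACs for one conv layer: one multiply-accumulate per kernel
    element, per input channel, per output channel, per output pixel."""
    return c_in * c_out * k * k * h_out * w_out

def total_macs(layers):
    return sum(conv2d_macs(**layer) for layer in layers)

# Hypothetical two-layer candidate on 32x32 inputs.
candidate = [
    dict(c_in=3,  c_out=16, k=3, h_out=32, w_out=32),
    dict(c_in=16, c_out=32, k=3, h_out=16, w_out=16),
]
MAC_BUDGET = 20_000_000  # invented hardware budget
print(total_macs(candidate), total_macs(candidate) <= MAC_BUDGET)
```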