Neural Architecture Search
779 papers with code • 26 benchmarks • 27 datasets
Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS takes the process a human designer would follow by hand, tweaking a network and learning what works well, and automates it to discover more complex, better-performing architectures.
Image credit: NAS with Reinforcement Learning
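At its core, most NAS methods share the same outer loop: sample a candidate architecture from a search space, estimate its quality, and keep the best. A minimal sketch of that loop, using random search and a placeholder evaluation function (the search space and names here are illustrative, not from any specific paper):

```python
import random

# Toy search space: depth, width, and activation for a feed-forward net.
SEARCH_SPACE = {
    "depth": [2, 4, 6, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "gelu", "tanh"],
}

def sample_architecture(rng):
    """Draw one candidate architecture from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    """Placeholder for 'train the candidate, return validation accuracy'.
    In a real system this is the expensive inner loop of NAS."""
    rng = random.Random(str(sorted(arch.items())))  # deterministic stub
    return rng.uniform(0.5, 0.95)

def random_search(n_trials=20, seed=0):
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search()
    print(f"best architecture: {arch} (val acc ~ {score:.3f})")
```

Real NAS methods replace random sampling with reinforcement learning, evolution, or gradient-based relaxations, and replace the stub evaluation with (often weight-shared) training.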
Libraries
Use these libraries to find Neural Architecture Search models and implementations.
Latest papers
DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions
To address this problem, we modularize a large search space into blocks with small search spaces and develop a family of models with distilling neural architecture (DNA) techniques.
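The paper's full procedure is more involved, but the core idea of block-wise supervision can be sketched simply: each student block takes the teacher's previous-stage features as input and is trained to reproduce the teacher's corresponding block output, so blocks can be searched and trained independently. A hypothetical PyTorch sketch (module shapes and names are assumptions):

```python
import torch
import torch.nn as nn

# Hypothetical teacher/student split into aligned blocks (illustrative only).
teacher_blocks = nn.ModuleList([
    nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU()),
    nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU()),
])
student_blocks = nn.ModuleList([
    nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU()),
    nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU()),
])

def blockwise_distill_loss(x):
    """Supervise each student block with the matching teacher block's features.
    Each block sees the *teacher's* previous-stage output as input, which is
    what decouples the blocks and shrinks each block's search problem."""
    loss, t_in = 0.0, x
    for t_block, s_block in zip(teacher_blocks, student_blocks):
        with torch.no_grad():
            t_out = t_block(t_in)        # target features for this block
        s_out = s_block(t_in)            # student block gets the same input
        loss = loss + nn.functional.mse_loss(s_out, t_out)
        t_in = t_out                     # next block consumes teacher features
    return loss

x = torch.randn(2, 3, 32, 32)
print(blockwise_distill_loss(x).item())
```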
Parallel Hyperparameter Optimization Of Spiking Neural Network
By defining an early stopping criterion detecting silent networks and by designing specific constraints, we were able to instantiate larger and more flexible search spaces.
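A "silent" spiking network, one that emits almost no spikes, can be detected cheaply from a short probe run and the trial pruned before full training. A minimal sketch of such a criterion (the threshold and probe setup are assumptions, not the paper's exact values):

```python
import numpy as np

def is_silent(spike_trains, min_rate=1e-3):
    """Early-stopping test: a 'silent' network emits (almost) no spikes,
    so further training/evaluation is wasted compute.
    spike_trains: array of shape (neurons, timesteps) with 0/1 spikes."""
    return spike_trains.mean() < min_rate

# Toy probe: a candidate whose thresholds are too high never fires.
rng = np.random.default_rng(0)
active = (rng.random((100, 200)) < 0.05).astype(int)   # ~5% firing rate
silent = np.zeros((100, 200), dtype=int)               # no spikes at all

for name, trains in [("active", active), ("silent", silent)]:
    if is_silent(trains):
        print(f"{name}: prune trial early (silent network)")
    else:
        print(f"{name}: continue full evaluation")
```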
FlatNAS: optimizing Flatness in Neural Architecture Search for Out-of-Distribution Robustness
FlatNAS achieves a good trade-off between performance, OOD generalization, and the number of parameters, by using only in-distribution data in the NAS exploration.
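FlatNAS builds its figure of merit on sharpness-aware minimization; purely as an illustration of the underlying quantity, flatness can be proxied by how much the loss rises under small random weight perturbations. A hypothetical estimator (not the paper's exact metric):

```python
import torch
import torch.nn as nn

def flatness_score(model, loss_fn, x, y, rho=0.05, n_samples=8):
    """Crude flatness proxy: mean loss increase under random weight
    perturbations of norm ~rho. A smaller increase suggests a flatter minimum."""
    params = [p for p in model.parameters() if p.requires_grad]
    increases = []
    with torch.no_grad():
        base = loss_fn(model(x), y).item()
        for _ in range(n_samples):
            noise = [torch.randn_like(p) for p in params]
            norm = torch.sqrt(sum(n.pow(2).sum() for n in noise))
            scale = (rho / (norm + 1e-12)).item()
            for p, n in zip(params, noise):
                p.add_(n, alpha=scale)           # perturb weights
            increases.append(loss_fn(model(x), y).item() - base)
            for p, n in zip(params, noise):
                p.sub_(n, alpha=scale)           # restore weights exactly
    return sum(increases) / n_samples

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
x, y = torch.randn(64, 10), torch.randint(0, 2, (64,))
print(flatness_score(model, nn.CrossEntropyLoss(), x, y))
```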
Multi-objective Differentiable Neural Architecture Search
Pareto front profiling in multi-objective optimization (MOO), i.e., finding a diverse set of Pareto-optimal solutions, is challenging, especially with expensive objectives like neural network training.
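The object being profiled here is the Pareto front: the set of candidates not dominated by any other in all objectives, for example accuracy (maximized) and latency (minimized). A minimal sketch of extracting the front from a handful of candidates (the numbers are illustrative):

```python
def pareto_front(points):
    """Return the non-dominated subset of (accuracy, latency) candidates.
    We maximize accuracy and minimize latency; a point is dominated if another
    point is at least as good in both objectives and strictly better in one."""
    front = []
    for i, (acc_i, lat_i) in enumerate(points):
        dominated = any(
            acc_j >= acc_i and lat_j <= lat_i and (acc_j > acc_i or lat_j < lat_i)
            for j, (acc_j, lat_j) in enumerate(points) if j != i
        )
        if not dominated:
            front.append((acc_i, lat_i))
    return front

candidates = [(0.92, 12.0), (0.90, 8.0), (0.91, 15.0), (0.89, 7.5), (0.92, 14.0)]
print(pareto_front(candidates))  # -> [(0.92, 12.0), (0.90, 8.0), (0.89, 7.5)]
```

Differentiable MOO methods aim to produce such fronts without exhaustively training every candidate.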
Hierarchical Invariance for Robust and Interpretable Vision Tasks at Larger Scales
Developing robust and interpretable vision systems is a crucial step towards trustworthy artificial intelligence.
G-NAS: Generalizable Neural Architecture Search for Single Domain Generalization Object Detection
To address this issue, we propose the Generalizable loss (G-loss), an OoD-aware objective that prevents NAS from over-fitting: gradient descent is used to optimize parameters not only on a subset of easy-to-learn features but also on the remaining predictive features needed for generalization. The overall framework is named G-NAS.
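The exact form of the G-loss is defined in the paper; purely as an illustration of the idea, forcing optimization to exploit all predictive feature groups rather than only the easiest ones, one can penalize the worst-performing feature group alongside the average. A hypothetical sketch (the class, heads, and grouping are assumptions, not the paper's method):

```python
import torch
import torch.nn as nn

class FeatureGroupLoss(nn.Module):
    """Hypothetical illustration (not the paper's exact G-loss): classify from
    each feature group separately and penalize the worst group, so optimization
    cannot ignore harder-to-learn but still predictive features."""
    def __init__(self, feat_dim, n_groups, n_classes):
        super().__init__()
        assert feat_dim % n_groups == 0
        self.n_groups = n_groups
        self.heads = nn.ModuleList(
            nn.Linear(feat_dim // n_groups, n_classes) for _ in range(n_groups)
        )
        self.ce = nn.CrossEntropyLoss()

    def forward(self, features, labels):
        chunks = features.chunk(self.n_groups, dim=1)
        losses = torch.stack([self.ce(h(c), labels) for h, c in zip(self.heads, chunks)])
        return losses.mean() + losses.max()   # push down the hardest group too

feats, labels = torch.randn(16, 64), torch.randint(0, 5, (16,))
print(FeatureGroupLoss(64, n_groups=4, n_classes=5)(feats, labels).item())
```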
Group Distributionally Robust Dataset Distillation with Risk Minimization
However, matching the training dataset should be viewed as an auxiliary goal: the training set is itself only an approximate substitute for the population distribution, and the latter is the data of actual interest.
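The "distributionally robust" part typically follows the group DRO recipe (in the style of Sagawa et al., on which such work builds): maintain a weight per group, up-weight the groups with the highest current loss, and minimize the weighted loss. A minimal sketch of one such update (the step size and setup are assumptions):

```python
import torch

def group_dro_loss(per_group_losses, group_weights, eta=0.1):
    """One group-DRO step: exponentiated-gradient update on the group weights
    toward the currently worst groups, then a weighted average loss.
    per_group_losses: tensor of shape (n_groups,) with each group's mean loss."""
    with torch.no_grad():
        group_weights *= torch.exp(eta * per_group_losses)
        group_weights /= group_weights.sum()
    return (group_weights * per_group_losses).sum()

n_groups = 3
weights = torch.full((n_groups,), 1.0 / n_groups)
losses = torch.tensor([0.2, 1.5, 0.4])   # group 1 is currently the hardest
robust = group_dro_loss(losses, weights)
print(weights, robust.item())            # weights shift toward the hardest group
```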
AutoGCN -- Towards Generic Human Activity Recognition with Neural Architecture Search
This paper introduces AutoGCN, a generic Neural Architecture Search (NAS) algorithm for Human Activity Recognition (HAR) using Graph Convolution Networks (GCNs).
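For a GCN search space, a NAS candidate is essentially a configuration, depth, hidden width, dropout, and so on, that gets sampled and instantiated as a network. A hypothetical sketch with an illustrative space (not AutoGCN's actual one) and a minimal graph convolution X ← ReLU(Â X W):

```python
import random
import torch
import torch.nn as nn

# Hypothetical GCN search space (names and values are illustrative).
SPACE = {"n_layers": [2, 3, 4], "hidden": [32, 64, 128], "dropout": [0.0, 0.25, 0.5]}

class SimpleGCN(nn.Module):
    """Minimal graph convolution stack: x <- ReLU(a_hat @ x @ W) per layer."""
    def __init__(self, in_dim, n_classes, cfg):
        super().__init__()
        dims = [in_dim] + [cfg["hidden"]] * cfg["n_layers"]
        self.layers = nn.ModuleList(nn.Linear(a, b) for a, b in zip(dims, dims[1:]))
        self.drop = nn.Dropout(cfg["dropout"])
        self.out = nn.Linear(dims[-1], n_classes)

    def forward(self, x, a_hat):          # a_hat: normalized adjacency (N, N)
        for layer in self.layers:
            x = self.drop(torch.relu(a_hat @ layer(x)))
        return self.out(x)

rng = random.Random(0)
cfg = {k: rng.choice(v) for k, v in SPACE.items()}   # one sampled candidate
model = SimpleGCN(in_dim=16, n_classes=10, cfg=cfg)
x, a_hat = torch.randn(25, 16), torch.eye(25)        # 25 joints, stub adjacency
print(cfg, model(x, a_hat).shape)
```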
NACHOS: Neural Architecture Search for Hardware Constrained Early Exit Neural Networks
To this end, this work presents Neural Architecture Search for Hardware Constrained Early Exit Neural Networks (NACHOS), the first NAS framework for designing optimal EENNs under constraints on accuracy and on the number of Multiply and Accumulate (MAC) operations performed at inference time.
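A hardware constraint like a MAC budget can be enforced by statically counting each candidate's multiply-accumulate operations and rejecting over-budget architectures before training. A minimal sketch for convolutional layers (the budget and layer shapes are illustrative):

```python
def conv_macs(c_in, c_out, k, h_out, w_out):
    """Multiply-accumulate count of one conv layer (no bias): every output
    element needs c_in * k * k multiplies, for c_out * h_out * w_out outputs."""
    return c_in * c_out * k * k * h_out * w_out

def satisfies_budget(layers, mac_budget):
    """Hard constraint used to filter NAS candidates before training."""
    total = sum(conv_macs(*layer) for layer in layers)
    return total <= mac_budget, total

# Candidate backbone on 32x32 inputs: (c_in, c_out, kernel, h_out, w_out)
candidate = [(3, 16, 3, 32, 32), (16, 32, 3, 16, 16), (32, 64, 3, 8, 8)]
ok, macs = satisfies_budget(candidate, mac_budget=5_000_000)
print(f"{macs:,} MACs ->", "keep" if ok else "reject")
```

For early-exit networks, the same count would be taken per exit, since inputs that leave at an early exit incur only that exit's MACs.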
M2-Mixer: A Multimodal Mixer with Multi-head Loss for Classification from Multimodal Data
4 times reduction in training time and +0.33% in accuracy and 13…
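The multi-head loss in the title can be read as: each modality gets its own auxiliary classification head whose loss is added to the fused head's loss, so every branch receives a direct gradient signal. A hypothetical sketch (the weights, dimensions, and names are assumptions, not the paper's code):

```python
import torch
import torch.nn as nn

class MultiHeadMultimodalLoss(nn.Module):
    """Illustrative multi-head objective: one auxiliary classifier per modality
    plus a fused classifier, combined as a weighted sum of cross-entropies."""
    def __init__(self, dims, n_classes, aux_weight=0.3):
        super().__init__()
        self.heads = nn.ModuleList(nn.Linear(d, n_classes) for d in dims)
        self.fused = nn.Linear(sum(dims), n_classes)
        self.ce = nn.CrossEntropyLoss()
        self.aux_weight = aux_weight

    def forward(self, modality_feats, labels):
        aux = sum(self.ce(h(f), labels) for h, f in zip(self.heads, modality_feats))
        main = self.ce(self.fused(torch.cat(modality_feats, dim=1)), labels)
        return main + self.aux_weight * aux

img, txt = torch.randn(8, 128), torch.randn(8, 64)   # two modality embeddings
labels = torch.randint(0, 4, (8,))
loss = MultiHeadMultimodalLoss([128, 64], n_classes=4)([img, txt], labels)
print(loss.item())
```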