Neural Architecture Search
780 papers with code • 26 benchmarks • 27 datasets
Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in the field of machine learning. NAS takes the process of a human manually tweaking a neural network and learning what works well, and automates it to discover architectures that a manual search might miss.
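At its simplest, that automation is a loop: sample a candidate architecture from a search space, estimate its quality, and keep the best one. The sketch below illustrates this with random search over a toy search space; the search space, the `evaluate` scoring function, and all names are illustrative assumptions (in a real NAS system, `evaluate` would train the candidate and return its validation accuracy, which is the expensive step).

```python
import random

# Toy search space: each candidate architecture is a (depth, width, activation) choice.
SEARCH_SPACE = {
    "depth": [2, 3, 4, 5],
    "width": [16, 32, 64, 128],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Draw one candidate uniformly at random from the search space."""
    return {key: rng.choice(values) for key, values in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for 'train the candidate and measure validation accuracy'.

    This toy proxy just rewards a particular capacity (depth * width); it only
    exists so the loop runs end to end without any training.
    """
    capacity = arch["depth"] * arch["width"]
    return -abs(capacity - 192)  # higher is better, 0 is the best possible

def random_search(n_trials=50, seed=0):
    """Sample n_trials candidates and return the best (architecture, score)."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

Random search is the usual baseline; the methods below replace the sampler (reinforcement learning, evolution, diffusion) or the evaluator (training-free metrics, latency predictors) with something smarter.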
Image Credit: NAS with Reinforcement Learning
Libraries
Use these libraries to find Neural Architecture Search models and implementations
Latest papers
Building Optimal Neural Architectures using Interpretable Knowledge
Neural Architecture Search is a costly practice.
Boosting Order-Preserving and Transferability for Neural Architecture Search: a Joint Architecture Refined Search and Fine-tuning Approach
In this work, we analyze the order-preserving ability on the whole search space (global) and on a sub-space of top architectures (local), and empirically show that the local order-preserving ability of current two-stage NAS methods still needs to be improved.
Multi-Objective Evolutionary Neural Architecture Search for Recurrent Neural Networks
Artificial neural network (NN) architecture design is a nontrivial and time-consuming task that often requires a high level of human expertise.
Efficient Multiplayer Battle Game Optimizer for Adversarial Robust Neural Architecture Search
This paper introduces a novel metaheuristic algorithm, known as the efficient multiplayer battle game optimizer (EMBGO), specifically designed for addressing complex numerical optimization tasks.
SpokeN-100: A Cross-Lingual Benchmarking Dataset for The Classification of Spoken Numbers in Different Languages
Benchmarking plays a pivotal role in assessing and enhancing the performance of compact deep learning models designed for execution on resource-constrained devices, such as microcontrollers.
Robustifying and Boosting Training-Free Neural Architecture Search
Nevertheless, the estimation ability of these metrics typically varies across different tasks, making it challenging to achieve robust and consistently good search performance on diverse tasks with only a single training-free metric.
Multi-conditioned Graph Diffusion for Neural Architecture Search
To advance the architecture search, we present a graph diffusion-based NAS approach that uses discrete conditional graph diffusion processes to generate high-performing neural network architectures.
ECToNAS: Evolutionary Cross-Topology Neural Architecture Search
We present ECToNAS, a cost-efficient evolutionary cross-topology neural architecture search algorithm that does not require any pre-trained meta controllers.
On Latency Predictors for Neural Architecture Search
We then design a general latency predictor to comprehensively study (1) the predictor architecture, (2) NN sample selection methods, (3) hardware device representations, and (4) NN operation encoding schemes.
Encodings for Prediction-based Neural Architecture Search
Building on our study, we present our predictor FLAN: Flow Attention for NAS.