Neural Architecture Search

780 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in machine learning. NAS takes the process of a human manually tweaking a neural network and learning what works well, and automates it to discover more complex, better-performing architectures.

Image Credit: NAS with Reinforcement Learning
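At its core, NAS is a loop: sample an architecture from a search space, evaluate it (ordinarily by training it), and use the results to guide further sampling. A minimal sketch of that loop using random search as the strategy, with an illustrative toy search space and a dummy proxy in place of real training (all names here are hypothetical, not from any specific paper):

```python
import random

# Hypothetical toy search space: a few discrete choices per knob.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "width": [32, 64, 128],
    "activation": ["relu", "tanh", "gelu"],
}

def sample_architecture(rng):
    """Search strategy: here, plain random sampling over the space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for 'train the network and measure validation accuracy'.
    Real NAS spends almost all of its cost in this step."""
    return arch["num_layers"] * 0.1 + arch["width"] / 256  # dummy proxy score

def random_search(trials=20, seed=0):
    """The NAS loop: sample, evaluate, keep the best architecture seen."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

More sophisticated strategies (reinforcement learning, evolution, gradient-based and diffusion-based methods, as in the papers below) replace the random sampler, and training-free or predictor-based methods replace the costly evaluation step.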

Libraries

Use these libraries to find Neural Architecture Search models and implementations

Building Optimal Neural Architectures using Interpretable Knowledge

ascend-research/autobuild 20 Mar 2024

Neural Architecture Search is a costly practice.

Boosting Order-Preserving and Transferability for Neural Architecture Search: a Joint Architecture Refined Search and Fine-tuning Approach

beichenzbc/supernet-shifting 18 Mar 2024

In this work, we analyze the order-preserving ability on the whole search space (global) and a sub-space of top architectures (local), and empirically show that the local order-preserving ability of current two-stage NAS methods still needs to be improved.

Multi-Objective Evolutionary Neural Architecture Search for Recurrent Neural Networks

reinn-cs/rnn-nas 17 Mar 2024

Artificial neural network (NN) architecture design is a nontrivial and time-consuming task that often requires a high level of human expertise.

Efficient Multiplayer Battle Game Optimizer for Adversarial Robust Neural Architecture Search

ruizhong961230/embgo 15 Mar 2024

This paper introduces a novel metaheuristic algorithm, known as the efficient multiplayer battle game optimizer (EMBGO), specifically designed for addressing complex numerical optimization tasks.

SpokeN-100: A Cross-Lingual Benchmarking Dataset for The Classification of Spoken Numbers in Different Languages

ankilab/spoken-100 14 Mar 2024

Benchmarking plays a pivotal role in assessing and enhancing the performance of compact deep learning models designed for execution on resource-constrained devices, such as microcontrollers.

Robustifying and Boosting Training-Free Neural Architecture Search

hzf1174/robot 12 Mar 2024

Nevertheless, the estimation ability of these metrics typically varies across different tasks, making it challenging to achieve robust and consistently good search performance on diverse tasks with only a single training-free metric.

Multi-conditioned Graph Diffusion for Neural Architecture Search

rohanasthana/dinas 9 Mar 2024

To advance the architecture search, we present a graph diffusion-based NAS approach that uses discrete conditional graph diffusion processes to generate high-performing neural network architectures.

ECToNAS: Evolutionary Cross-Topology Neural Architecture Search

elisabethjs/ectonas 8 Mar 2024

We present ECToNAS, a cost-efficient evolutionary cross-topology neural architecture search algorithm that does not require any pre-trained meta controllers.
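Evolutionary NAS methods like this maintain a population of architectures and improve it through mutation and selection. A generic steady-state evolutionary loop, for illustration only (this is not the ECToNAS algorithm; the operation set and dummy fitness are invented for the sketch):

```python
import random

# Illustrative operation vocabulary for a fixed-length architecture encoding.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]

def mutate(arch, rng):
    """Mutation: resample the operation at one randomly chosen position."""
    child = list(arch)
    child[rng.randrange(len(child))] = rng.choice(OPS)
    return child

def fitness(arch):
    """Dummy proxy for trained-model accuracy (a real run would train here)."""
    return sum(op != "identity" for op in arch)

def evolve(pop_size=8, arch_len=5, generations=10, seed=1):
    rng = random.Random(seed)
    population = [[rng.choice(OPS) for _ in range(arch_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: pick the fittest of a random subset as parent.
        parent = max(rng.sample(population, 3), key=fitness)
        child = mutate(parent, rng)
        # Steady-state update: the child replaces the worst individual.
        worst = min(range(pop_size), key=lambda i: fitness(population[i]))
        population[worst] = child
    return max(population, key=fitness)
```

Cross-topology search additionally allows mutations that change the network's structure (adding or removing layers, switching layer types), not just the operation at each position.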

On Latency Predictors for Neural Architecture Search

abdelfattah-lab/nasflat_latency 4 Mar 2024

We then design a general latency predictor to comprehensively study (1) the predictor architecture, (2) NN sample selection methods, (3) hardware device representations, and (4) NN operation encoding schemes.

Encodings for Prediction-based Neural Architecture Search

abdelfattah-lab/flan_nas 4 Mar 2024

Building on our study, we present our predictor FLAN: Flow Attention for NAS.
