Neural Architecture Search

779 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in the field of machine learning. NAS takes the process of a human manually tweaking a neural network and learning what works well, and automates it to discover more complex, better-performing architectures.

Image Credit: NAS with Reinforcement Learning
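
To make the idea concrete, here is a minimal sketch of the outer loop most NAS methods share: sample a candidate architecture from a search space, evaluate it, keep the best. The search space and the `evaluate` stub below are hypothetical placeholders; real methods replace random sampling with reinforcement learning, evolution, or gradient-based search.

```python
import random

# Illustrative search space: each key is one architectural decision.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "hidden_units": [64, 128, 256],
    "activation": ["relu", "tanh"],
}

def sample_architecture():
    # Draw one candidate configuration from the search space.
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    # Placeholder for the expensive step: train the candidate and return
    # its validation accuracy. A random stand-in keeps the sketch runnable.
    return random.random()

best_arch, best_score = None, float("-inf")
for _ in range(20):                  # search budget: 20 candidate evaluations
    arch = sample_architecture()
    score = evaluate(arch)
    if score > best_score:
        best_arch, best_score = arch, score

print(best_arch, best_score)
```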

Libraries

Use these libraries to find Neural Architecture Search models and implementations
See all 24 libraries.

Latest papers with no code

D'OH: Decoder-Only random Hypernetworks for Implicit Neural Representations

no code yet • 28 Mar 2024

We instead present a strategy for the initialization of run-time deep implicit functions for single-instance signals through a Decoder-Only randomly projected Hypernetwork (D'OH).
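As a rough sketch of the general idea (a decoder that maps a small trainable latent vector through fixed random projections to the weights of an implicit-function MLP), assuming illustrative dimensions rather than the paper's exact construction:

```python
import torch
import torch.nn as nn

class RandomHypernetMLP(nn.Module):
    """Implicit function whose layer weights are decoded from a small
    trainable latent via fixed (untrained) random projections."""
    def __init__(self, latent_dim=128, hidden=64, in_dim=2, out_dim=3):
        super().__init__()
        self.z = nn.Parameter(torch.randn(latent_dim))   # only trained tensor
        self.shapes = [(hidden, in_dim), (hidden, hidden), (out_dim, hidden)]
        for i, (rows, cols) in enumerate(self.shapes):
            # Fixed random projection: latent -> flattened layer weights.
            self.register_buffer(
                f"proj_{i}",
                torch.randn(rows * cols, latent_dim) / latent_dim ** 0.5)

    def forward(self, coords):
        h = coords
        for i, shape in enumerate(self.shapes):
            w = (getattr(self, f"proj_{i}") @ self.z).view(shape)
            h = h @ w.T
            if i < len(self.shapes) - 1:
                h = torch.sin(h)   # sine activation, common in implicit nets
        return h

model = RandomHypernetMLP()
rgb = model(torch.rand(100, 2))    # 100 (x, y) coords -> predicted RGB
```

Only the latent code is optimised per signal, so the stored representation is a single small vector plus the shared random-projection seed.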

emoDARTS: Joint Optimisation of CNN & Sequential Neural Network Architectures for Superior Speech Emotion Recognition

no code yet • 21 Mar 2024

This study presents emoDARTS, a DARTS-optimised joint CNN and Sequential Neural Network (SeqNN: LSTM, RNN) architecture that enhances SER performance.
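The DARTS mechanism the paper builds on relaxes the discrete choice among candidate operations into a softmax-weighted mixture whose weights are learned by gradient descent alongside the network weights. A minimal sketch, with illustrative candidate ops:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One DARTS edge: a softmax-weighted mixture of candidate operations."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.Conv1d(channels, channels, kernel_size=5, padding=2),
            nn.Identity(),                        # skip-connection candidate
        ])
        # Architecture parameters, learned jointly with the network weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

edge = MixedOp(channels=16)
out = edge(torch.randn(4, 16, 50))   # (batch, channels, time)
```

After search, each mixed edge is typically discretised by keeping the operation with the largest architecture weight.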

Robust NAS under adversarial training: benchmark, theory, and beyond

no code yet • 19 Mar 2024

Recent developments in neural architecture search (NAS) emphasize the significance of considering robust architectures against malicious data.
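For context, a minimal sketch of the adversarial-training step such robust-NAS benchmarks build on: perturb each batch with a gradient-sign (FGSM-style) attack, then update the candidate architecture on the perturbed inputs. The epsilon and loss here are illustrative.

```python
import torch
import torch.nn.functional as F

def adversarial_training_step(model, x, y, optimizer, eps=8 / 255):
    # Craft an FGSM perturbation of the batch.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    grad, = torch.autograd.grad(loss, x_adv)
    x_adv = (x_adv + eps * grad.sign()).clamp(0, 1).detach()
    # Train the candidate architecture on the adversarial examples.
    optimizer.zero_grad()
    F.cross_entropy(model(x_adv), y).backward()
    optimizer.step()
```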

TrajectoryNAS: A Neural Architecture Search for Trajectory Prediction

no code yet • 18 Mar 2024

Through empirical studies, TrajectoryNAS demonstrates its effectiveness in enhancing the performance of autonomous driving systems, marking a significant advancement in the field. Experimental results reveal that TrajectoryNAS yields a minimum of 4.8% higher accuracy and 1.1× lower latency than competing methods on the NuScenes dataset.

Anytime Neural Architecture Search on Tabular Data

no code yet • 15 Mar 2024

This transition demands an efficient and responsive anytime NAS approach that is capable of returning current optimal architectures within any given time budget while progressively enhancing architecture quality with increased budget allocation.
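A minimal sketch of the anytime property, assuming hypothetical `sample_arch` and `evaluate` placeholders: the search always holds a best-so-far architecture it can return the moment the budget expires, and the answer only improves with more budget.

```python
import time
import random

def anytime_nas(sample_arch, evaluate, budget_seconds):
    best_arch, best_score = None, float("-inf")
    deadline = time.monotonic() + budget_seconds
    while time.monotonic() < deadline:
        arch = sample_arch()
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score   # valid answer at any stopping point

# Usage with toy placeholders:
arch, score = anytime_nas(lambda: random.randint(0, 100),
                          lambda a: -abs(a - 42),
                          budget_seconds=0.1)
```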

Chain-structured neural architecture search for financial time series forecasting

no code yet • 15 Mar 2024

We compare three popular neural architecture search strategies on chain-structured search spaces: Bayesian optimization, the hyperband method, and reinforcement learning in the context of financial time series forecasting.
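In a chain-structured search space an architecture is a branch-free sequence of layers, so a candidate reduces to a flat list of per-position decisions; the strategies compared above differ only in how they propose the next list. A toy encoding (the layer choices below are illustrative):

```python
import random

LAYER_CHOICES = ["conv_3", "conv_5", "lstm", "dense", "skip"]
MAX_DEPTH = 6

def sample_chain():
    # Encode a chain architecture as a flat list of layer choices.
    depth = random.randint(2, MAX_DEPTH)
    return [random.choice(LAYER_CHOICES) for _ in range(depth)]

print(sample_chain())   # e.g. ['conv_3', 'lstm', 'dense']
```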

Multiple Population Alternate Evolution Neural Architecture Search

no code yet • 11 Mar 2024

Specifically, the global search space requires a significant amount of computational resources and time; the scalable search space sacrifices the diversity of network structures; and the hierarchical search space increases the search cost in exchange for network diversity.

Unsupervised Graph Neural Architecture Search with Disentangled Self-supervision

no code yet • NeurIPS 2023

To address the challenge, we propose a novel Disentangled Self-supervised Graph Neural Architecture Search (DSGAS) model, which is able to discover the optimal architectures capturing various latent graph factors in a self-supervised fashion based on unlabeled graph data.

SWAP-NAS: Sample-Wise Activation Patterns for Ultra-fast NAS

no code yet • 7 Mar 2024

The SWAP-Score is strongly correlated with ground-truth performance across various search spaces and tasks, outperforming 15 existing training-free metrics on NAS-Bench-101/201/301 and TransNAS-Bench-101.
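As a rough sketch of the activation-pattern idea behind this family of training-free metrics: feed one mini-batch through an untrained network, binarise each ReLU layer's outputs per sample, and score the network by how many distinct patterns the samples produce. The paper's exact SWAP-Score formulation differs in detail.

```python
import torch
import torch.nn as nn

def activation_pattern_score(model, x):
    patterns, hooks = [], []

    def hook(_, __, out):
        patterns.append((out > 0).flatten(1))   # binary pattern per sample

    for m in model.modules():
        if isinstance(m, nn.ReLU):
            hooks.append(m.register_forward_hook(hook))
    with torch.no_grad():
        model(x)
    for h in hooks:
        h.remove()
    codes = torch.cat(patterns, dim=1)                   # one code per sample
    return len({tuple(row.tolist()) for row in codes})   # distinct patterns

net = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 32), nn.ReLU())
print(activation_pattern_score(net, torch.randn(16, 8)))
```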

Qubit-Wise Architecture Search Method for Variational Quantum Circuits

no code yet • 7 Mar 2024

Considering the noise level limit, one crucial aspect of quantum machine learning is designing a high-performing variational quantum circuit architecture with a small number of quantum gates.
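
As a toy illustration of choosing gates qubit by qubit rather than searching over whole circuits: pick the gate for each qubit independently by a small simulated fidelity score. The gates, per-qubit targets, and greedy rule below are illustrative, not the paper's method.

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # Pauli-X
GATES = {"I": I, "H": H, "X": X}

# Per-qubit target states (a product state): qubit 0 -> |1>, qubit 1 -> |+>.
targets = [np.array([0, 1], dtype=complex),
           np.array([1, 1], dtype=complex) / np.sqrt(2)]
zero = np.array([1, 0], dtype=complex)         # each qubit starts in |0>

# Qubit-wise greedy search: for each qubit, keep the single gate whose output
# state has the highest fidelity with that qubit's target.
circuit = []
for tgt in targets:
    best = max(GATES, key=lambda g: abs(np.vdot(tgt, GATES[g] @ zero)) ** 2)
    circuit.append(best)

print(circuit)   # ['X', 'H'] -- one gate per qubit, chosen qubit-wise
```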