Neural Architecture Search

776 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in machine learning. NAS replaces the manual process in which a human tweaks a network architecture and learns what works well, automating that trial-and-error to discover architectures that can be more complex and better-performing than hand-designed ones.
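At its simplest, this automated loop can be sketched as random search over a hand-defined space of architecture choices. Everything below is an illustrative assumption: the search space, the choices in it, and the placeholder `evaluate` function, which in a real NAS system would train each candidate network and return its validation accuracy.

```python
import random

# Hypothetical search space: each architecture is a choice of depth,
# width, and activation. A real NAS space is far larger.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "hidden_units": [64, 128, 256],
    "activation": ["relu", "tanh", "gelu"],
}

def sample_architecture(rng):
    """Draw one candidate architecture uniformly from the space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    """Placeholder fitness: a real NAS loop would train the network
    and return its validation accuracy here."""
    return arch["num_layers"] * arch["hidden_units"] / 2048

def random_search(n_trials=20, seed=0):
    """The simplest NAS strategy: sample, evaluate, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

Random search is the usual baseline; the methods listed on this page (reinforcement learning, evolutionary search, Bayesian optimization) replace the uniform sampling step with a learned or guided proposal strategy.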

Image Credit: NAS with Reinforcement Learning

Libraries

Use these libraries to find Neural Architecture Search models and implementations
See all 24 libraries.

Latest papers with no code

Anytime Neural Architecture Search on Tabular Data

no code yet • 15 Mar 2024

This transition demands an efficient and responsive anytime NAS approach that is capable of returning current optimal architectures within any given time budget while progressively enhancing architecture quality with increased budget allocation.
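The anytime property described above can be sketched as a generic search loop that always holds a best-so-far candidate, so a valid answer is available whenever the caller's budget expires and quality improves monotonically with more budget. This is a minimal sketch of the general idea, not the paper's method; the `sample` and `evaluate` callables are assumed placeholders.

```python
import random
import time

def anytime_search(evaluate, sample, budget_seconds, seed=0):
    """Keep a best-so-far candidate so the caller can stop at any
    point within the time budget and still get the current optimum."""
    rng = random.Random(seed)
    deadline = time.monotonic() + budget_seconds
    best, best_score = None, float("-inf")
    while time.monotonic() < deadline:
        cand = sample(rng)
        score = evaluate(cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score
```

In a real NAS setting, `evaluate` would be the expensive step (training or a proxy score), which is why anytime methods also try to order cheap evaluations before expensive ones.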

Chain-structured neural architecture search for financial time series forecasting

no code yet • 15 Mar 2024

We compare three popular neural architecture search strategies on chain-structured search spaces in the context of financial time series forecasting: Bayesian optimization, the Hyperband method, and reinforcement learning.

Multiple Population Alternate Evolution Neural Architecture Search

no code yet • 11 Mar 2024

Specifically, the global search space requires a significant amount of computational resources and time; the scalable search space sacrifices the diversity of network structures; and the hierarchical search space increases the search cost in exchange for network diversity.

Unsupervised Graph Neural Architecture Search with Disentangled Self-supervision

no code yet • NeurIPS 2023

To address the challenge, we propose a novel Disentangled Self-supervised Graph Neural Architecture Search (DSGAS) model, which is able to discover the optimal architectures capturing various latent graph factors in a self-supervised fashion based on unlabeled graph data.

SWAP-NAS: Sample-Wise Activation Patterns for Ultra-fast NAS

no code yet • 7 Mar 2024

The SWAP-Score is strongly correlated with ground-truth performance across various search spaces and tasks, outperforming 15 existing training-free metrics on NAS-Bench-101/201/301 and TransNAS-Bench-101.

Qubit-Wise Architecture Search Method for Variational Quantum Circuits

no code yet • 7 Mar 2024

Considering the noise-level limit, one crucial aspect of quantum machine learning is to design a high-performing variational quantum circuit architecture with a small number of quantum gates.

Neural Architecture Search using Particle Swarm and Ant Colony Optimization

no code yet • 6 Mar 2024

A process known as Neural Architecture Search (NAS) may be applied to automatically evaluate a large number of such architectures.

G-EvoNAS: Evolutionary Neural Architecture Search Based on Network Growth

no code yet • 5 Mar 2024

The process begins from a shallow network, grows and evolves, and gradually deepens into a complete network, reducing the search complexity in the global space.
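The growth idea can be illustrated with a toy greedy sketch (not the paper's evolutionary algorithm): start from an empty network and add one block at a time, keeping the best-scoring choice at each depth, so only a small space is searched per step. The block names and the scoring function are invented for illustration; a real system would train and validate each candidate.

```python
# Toy greedy growth sketch: the network deepens one block per step,
# so the per-step search space stays small.
BLOCK_CHOICES = ["conv3x3", "conv5x5", "skip"]

def score(network):
    """Placeholder fitness; trivially favors longer block names.
    A real system would train the candidate and score validation
    performance."""
    return sum(len(b) for b in network)

def grow_network(target_depth=4):
    network = []
    for _ in range(target_depth):
        best_block = max(BLOCK_CHOICES,
                         key=lambda b: score(network + [b]))
        network.append(best_block)
    return network
```

The evolutionary variant in the paper replaces the greedy `max` with a population of candidate networks that is evolved at each growth stage.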

Revisiting Learning-based Video Motion Magnification for Real-time Processing

no code yet • 4 Mar 2024

Video motion magnification is a technique to capture and amplify subtle motion in a video that is invisible to the naked eye.

LeMo-NADe: Multi-Parameter Neural Architecture Discovery with LLMs

no code yet • 28 Feb 2024

Building efficient neural network architectures can be a time-consuming task requiring extensive expert knowledge.