Neural Architecture Search
776 papers with code • 26 benchmarks • 27 datasets
Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in machine learning. NAS takes the process of a human manually tweaking a neural network and learning what works well, and automates it to discover new, often more complex, architectures.
Image Credit: NAS with Reinforcement Learning
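At its simplest, NAS is a search over a discrete space of architecture choices guided by an evaluation signal. The sketch below shows the idea as plain random search; the search space, the `evaluate` stand-in, and all names are hypothetical, and a real system would train each candidate and return its validation accuracy:

```python
import random

# Hypothetical discrete search space: each architecture is a choice of
# depth, width, and activation for a simple feed-forward network.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "hidden_units": [32, 64, 128],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Randomly sample one architecture from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for training + validation; a toy proxy that prefers
    deeper, wider networks (for illustration only)."""
    return arch["num_layers"] * 0.1 + arch["hidden_units"] * 0.001

def random_search(n_trials=20, seed=0):
    """Sample n_trials architectures and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
```

Random search is the usual baseline; the methods listed below replace the sampling step with reinforcement learning, evolution, Bayesian optimization, or gradient-based relaxations.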
Libraries
Use these libraries to find Neural Architecture Search models and implementations.
Latest papers with no code
Anytime Neural Architecture Search on Tabular Data
This transition demands an efficient and responsive anytime NAS approach that is capable of returning current optimal architectures within any given time budget while progressively enhancing architecture quality with increased budget allocation.
Chain-structured neural architecture search for financial time series forecasting
We compare three popular neural architecture search strategies (Bayesian optimization, the hyperband method, and reinforcement learning) on chain-structured search spaces in the context of financial time series forecasting.
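Of the three strategies mentioned, hyperband builds on successive halving: evaluate many configurations cheaply, then repeatedly promote the best fraction to a larger budget. A minimal sketch, with a toy deterministic `evaluate` standing in for "validation accuracy after `budget` epochs" (not from the paper above):

```python
def successive_halving(configs, evaluate, min_budget=1, eta=2):
    """One bracket of successive halving: score all survivors at the
    current budget, keep the top 1/eta, grow the budget, repeat."""
    budget = min_budget
    survivors = list(configs)
    while len(survivors) > 1:
        scored = sorted(survivors, key=lambda c: evaluate(c, budget), reverse=True)
        survivors = scored[: max(1, len(scored) // eta)]
        budget *= eta  # survivors earn a larger training budget next round
    return survivors[0]

# Toy usage: configs are candidate learning rates; the score peaks at 0.1
# and, for simplicity, ignores the budget.
def toy_evaluate(lr, budget):
    return -abs(lr - 0.1)

best = successive_halving([0.001, 0.01, 0.1, 1.0], toy_evaluate)  # -> 0.1
```

Full hyperband runs several such brackets with different trade-offs between the number of configurations and the starting budget.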
Multiple Population Alternate Evolution Neural Architecture Search
Specifically, the global search space requires a significant amount of computational resources and time; the scalable search space sacrifices the diversity of network structures; and the hierarchical search space increases the search cost in exchange for network diversity.
Unsupervised Graph Neural Architecture Search with Disentangled Self-supervision
To address the challenge, we propose a novel Disentangled Self-supervised Graph Neural Architecture Search (DSGAS) model, which is able to discover the optimal architectures capturing various latent graph factors in a self-supervised fashion based on unlabeled graph data.
SWAP-NAS: Sample-Wise Activation Patterns for Ultra-fast NAS
The SWAP-Score is strongly correlated with ground-truth performance across various search spaces and tasks, outperforming 15 existing training-free metrics on NAS-Bench-101/201/301 and TransNAS-Bench-101.
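Training-free metrics like the one above are typically judged by the rank correlation between their scores and the ground-truth accuracies of benchmark architectures. A stdlib-only Spearman correlation sketch on made-up data (this is the evaluation protocol in general, not the SWAP-Score itself):

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation between two equal-length score lists
    (no tie handling, for illustration only)."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Toy usage: proxy scores vs. ground-truth accuracies for 5 architectures.
proxy = [3.1, 2.7, 4.0, 1.2, 3.5]
truth = [0.91, 0.88, 0.95, 0.70, 0.87]
rho = spearman_rho(proxy, truth)  # rho = 0.7 for this toy data
```

A rho near 1 means ranking architectures by the cheap proxy recovers the expensive ground-truth ranking, which is what makes training-free NAS usable.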
Qubit-Wise Architecture Search Method for Variational Quantum Circuits
Given the limits imposed by noise, one crucial aspect of quantum machine learning is designing a high-performing variational quantum circuit architecture with a small number of quantum gates.
Neural Architecture Search using Particle Swarm and Ant Colony Optimization
A process known as Neural Architecture Search (NAS) may be applied to automatically evaluate a large number of such architectures.
G-EvoNAS: Evolutionary Neural Architecture Search Based on Network Growth
The process begins from a shallow network, grows and evolves, and gradually deepens into a complete network, reducing the search complexity in the global space.
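The grow-and-deepen strategy above can be caricatured as greedy chain growth: at each depth, try every candidate block and keep the one a cheap evaluator prefers. The block names and the random `proxy_score` below are placeholders, not the paper's method:

```python
import random

# Illustrative candidate blocks for a chain-structured network.
CANDIDATE_BLOCKS = ["conv3x3", "conv5x5", "maxpool", "identity"]

def proxy_score(chain, rng):
    """Stand-in for cheap evaluation of a partial network; a real system
    would briefly train the chain or apply a training-free metric.
    Here it just returns a random score."""
    return rng.random()

def grow_network(max_depth=5, seed=0):
    """Greedily deepen the chain one block at a time."""
    rng = random.Random(seed)
    chain = []
    for _ in range(max_depth):
        # Try each candidate at the current depth, keep the best-scoring one.
        best_block = max(CANDIDATE_BLOCKS,
                         key=lambda b: proxy_score(chain + [b], rng))
        chain.append(best_block)
    return chain

chain = grow_network()
```

Because only one depth level is searched at a time, each step evaluates just `len(CANDIDATE_BLOCKS)` candidates instead of the exponentially many full-depth chains.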
Revisiting Learning-based Video Motion Magnification for Real-time Processing
Video motion magnification is a technique to capture and amplify subtle motion in a video that is invisible to the naked eye.
LeMo-NADe: Multi-Parameter Neural Architecture Discovery with LLMs
Building efficient neural network architectures can be a time-consuming task requiring extensive expert knowledge.