
Neural Architecture Search

270 papers with code · Methodology
Subtask of AutoML


Latest papers without code

NASLib: A Modular and Flexible Neural Architecture Search Library

ICLR 2021

Neural Architecture Search (NAS) is one of the focal points for the Deep Learning community, but reproducing NAS methods is extremely challenging due to numerous low-level implementation details.

NEURAL ARCHITECTURE SEARCH

NAS-Bench-ASR: Reproducible Neural Architecture Search for Speech Recognition

ICLR 2021

These datasets, however, focus predominantly on computer vision and NLP tasks and thus offer only limited coverage of application domains.

NEURAL ARCHITECTURE SEARCH SPEECH RECOGNITION

TRACE: Tensorizing and Generalizing Supernets from Neural Architecture Search

ICLR 2021

Recently, a special kind of graph, i.e., a supernet, in which two nodes can be connected by multi-choice edges, has exhibited its power in neural architecture search (NAS) by finding better architectures for computer vision (CV) and natural language processing (NLP) tasks.

KNOWLEDGE GRAPHS NEURAL ARCHITECTURE SEARCH
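
The supernet idea can be made concrete with a small sketch: below, two nodes are connected by a multi-choice edge whose candidate operations are mixed by learnable architecture weights. This is a generic illustration, not the TRACE construction; the MultiChoiceEdge class and the particular candidate operations are assumptions made for the example.

    # A minimal sketch of a supernet's multi-choice edge: two nodes are
    # connected by several candidate operations, and the edge output mixes
    # them with learnable architecture weights.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MultiChoiceEdge(nn.Module):
        def __init__(self, channels):
            super().__init__()
            # Candidate operations on this edge (choices are illustrative only).
            self.ops = nn.ModuleList([
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.Conv2d(channels, channels, 5, padding=2),
                nn.Identity(),
            ])
            # One architecture parameter per candidate operation.
            self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

        def forward(self, x):
            # Softmax over architecture parameters gives the mixing weights.
            weights = F.softmax(self.alpha, dim=0)
            return sum(w * op(x) for w, op in zip(weights, self.ops))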

Don't be picky, all students in the right family can learn from good teachers

ICLR 2021

State-of-the-art results in deep learning have been improving steadily, in good part due to the use of larger models.

NEURAL ARCHITECTURE SEARCH

Searching for Convolutions and a More Ambitious NAS

ICLR 2021

An important goal of neural architecture search (NAS) is to automate away the design of neural networks on new tasks in under-explored domains, thus helping to democratize machine learning.

NEURAL ARCHITECTURE SEARCH

Uniform-Precision Neural Network Quantization via Neural Channel Expansion

ICLR 2021

Uniform-precision neural network quantization has gained popularity thanks to its simple arithmetic units, which can be densely packed for high computing capability.

NEURAL ARCHITECTURE SEARCH QUANTIZATION
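
For context, uniform-precision quantization assigns the same bit-width to every layer. A minimal fake-quantization sketch is shown below; it is illustrative only, and the uniform_quantize helper and per-tensor scaling scheme are assumptions, not the paper's neural channel expansion method.

    # A minimal sketch of uniform-precision (single bit-width) quantization:
    # every layer's weights are mapped onto the same signed integer grid.
    import torch

    def uniform_quantize(w: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
        qmax = 2 ** (num_bits - 1) - 1        # e.g. 127 for 8-bit signed
        scale = w.abs().max() / qmax          # per-tensor scale factor
        q = torch.clamp(torch.round(w / scale), -qmax, qmax)
        return q * scale                      # dequantized ("fake-quantized") weights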

Efficient Graph Neural Architecture Search

ICLR 2021

To obtain state-of-the-art (SOTA) data-specific GNN architectures, researchers turn to neural architecture search (NAS) methods.

NEURAL ARCHITECTURE SEARCH TRANSFER LEARNING

Sandwich Batch Normalization

ICLR 2021

Variants of batch normalization include further decomposing the normalization layer into multiple parallel ones, and extending similar ideas to instance normalization.

ADVERSARIAL DEFENSE CONDITIONAL IMAGE GENERATION NEURAL ARCHITECTURE SEARCH STYLE TRANSFER
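
A minimal, generic sketch of the "multiple parallel normalization layers" idea is given below: each branch specializes to a subset of inputs (e.g., different conditions), and the caller routes a batch to one branch. This illustrates the general idea only, not the Sandwich Batch Normalization formulation; the ParallelBatchNorm2d class is an assumption for the example.

    # A generic sketch of splitting one normalization layer into several
    # parallel branches, each with its own statistics and affine parameters.
    import torch
    import torch.nn as nn

    class ParallelBatchNorm2d(nn.Module):
        def __init__(self, num_features, num_branches):
            super().__init__()
            self.branches = nn.ModuleList(
                nn.BatchNorm2d(num_features) for _ in range(num_branches)
            )

        def forward(self, x, branch_idx: int):
            # Route the whole batch through the branch chosen by the caller.
            return self.branches[branch_idx](x)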

SACoD: Sensor Algorithm Co-Design Towards Efficient CNN-powered Intelligent PhlatCam

ICLR 2021

PhlatCam, with its form factor potentially reduced by orders of magnitude, has emerged as a promising solution to the first aforementioned challenge, while the second one remains a bottleneck.

MODEL COMPRESSION NEURAL ARCHITECTURE SEARCH

Rethinking Architecture Selection in Differentiable NAS

ICLR 2021

Differentiable Neural Architecture Search is one of the most popular Neural Architecture Search (NAS) methods, owing to its search efficiency and simplicity, accomplished by jointly optimizing the model weights and architecture parameters in a weight-sharing supernet via gradient-based algorithms.

NEURAL ARCHITECTURE SEARCH
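
The joint optimization described above is commonly implemented as alternating updates: network weights are trained on the training split while architecture parameters are trained on the validation split. The sketch below shows one such first-order step; the supernet, criterion, optimizers, and batches are placeholders, and this is a generic DARTS-style illustration rather than the paper's architecture-selection procedure.

    # A minimal sketch of alternating optimization in differentiable,
    # weight-sharing NAS. Assumes w_optimizer holds only the model weights
    # and a_optimizer holds only the architecture parameters.
    import torch

    def search_step(supernet, criterion, w_optimizer, a_optimizer,
                    train_batch, val_batch):
        # 1) Update model weights w on the training loss.
        x_train, y_train = train_batch
        w_optimizer.zero_grad()
        loss_w = criterion(supernet(x_train), y_train)
        loss_w.backward()
        w_optimizer.step()

        # 2) Update architecture parameters alpha on the validation loss
        #    (first-order approximation: weights are held fixed).
        x_val, y_val = val_batch
        a_optimizer.zero_grad()
        loss_a = criterion(supernet(x_val), y_val)
        loss_a.backward()
        a_optimizer.step()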