Neural Architecture Search
776 papers with code • 26 benchmarks • 27 datasets
Neural architecture search (NAS) automates the design of artificial neural networks (ANNs), a widely used class of machine learning models. Instead of a human engineer manually tweaking a network and learning what works well, NAS searches over candidate architectures automatically, making it possible to discover more complex designs than hand-tuning alone.
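At its simplest, the NAS loop samples candidate architectures from a search space, scores each one, and keeps the best. The sketch below uses random search over a toy search space; the `evaluate` function is a hypothetical stand-in, since a real NAS system would train each candidate and measure its validation accuracy:

```python
import random

# Toy search space: each architecture is a choice of depth, width, and kernel size.
SEARCH_SPACE = {
    "depth": [2, 4, 6],
    "width": [16, 32, 64],
    "kernel": [3, 5, 7],
}

def sample_architecture(rng):
    """Sample one architecture uniformly from the search space."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Hypothetical stand-in for training + validation; a real NAS loop
    would train the candidate network and return validation accuracy."""
    return 0.5 + 0.01 * arch["depth"] + 0.001 * arch["width"] - 0.005 * arch["kernel"]

def random_search(n_trials=20, seed=0):
    """Sample n_trials architectures and return the best one found."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 3))
```

More sophisticated NAS methods replace the uniform sampler with a learned controller (e.g. reinforcement learning or evolutionary search), but the sample-evaluate-update skeleton is the same.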
(Image credit: NAS with Reinforcement Learning)
Libraries
Use these libraries to find Neural Architecture Search models and implementations.
Latest papers
FR-NAS: Forward-and-Reverse Graph Predictor for Efficient Neural Architecture Search
Additionally, we incorporate a customized training loss within the GNN predictor to ensure efficient utilization of both types of representations.
Unsupervised Domain Adaptation Architecture Search with Self-Training for Land Cover Mapping
Thus, we proposed a simple yet effective framework to search for lightweight neural networks automatically for land cover mapping tasks under domain shifts.
MobileNetV4 - Universal Models for the Mobile Ecosystem
We present the latest generation of MobileNets, known as MobileNetV4 (MNv4), featuring universally efficient architecture designs for mobile devices.
Shears: Unstructured Sparsity with Neural Low-rank Adapter Search
Recently, several approaches successfully demonstrated that weight-sharing Neural Architecture Search (NAS) can effectively explore a search space of elastic low-rank adapters (LoRA), enabling parameter-efficient fine-tuning (PEFT) and compression of large language models.
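One common way to make low-rank adapters "elastic" for weight-sharing search is to store the adapter factors at the maximum rank and realize any smaller rank by slicing. The sketch below illustrates that idea only; it is not the Shears implementation, and the class and parameter names are hypothetical:

```python
import numpy as np

class ElasticLoRA:
    """Weight-shared low-rank adapter: A and B are stored at max_rank,
    and any sub-rank is realized by slicing the shared factors."""
    def __init__(self, d_in, d_out, max_rank, seed=0):
        rng = np.random.default_rng(seed)
        self.A = rng.normal(size=(max_rank, d_in)) * 0.01
        # B starts at zero, so the adapter is initially a no-op
        # (a standard LoRA initialization choice).
        self.B = np.zeros((d_out, max_rank))
        self.max_rank = max_rank

    def delta(self, rank):
        """Low-rank weight update at the chosen sub-rank."""
        assert 1 <= rank <= self.max_rank
        return self.B[:, :rank] @ self.A[:rank, :]

# Searching over ranks = evaluating the same shared weights at different slices.
adapter = ElasticLoRA(d_in=8, d_out=8, max_rank=4)
for r in (1, 2, 4):
    print(r, adapter.delta(r).shape)
```

Because every sub-rank reuses the same shared parameters, the search over rank configurations does not require training a separate adapter per candidate.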
Accel-NASBench: Sustainable Benchmarking for Accelerator-Aware NAS
We present a technique that allows searching for training proxies that reduce the cost of benchmark construction by significant margins, making it possible to construct realistic NAS benchmarks for large-scale datasets.
On Spectrogram Analysis in a Multiple Classifier Fusion Framework for Power Grid Classification Using Electric Network Frequency
The Electric Network Frequency (ENF) serves as a unique signature inherent to power distribution systems.
Neural Architecture Search for Sentence Classification with BERT
Pre-training of language models on large text corpora is common practice in Natural Language Processing.
PNAS-MOT: Multi-Modal Object Tracking with Pareto Neural Architecture Search
Multiple object tracking is a critical task in autonomous driving.
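Pareto NAS selects architectures that trade off multiple objectives, such as accuracy versus latency, by keeping only non-dominated candidates. A minimal sketch of extracting such a Pareto front, with made-up (accuracy, latency) pairs for illustration:

```python
def pareto_front(candidates):
    """Return the Pareto-optimal subset of (accuracy, latency_ms) pairs,
    where higher accuracy and lower latency are both preferred."""
    front = []
    for i, (acc_i, lat_i) in enumerate(candidates):
        # A candidate is dominated if another is at least as good on both
        # objectives and strictly better on at least one.
        dominated = any(
            acc_j >= acc_i and lat_j <= lat_i and (acc_j > acc_i or lat_j < lat_i)
            for j, (acc_j, lat_j) in enumerate(candidates) if j != i
        )
        if not dominated:
            front.append((acc_i, lat_i))
    return front

# Hypothetical candidate architectures: (accuracy, latency in ms).
archs = [(0.91, 30.0), (0.89, 12.0), (0.90, 25.0), (0.88, 40.0)]
print(pareto_front(archs))  # (0.88, 40.0) is dominated by (0.91, 30.0)
```

A multi-objective NAS method would then choose an operating point from this front depending on the deployment budget.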
Building Optimal Neural Architectures using Interpretable Knowledge
Neural Architecture Search is a costly practice.
Boosting Order-Preserving and Transferability for Neural Architecture Search: a Joint Architecture Refined Search and Fine-tuning Approach
In this work, we analyze the order-preserving ability over the whole search space (global) and over a sub-space of top architectures (local), and empirically show that the local order-preserving ability of current two-stage NAS methods still needs to be improved.
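Order preservation is commonly measured with a rank correlation between proxy scores (e.g. shared-weight supernet estimates) and accuracies after standalone training. A self-contained sketch of Kendall's tau on hypothetical scores (a production setup would typically use `scipy.stats.kendalltau`):

```python
def kendall_tau(x, y):
    """Kendall rank correlation: +1 means the two score lists rank the
    architectures identically, i.e. perfect order preservation."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical scores for four architectures.
supernet = [0.62, 0.70, 0.55, 0.68]    # one-shot (shared-weight) estimates
standalone = [0.80, 0.91, 0.74, 0.88]  # accuracies after full training
print(kendall_tau(supernet, standalone))  # 1.0: rankings agree exactly
```

A low tau on the top sub-space means the supernet may pick the wrong winner even when its global ranking looks good, which is the failure mode the paper targets.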