Neural Architecture Search

776 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used model class in machine learning. NAS automates the manual trial-and-error process in which a human designer tweaks a network and learns what works well, and in doing so can discover more complex architectures than manual design typically reaches.
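The simplest instance of this idea is random search over a discrete space of architecture choices: sample a candidate, evaluate it, and keep the best. The sketch below is illustrative only; the search space and the `evaluate` function are placeholder assumptions standing in for real training and validation.

```python
import random

# Hypothetical search space: each architecture is a (depth, width, activation) choice.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 32, 64],
    "activation": ["relu", "gelu"],
}

def sample_architecture(rng):
    """Sample one candidate architecture from the search space."""
    return {key: rng.choice(values) for key, values in SEARCH_SPACE.items()}

def evaluate(arch):
    """Placeholder fitness: stands in for training the candidate network
    and measuring validation accuracy. This toy score simply rewards
    deeper/wider networks so the loop has something to optimize."""
    return arch["depth"] * 0.1 + arch["width"] * 0.01

def random_search(trials=20, seed=0):
    """The simplest NAS baseline: sample, evaluate, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
```

Most methods listed below replace the random sampler with a learned search strategy (reinforcement learning, evolution, gradient-based relaxation, or a performance predictor) and replace full training with a cheaper proxy evaluation.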

Image credit: NAS with Reinforcement Learning

Libraries

Use these libraries to find Neural Architecture Search models and implementations
See all 24 libraries.

FR-NAS: Forward-and-Reverse Graph Predictor for Efficient Neural Architecture Search

emi-group/fr-nas 24 Apr 2024

Additionally, we incorporate a customized training loss within the GNN predictor to ensure efficient utilization of both types of representations.


Unsupervised Domain Adaptation Architecture Search with Self-Training for Land Cover Mapping

cliffbb/uda-nas 23 Apr 2024

Thus, we proposed a simple yet effective framework to search for lightweight neural networks automatically for land cover mapping tasks under domain shifts.


MobileNetV4 - Universal Models for the Mobile Ecosystem

tensorflow/models 16 Apr 2024

We present the latest generation of MobileNets, known as MobileNetV4 (MNv4), featuring universally efficient architecture designs for mobile devices.


Shears: Unstructured Sparsity with Neural Low-rank Adapter Search

intellabs/hardware-aware-automated-machine-learning 16 Apr 2024

Recently, several approaches successfully demonstrated that weight-sharing Neural Architecture Search (NAS) can effectively explore a search space of elastic low-rank adapters (LoRA), allowing the parameter-efficient fine-tuning (PEFT) and compression of large language models.
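The weight-sharing idea described here can be sketched in a few lines: train one "super-adapter" at the maximum LoRA rank, and let every lower-rank sub-adapter reuse its leading rows and columns, so searching over ranks is just slicing rather than retraining. This is a minimal sketch of the general elastic-LoRA concept, not the Shears implementation; all names and shapes are assumptions.

```python
class ElasticLoRA:
    """A super-adapter of maximal rank whose lower-rank sub-adapters
    share (slice into) its weights -- the weight-sharing NAS idea."""

    def __init__(self, d_in, d_out, max_rank):
        # LoRA factors: A is (max_rank x d_in), B is (d_out x max_rank).
        # Zeros stand in for trained weights in this sketch.
        self.A = [[0.0] * d_in for _ in range(max_rank)]
        self.B = [[0.0] * max_rank for _ in range(d_out)]
        self.max_rank = max_rank

    def sub_adapter(self, rank):
        """Extract a lower-rank adapter that shares the super-adapter's
        parameters: the first `rank` rows of A and columns of B."""
        assert 1 <= rank <= self.max_rank
        A_sub = self.A[:rank]
        B_sub = [row[:rank] for row in self.B]
        return A_sub, B_sub

def num_params(A, B):
    """Parameter count of one (A, B) adapter pair."""
    return sum(len(row) for row in A) + sum(len(row) for row in B)

# Usage: the "search" over ranks reduces to slicing and scoring sub-adapters.
adapter = ElasticLoRA(d_in=8, d_out=8, max_rank=4)
A2, B2 = adapter.sub_adapter(rank=2)
```

Because every candidate rank shares the same trained weights, evaluating the whole rank search space costs only inference, which is what makes this style of NAS compatible with parameter-efficient fine-tuning.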


Accel-NASBench: Sustainable Benchmarking for Accelerator-Aware NAS

afzalxo/accel-nasbench 9 Apr 2024

We present a technique that allows searching for training proxies that reduce the cost of benchmark construction by significant margins, making it possible to construct realistic NAS benchmarks for large-scale datasets.


On Spectrogram Analysis in a Multiple Classifier Fusion Framework for Power Grid Classification Using Electric Network Frequency

georgejolo/enfusion 27 Mar 2024

The Electric Network Frequency (ENF) serves as a unique signature inherent to power distribution systems.


Neural Architecture Search for Sentence Classification with BERT

themody/nasforsentenceembeddingheads 27 Mar 2024

Pre-training of language models on large text corpora is common practice in Natural Language Processing.


PNAS-MOT: Multi-Modal Object Tracking with Pareto Neural Architecture Search

pholypeng/pnas-mot 23 Mar 2024

Multiple object tracking is a critical task in autonomous driving.


Building Optimal Neural Architectures using Interpretable Knowledge

ascend-research/autobuild 20 Mar 2024

Neural Architecture Search is a costly practice.


Boosting Order-Preserving and Transferability for Neural Architecture Search: a Joint Architecture Refined Search and Fine-tuning Approach

beichenzbc/supernet-shifting 18 Mar 2024

In this work, we analyze the order-preserving ability on the whole search space (global) and on a sub-space of top architectures (local), and empirically show that local order preservation in current two-stage NAS methods still needs to be improved.
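Order preservation in two-stage NAS is commonly quantified with a rank-correlation measure between the supernet's predicted ranking and the architectures' stand-alone performance; Kendall's tau is a standard choice. Below is a minimal stdlib implementation of that measure (a general illustration, not this paper's specific metric).

```python
from itertools import combinations

def kendall_tau(pred_scores, true_scores):
    """Kendall's tau over two score lists: +1 means the predicted ranking
    (e.g. supernet validation accuracy) perfectly preserves the true
    ranking (stand-alone training accuracy); -1 means it is fully reversed."""
    n = len(pred_scores)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        sign = (pred_scores[i] - pred_scores[j]) * (true_scores[i] - true_scores[j])
        if sign > 0:
            concordant += 1   # pair ordered the same way in both rankings
        elif sign < 0:
            discordant += 1   # pair ordered oppositely
    return (concordant - discordant) / (n * (n - 1) / 2)
```

A global tau can look acceptable while the tau restricted to the top-k architectures (the "local" sub-space) is poor, which is exactly the failure mode this paper targets.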
