Neural Architecture Search

789 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS automates what a practitioner would otherwise do by hand: repeatedly tweaking a network's architecture and learning what works well, so that more complex architectures can be discovered automatically.
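
In its simplest form, a NAS method combines a search space of candidate architectures, a search strategy, and an evaluation step. Below is a minimal, hypothetical sketch of that loop using random search; the search space and the `build_model` / `evaluate` callbacks are illustrative placeholders, not any particular library's API.

```python
import random

# Minimal, hypothetical random-search NAS loop. The search space, build_model,
# and evaluate are illustrative placeholders, not a specific library's API.

SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "hidden_units": [64, 128, 256],
    "activation": ["relu", "tanh"],
}

def sample_architecture():
    """Draw one candidate architecture from the search space."""
    return {name: random.choice(choices) for name, choices in SEARCH_SPACE.items()}

def random_search(n_trials, build_model, evaluate):
    """Return the best architecture (and its score) found over n_trials samples."""
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture()
        model = build_model(arch)   # instantiate the candidate network
        score = evaluate(model)     # e.g. validation accuracy
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```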

Image Credit: NAS with Reinforcement Learning

Libraries

Use these libraries to find Neural Architecture Search models and implementations

Hierarchical Invariance for Robust and Interpretable Vision Tasks at Larger Scales

shurenqi/hir 23 Feb 2024

Developing robust and interpretable vision systems is a crucial step towards trustworthy artificial intelligence.

G-NAS: Generalizable Neural Architecture Search for Single Domain Generalization Object Detection

wufan-cse/g-nas 7 Feb 2024

To address this issue, we propose the Generalizable loss (G-loss), an OoD-aware objective that prevents NAS from over-fitting: gradient descent is used to optimize parameters not only on a subset of easy-to-learn features but also on the remaining predictive features needed for generalization. The overall framework is named G-NAS.
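
As a loose, hypothetical illustration only (not the paper's actual G-loss), an OoD-aware objective can keep optimization from settling on the easiest features by combining the average loss with the worst loss over predefined feature groups, so the total cannot be minimized by fitting a single easy subset.

```python
import torch

def worst_group_surrogate(losses_per_group: torch.Tensor) -> torch.Tensor:
    """Illustrative OoD-aware surrogate, not the G-loss from the G-NAS paper.

    `losses_per_group` is a hypothetical 1-D tensor of losses, one per feature
    group (e.g. the loss when the model relies on each group alone). Adding the
    worst group's loss to the mean discourages fitting only easy-to-learn features.
    """
    return losses_per_group.mean() + losses_per_group.max()
```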

Group Distributionally Robust Dataset Distillation with Risk Minimization

mming11/robustdatasetdistillation 7 Feb 2024

However, targeting the training dataset should be regarded as auxiliary, in the same sense that the training set is only an approximate substitute for the population distribution, which is the data of actual interest.

AutoGCN -- Towards Generic Human Activity Recognition with Neural Architecture Search

deepinmotion/autogcn 2 Feb 2024

This paper introduces AutoGCN, a generic Neural Architecture Search (NAS) algorithm for Human Activity Recognition (HAR) using Graph Convolution Networks (GCNs).

NACHOS: Neural Architecture Search for Hardware Constrained Early Exit Neural Networks

ai-tech-research-lab/cnas 24 Jan 2024

To this end, this work presents Neural Architecture Search for Hardware Constrained Early Exit Neural Networks (NACHOS), the first NAS framework for the design of optimal EENNs satisfying constraints on the accuracy and the number of Multiply and Accumulate (MAC) operations performed by the EENNs at inference time.
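
For background, hardware-constrained NAS generally couples the accuracy objective with a resource budget. The sketch below is a generic, hypothetical illustration of that idea using a MAC budget as a hard filter, not the NACHOS framework itself; `candidates`, `estimate_macs`, and `evaluate_accuracy` are assumed callbacks.

```python
# Generic sketch of hardware-constrained architecture search (not the NACHOS
# algorithm itself): candidates whose estimated MAC count exceeds a budget are
# discarded before their accuracy is evaluated. The callbacks are assumptions.

def constrained_search(candidates, estimate_macs, evaluate_accuracy, mac_budget):
    """Pick the most accurate architecture whose MAC count fits the budget."""
    best_arch, best_acc = None, float("-inf")
    for arch in candidates:
        if estimate_macs(arch) > mac_budget:
            continue  # violates the hardware constraint, skip evaluation
        acc = evaluate_accuracy(arch)
        if acc > best_acc:
            best_arch, best_acc = arch, acc
    return best_arch, best_acc
```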

Automated Fusion of Multimodal Electronic Health Records for Better Medical Predictions

sh-src/automf 20 Jan 2024

The widespread adoption of Electronic Health Record (EHR) systems in healthcare institutes has generated vast amounts of medical data, offering significant opportunities for improving healthcare services through deep learning techniques.

Élivágar: Efficient Quantum Circuit Search for Classification

sashwatanagolum/elivagar 17 Jan 2024

Although recent Quantum Circuit Search (QCS) methods attempt to search for performant QML circuits that are also robust to hardware noise, they directly adopt designs from classical Neural Architecture Search (NAS) that are misaligned with the unique constraints of quantum hardware, resulting in high search overheads and severe performance bottlenecks.

SeqNAS: Neural Architecture Search for Event Sequence Classification

On-Point-RND/SeqNAS 6 Jan 2024

We demonstrate that our method surpasses state-of-the-art NAS methods and popular architectures suitable for sequence classification, and that it holds great potential for various industrial applications.

Adaptive Guidance: Training-free Acceleration of Conditional Diffusion Models

haozheliu-st/t-gate 19 Dec 2023

Our findings provide insights into the efficiency of the conditional denoising process that contribute to more practical and swift deployment of text-conditioned diffusion models.
