Neural Architecture Search
789 papers with code • 26 benchmarks • 27 datasets
Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in machine learning. NAS replaces the manual trial-and-error process of a human engineer tweaking a network and observing what works well, automating the search to discover architectures that can be more complex and better-performing.
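The automated tweak-and-evaluate loop can be sketched with the simplest NAS strategy, random search. This is an illustrative toy, not any specific paper's method: the search space and the `proxy_score` function are assumptions standing in for actually training and validating each candidate network.

```python
import random

# Toy NAS via random search: a minimal sketch, not a specific published method.
# SEARCH_SPACE and proxy_score are illustrative assumptions; in practice each
# candidate architecture would be trained and scored on held-out data.

SEARCH_SPACE = {
    "depth": [2, 4, 8],            # number of layers
    "width": [16, 32, 64],         # units per layer
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Sample one candidate architecture from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Stand-in for validation accuracy (assumed, for illustration only)."""
    score = arch["depth"] * 0.05 + arch["width"] * 0.005
    if arch["activation"] == "relu":
        score += 0.1
    return score

def random_search(n_trials=50, seed=0):
    """Automate the manual tweak-and-evaluate loop that NAS replaces."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
```

Real NAS systems differ mainly in how they sample (reinforcement learning, evolution, gradient-based relaxation) and how cheaply they estimate each candidate's quality, but they all instantiate this same search loop.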
(Image credit: NAS with Reinforcement Learning)
Libraries
Use these libraries to find Neural Architecture Search models and implementations.
Latest papers
Hierarchical Invariance for Robust and Interpretable Vision Tasks at Larger Scales
Developing robust and interpretable vision systems is a crucial step towards trustworthy artificial intelligence.
G-NAS: Generalizable Neural Architecture Search for Single Domain Generalization Object Detection
To address this issue, we propose the Generalizable loss (G-loss), an OoD-aware objective that prevents NAS from over-fitting: gradient descent optimizes parameters not only on a subset of easy-to-learn features but also on the remaining predictive features needed for generalization. We name the overall framework G-NAS.
Group Distributionally Robust Dataset Distillation with Risk Minimization
However, matching the training dataset must be regarded as auxiliary, in the same sense that the training set is only an approximate substitute for the population distribution, which is the data of actual interest.
AutoGCN -- Towards Generic Human Activity Recognition with Neural Architecture Search
This paper introduces AutoGCN, a generic Neural Architecture Search (NAS) algorithm for Human Activity Recognition (HAR) using Graph Convolution Networks (GCNs).
NACHOS: Neural Architecture Search for Hardware Constrained Early Exit Neural Networks
To this end, this work presents Neural Architecture Search for Hardware Constrained Early Exit Neural Networks (NACHOS), the first NAS framework for the design of optimal EENNs satisfying constraints on the accuracy and the number of Multiply and Accumulate (MAC) operations performed by the EENNs at inference time.
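The early-exit idea behind EENNs can be illustrated with a minimal inference sketch. This is an assumption-laden toy, not the NACHOS method: the per-stage MAC costs and the confidence rule are invented for illustration. An auxiliary classifier sits after each backbone stage, and a sample stops at the first classifier whose confidence clears a threshold, saving the MACs of the stages it skips.

```python
# Minimal early-exit inference sketch (illustrative only; not NACHOS itself).
# LAYER_MACS and the confidence threshold are assumed values for illustration.

LAYER_MACS = [1000, 1000, 1000]   # assumed MAC cost of each backbone stage

def early_exit_forward(confidences, threshold=0.9):
    """Return (exit_index, macs_spent) given per-stage classifier confidences."""
    macs = 0
    for i, conf in enumerate(confidences):
        macs += LAYER_MACS[i]
        if conf >= threshold:          # confident enough: exit here
            return i, macs
    return len(confidences) - 1, macs  # no early exit: use the final classifier

# A sample that becomes confident at stage 1 spends 2000 MACs instead of 3000.
exit_idx, macs = early_exit_forward([0.5, 0.95, 0.99])
```

A hardware-constrained NAS for EENNs would search jointly over the backbone, the placement of these exits, and the threshold, subject to a bound on worst-case or expected MACs.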
M2-Mixer: A Multimodal Mixer with Multi-head Loss for Classification from Multimodal Data
It achieves a 13.4-times reduction in training time and +0.33% in accuracy.
Automated Fusion of Multimodal Electronic Health Records for Better Medical Predictions
The widespread adoption of Electronic Health Record (EHR) systems in healthcare institutes has generated vast amounts of medical data, offering significant opportunities for improving healthcare services through deep learning techniques.
Élivágar: Efficient Quantum Circuit Search for Classification
Although recent Quantum Circuit Search (QCS) methods attempt to search for performant QML circuits that are also robust to hardware noise, they directly adopt designs from classical Neural Architecture Search (NAS) that are misaligned with the unique constraints of quantum hardware, resulting in high search overheads and severe performance bottlenecks.
SeqNAS: Neural Architecture Search for Event Sequence Classification
We demonstrate that our method surpasses state-of-the-art NAS methods and popular architectures for sequence classification, and holds great potential for various industrial applications.
Adaptive Guidance: Training-free Acceleration of Conditional Diffusion Models
Our findings provide insights into the efficiency of the conditional denoising process that contribute to more practical and swift deployment of text-conditioned diffusion models.