Neural Architecture Search

774 papers with code • 26 benchmarks • 27 datasets

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in machine learning. NAS takes the process in which a human manually tweaks a neural network and learns what works well, and automates it, making it possible to discover more complex architectures.
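
As a rough illustration of the idea, the sketch below treats NAS as a search over a tiny, hypothetical space of layer counts, widths, and operations, scored by a placeholder evaluation function; practical NAS methods replace the random sampling with reinforcement-learning controllers, evolutionary search, or gradient-based relaxations.

import random

# Hypothetical search space; the choices here are placeholders for illustration.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "width": [64, 128, 256],
    "op": ["conv3x3", "conv5x5", "depthwise_sep"],
}

def sample_architecture():
    # Draw one option per architectural decision.
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    # Placeholder: a real NAS loop would train the candidate briefly and
    # return its validation accuracy.
    return random.random()

best_arch, best_score = None, float("-inf")
for _ in range(20):
    arch = sample_architecture()
    score = evaluate(arch)
    if score > best_score:
        best_arch, best_score = arch, score
print(best_arch, best_score)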

Image Credit: NAS with Reinforcement Learning

Libraries

Use these libraries to find Neural Architecture Search models and implementations
See all 24 libraries.

Latest papers with no code

Network architecture search of X-ray based scientific applications

no code yet • 16 Apr 2024

Our NAS and HPS of (1) BraggNN achieves a 31.03% improvement in Bragg peak detection accuracy with an 87.57% reduction in model size, and (2) PtychoNN achieves a 16.77% improvement in model accuracy and a 12.82% reduction in model size when compared to the baseline PtychoNN model.

Differentiable Search for Finding Optimal Quantization Strategy

no code yet • 10 Apr 2024

To solve this issue, in this paper we propose a differentiable quantization strategy search (DQSS) that assigns the optimal quantization strategy to each individual layer by taking advantage of the benefits of different quantization algorithms.
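
A minimal sketch of how such a differentiable search can be set up, assuming a DARTS-style relaxation in which each layer mixes candidate quantization strategies via softmax-weighted architecture parameters; the candidate quantizers and module names below are illustrative, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def quant_uniform(x, bits=8):
    # Uniform quantization by rounding (plain rounding blocks gradients;
    # a real implementation would use a straight-through estimator).
    scale = x.abs().max().clamp(min=1e-8) / (2 ** (bits - 1) - 1)
    return torch.round(x / scale) * scale

CANDIDATES = [lambda x: x,                    # no quantization
              lambda x: quant_uniform(x, 8),  # 8-bit uniform
              lambda x: quant_uniform(x, 4)]  # 4-bit uniform

class MixedQuantLayer(nn.Module):
    def __init__(self, in_f, out_f):
        super().__init__()
        self.linear = nn.Linear(in_f, out_f)
        # One architecture parameter per candidate strategy, learned by gradient descent.
        self.alpha = nn.Parameter(torch.zeros(len(CANDIDATES)))

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        y = self.linear(x)
        # Softmax-weighted mixture of quantizers; after search, keep the argmax choice.
        return sum(wi * q(y) for wi, q in zip(w, CANDIDATES))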

ApproxDARTS: Differentiable Neural Architecture Search with Approximate Multipliers

no code yet • 8 Apr 2024

Integrating the principles of approximate computing into the design of hardware-aware deep neural networks (DNNs) has led to DNN implementations showing good output quality and highly optimized hardware parameters such as low latency or inference energy.

Insights from the Use of Previously Unseen Neural Architecture Search Datasets

no code yet • 2 Apr 2024

The boundless range of neural networks that could be used to solve a problem -- each with different performance -- leads to a situation where a deep learning expert is required to identify the best neural network.

Mixed-precision Supernet Training from Vision Foundation Models using Low Rank Adapter

no code yet • 29 Mar 2024

To tackle these challenges, first, we study the effective search space design for fine-tuning a VFM by comparing different operators (such as resolution, feature size, width, depth, and bit-widths) in terms of performance and BitOPs reduction.
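
For illustration only, the sketch below encodes a fine-tuning search space over the operators mentioned above (resolution, width, depth, bit-widths) and ranks configurations with an assumed BitOPs-style proxy cost; both the dimensions and the cost formula are placeholders, not the paper's actual search space.

import itertools, random

# Hypothetical search space over the operators discussed in the abstract.
SEARCH_SPACE = {
    "resolution": [160, 192, 224],
    "width_mult": [0.5, 0.75, 1.0],
    "depth": [6, 9, 12],
    "bits": [4, 8, 16],
}

def sample_subnet():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def bitops_proxy(cfg):
    # Assumed rough proxy: cost grows with resolution^2, width^2, depth, and bits^2.
    return (cfg["resolution"] ** 2) * (cfg["width_mult"] ** 2) * cfg["depth"] * (cfg["bits"] ** 2)

# Exhaustively rank the small space by the proxy cost (a real method would
# trade this cost off against accuracy during supernet training).
cheapest = min((dict(zip(SEARCH_SPACE, combo))
                for combo in itertools.product(*SEARCH_SPACE.values())),
               key=bitops_proxy)
print(cheapest)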

D'OH: Decoder-Only random Hypernetworks for Implicit Neural Representations

no code yet • 28 Mar 2024

We instead present a strategy for the initialization of run-time deep implicit functions for single-instance signals through a Decoder-Only randomly projected Hypernetwork (D'OH).
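
A hedged sketch of the stated idea, assuming a single trainable latent code decoded through fixed random projection matrices into the weights of a small implicit MLP, so that only the latent is optimized; all sizes, shapes, and the training loop are illustrative assumptions.

import torch
import torch.nn.functional as F

torch.manual_seed(0)
latent_dim, hidden = 64, 32
layer_shapes = [(hidden, 2), (hidden, hidden), (1, hidden)]  # maps (x, y) -> value

latent = torch.zeros(latent_dim, requires_grad=True)  # the only trainable parameter
# Fixed (untrained) random projections, one per weight matrix.
projections = [torch.randn(out_f * in_f, latent_dim) / latent_dim ** 0.5
               for out_f, in_f in layer_shapes]

def implicit_mlp(coords):
    h = coords
    for (out_f, in_f), P in zip(layer_shapes, projections):
        W = (P @ latent).view(out_f, in_f)   # decode layer weights from the latent
        h = torch.relu(h @ W.t()) if out_f != 1 else h @ W.t()
    return h

coords = torch.rand(128, 2)   # sampled pixel coordinates
target = torch.rand(128, 1)   # placeholder signal values
opt = torch.optim.Adam([latent], lr=1e-2)
loss = F.mse_loss(implicit_mlp(coords), target)
loss.backward(); opt.step()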

emoDARTS: Joint Optimisation of CNN & Sequential Neural Network Architectures for Superior Speech Emotion Recognition

no code yet • 21 Mar 2024

This study presents emoDARTS, a DARTS-optimised joint CNN and Sequential Neural Network (SeqNN: LSTM, RNN) architecture that enhances SER performance.

Robust NAS under adversarial training: benchmark, theory, and beyond

no code yet • 19 Mar 2024

Recent developments in neural architecture search (NAS) emphasize the significance of considering robust architectures against malicious data.

TrajectoryNAS: A Neural Architecture Search for Trajectory Prediction

no code yet • 18 Mar 2024

Through empirical studies, TrajectoryNAS demonstrates its effectiveness in enhancing the performance of autonomous driving systems, marking a significant advancement in the field. Experimental results reveal that TrajectoryNAS yields at least 4.8% higher accuracy and 1.1x lower latency than competing methods on the NuScenes dataset.

Anytime Neural Architecture Search on Tabular Data

no code yet • 15 Mar 2024

This transition demands an efficient and responsive anytime NAS approach that is capable of returning current optimal architectures within any given time budget while progressively enhancing architecture quality with increased budget allocation.
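
A minimal sketch of the anytime property described here, assuming a simple loop that keeps the best architecture found so far and can return it whenever the time budget expires; the sampling and quality-estimation functions are placeholders, not the paper's method.

import random, time

def sample_architecture():
    # Hypothetical candidate for a tabular model.
    return {"layers": random.choice([2, 4, 8]), "units": random.choice([32, 64, 128])}

def estimate_quality(arch):
    # Placeholder for a cheap performance estimator on tabular data.
    return random.random()

def anytime_nas(time_budget_s):
    best, best_score = None, float("-inf")
    deadline = time.monotonic() + time_budget_s
    while time.monotonic() < deadline:
        arch = sample_architecture()
        score = estimate_quality(arch)
        if score > best_score:
            best, best_score = arch, score
    return best, best_score   # a valid answer is available whenever the budget runs out

print(anytime_nas(time_budget_s=0.1))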