Search Results for author: Arber Zela

Found 16 papers, 12 papers with code

Multi-objective Differentiable Neural Architecture Search

1 code implementation • 28 Feb 2024 • Rhea Sanjay Sukthanker, Arber Zela, Benedikt Staffler, Samuel Dooley, Josif Grabocka, Frank Hutter

Pareto front profiling in multi-objective optimization (MOO), i.e., finding a diverse set of Pareto-optimal solutions, is challenging, especially with expensive objectives like neural network training.

Machine Translation · Neural Architecture Search
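
To make the term concrete: the Pareto front of a candidate pool is the subset of architectures that no other candidate beats on every objective at once. A minimal NumPy sketch over toy (error, latency) scores; the function and numbers below are illustrative, not the paper's method:

```python
import numpy as np

def pareto_front(costs: np.ndarray) -> np.ndarray:
    """Boolean mask of non-dominated rows; `costs` has shape
    (n_points, n_objectives) and all objectives are minimized."""
    efficient = np.ones(costs.shape[0], dtype=bool)
    for i, c in enumerate(costs):
        if efficient[i]:
            # Points that c dominates: no better anywhere, strictly worse somewhere.
            dominated = np.all(c <= costs, axis=1) & np.any(c < costs, axis=1)
            efficient[dominated] = False
    return efficient

# Toy population of architectures scored by (validation error, latency in ms).
population = np.array([[0.08, 12.0], [0.07, 25.0], [0.10, 8.0], [0.09, 30.0]])
print(population[pareto_front(population)])  # the last point is dominated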

NAS-Bench-Suite-Zero: Accelerating Research on Zero Cost Proxies

1 code implementation • 6 Oct 2022 • Arjun Krishnakumar, Colin White, Arber Zela, Renbo Tu, Mahmoud Safari, Frank Hutter

Zero-cost (ZC) proxies are a recent family of architecture performance predictors that aim to significantly speed up algorithms for neural architecture search (NAS).

Neural Architecture Search
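
One way to make "zero-cost" concrete: score an untrained network from a single forward/backward pass and use that score to rank architectures. The sketch below implements a simple gradient-norm proxy in PyTorch; the function name and toy models are illustrative, and the benchmark's own codebase exposes many more proxies behind its own API:

```python
import torch
import torch.nn as nn

def grad_norm_score(model: nn.Module, batch: torch.Tensor,
                    targets: torch.Tensor) -> float:
    """A minimal zero-cost proxy: summed gradient norms after a single
    forward/backward pass, used as a cheap stand-in for trained accuracy."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(batch), targets)
    loss.backward()
    return sum(p.grad.norm().item() for p in model.parameters()
               if p.grad is not None)

# Rank two toy "architectures" without training either of them.
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
for net in (nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10)),
            nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                          nn.Flatten(), nn.Linear(16 * 32 * 32, 10))):
    print(grad_norm_score(net, x, y))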

NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy

1 code implementation • ICLR 2022 • Yash Mehta, Colin White, Arber Zela, Arjun Krishnakumar, Guri Zabergja, Shakiba Moradian, Mahmoud Safari, Kaicheng Yu, Frank Hutter

The release of tabular benchmarks, such as NAS-Bench-101 and NAS-Bench-201, has significantly lowered the computational overhead for conducting scientific research in neural architecture search (NAS).

Image Classification · Neural Architecture Search · +4
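
What makes a benchmark "tabular" is that every architecture in the space has been trained ahead of time, so evaluation collapses to a lookup. A toy sketch (the encodings and numbers below are invented, not entries from NAS-Bench-101/201):

```python
# Hypothetical sketch of the core idea behind a tabular NAS benchmark:
# a precomputed table makes "evaluating" an architecture O(1).
table = {
    ("conv3x3", "skip", "conv1x1"): {"val_acc": 91.2, "train_secs": 1380.0},
    ("conv3x3", "conv3x3", "skip"): {"val_acc": 92.7, "train_secs": 2100.0},
}

def query(arch: tuple) -> float:
    """Replace GPU-hours of training with a dictionary lookup."""
    return table[arch]["val_acc"]

print(query(("conv3x3", "skip", "conv1x1")))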

Multi-headed Neural Ensemble Search

no code implementations • 9 Jul 2021 • Ashwin Raaghav Narayanan, Arber Zela, Tonmoy Saikia, Thomas Brox, Frank Hutter

Ensembles of CNN models trained with different seeds (also known as Deep Ensembles) are known to achieve superior performance over a single copy of the CNN.
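
The deep-ensemble baseline referenced here is straightforward to sketch: train the same architecture from several seeds and average the softmax outputs. A PyTorch illustration with untrained toy models (the paper's multi-headed variant additionally shares a backbone across ensemble heads, which this sketch does not show):

```python
import torch
import torch.nn as nn

def make_model(seed: int) -> nn.Module:
    """Same architecture, different random initialization per seed."""
    torch.manual_seed(seed)
    return nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))

# A deep ensemble averages the softmax outputs of independently trained
# copies; the copies here are untrained, for illustration only.
models = [make_model(s) for s in range(5)]
x = torch.randn(4, 20)
with torch.no_grad():
    probs = torch.stack([m(x).softmax(dim=-1) for m in models]).mean(dim=0)
print(probs.argmax(dim=-1))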

Bag of Tricks for Neural Architecture Search

no code implementations • 8 Jul 2021 • Thomas Elsken, Benedikt Staffler, Arber Zela, Jan Hendrik Metzen, Frank Hutter

While neural architecture search methods have been successful in previous years, leading to new state-of-the-art performance on various problems, they have also been criticized for being unstable, highly sensitive to their hyperparameters, and often no better than random search.

Neural Architecture Search
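
The random-search baseline that NAS methods are measured against amounts to sampling configurations uniformly and keeping the best. A minimal sketch; the search space and the `evaluate` stub are placeholders for real training and validation:

```python
import random

SPACE = {"depth": [8, 14, 20], "width": [16, 32, 64],
         "op": ["conv3x3", "conv5x5", "skip"]}

def sample():
    return {k: random.choice(v) for k, v in SPACE.items()}

def evaluate(arch):
    # Stand-in for the expensive step: train the architecture and
    # return its validation accuracy.
    return random.random()

best = max((sample() for _ in range(20)), key=evaluate)
print(best)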

How Powerful are Performance Predictors in Neural Architecture Search?

1 code implementation • NeurIPS 2021 • Colin White, Arber Zela, Binxin Ru, Yang Liu, Frank Hutter

Early methods in the rapidly developing field of neural architecture search (NAS) required fully training thousands of neural networks.

Neural Architecture Search
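
A minimal version of the model-based-predictor setup such a study evaluates: fit a regressor on a small "fully trained" subset, then judge it by rank correlation on held-out architectures. Everything below is synthetic and illustrative; the paper compares many predictor families, not this toy one:

```python
import numpy as np
from scipy.stats import kendalltau
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Architectures as feature vectors with a synthetic "true accuracy".
X = rng.random((200, 12))
y = X @ rng.random(12) + 0.05 * rng.standard_normal(200)

predictor = RandomForestRegressor(n_estimators=100, random_state=0)
predictor.fit(X[:40], y[:40])  # only 40 architectures "fully trained"
tau, _ = kendalltau(predictor.predict(X[40:]), y[40:])
print(f"rank correlation on unseen architectures: {tau:.2f}")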

NASLib: A Modular and Flexible Neural Architecture Search Library

1 code implementation • 1 Jan 2021 • Michael Ruchte, Arber Zela, Julien Niklas Siems, Josif Grabocka, Frank Hutter

Neural Architecture Search (NAS) is one of the focal points for the Deep Learning community, but reproducing NAS methods is extremely challenging due to numerous low-level implementation details.

Neural Architecture Search
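
The modularity argument is easiest to see as interfaces: optimizers talk to search spaces only through a narrow contract, so either side can be swapped without touching the other. The class and method names below are invented for illustration and are not NASLib's actual API:

```python
import random
from abc import ABC, abstractmethod

class SearchSpace(ABC):
    @abstractmethod
    def sample(self): ...

class ToyCellSpace(SearchSpace):
    OPS = ["conv3x3", "conv1x1", "skip", "zero"]
    def sample(self):
        return [random.choice(self.OPS) for _ in range(6)]

class RandomSearch:
    """Optimizers depend only on the SearchSpace interface."""
    def run(self, space: SearchSpace, evaluate, n=10):
        return max((space.sample() for _ in range(n)), key=evaluate)

print(RandomSearch().run(ToyCellSpace(), evaluate=lambda a: random.random()))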

Smooth Variational Graph Embeddings for Efficient Neural Architecture Search

2 code implementations • 9 Oct 2020 • Jovita Lukasik, David Friede, Arber Zela, Frank Hutter, Margret Keuper

We evaluate the proposed approach on neural architectures defined by the ENAS approach and on the NAS-Bench-101 and NAS-Bench-201 search spaces, and show that our smooth embedding space allows performance prediction to extrapolate directly to architectures outside the seen domain (e.g., architectures with more operations).

Bayesian Optimization · Neural Architecture Search
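
A rough, deterministic sketch of the encode-then-regress idea: turn an architecture DAG (adjacency matrix plus one-hot node operations) into a continuous embedding and regress performance on it. The paper's embedding is variational and far more careful; layer sizes and names here are arbitrary:

```python
import torch
import torch.nn as nn

class GraphEncoder(nn.Module):
    def __init__(self, n_ops=5, hidden=16, latent=8):
        super().__init__()
        self.msg = nn.Linear(n_ops, hidden)
        self.readout = nn.Linear(hidden, latent)

    def forward(self, adj, ops):              # adj: (n, n), ops: (n, n_ops)
        h = torch.relu(adj @ self.msg(ops))   # one round of message passing
        return self.readout(h.mean(dim=0))    # graph-level embedding

encoder, regressor = GraphEncoder(), nn.Linear(8, 1)
adj = torch.triu(torch.ones(4, 4), diagonal=1)  # toy 4-node DAG
ops = torch.eye(5)[torch.tensor([0, 1, 1, 2])]  # one-hot node operations
print(regressor(encoder(adj, ops)).item())      # predicted performance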

Surrogate NAS Benchmarks: Going Beyond the Limited Search Spaces of Tabular NAS Benchmarks

1 code implementation • ICLR 2022 • Arber Zela, Julien Siems, Lucas Zimmer, Jovita Lukasik, Margret Keuper, Frank Hutter

We show that surrogate NAS benchmarks can model the true performance of architectures better than tabular benchmarks (at a small fraction of the cost), that they lead to faithful estimates of how well different NAS methods work on the original non-surrogate benchmark, and that they can generate new scientific insight.

Neural Architecture Search
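
The surrogate idea in miniature: instead of exhaustively tabulating a huge space, fit a regression model on a feasible sample of evaluated architectures and answer benchmark queries with predictions. The data below is synthetic; the paper trains much stronger surrogates on real training results:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

# A sample of architecture encodings with synthetic "trained" accuracies.
encodings = rng.integers(0, 2, size=(500, 20)).astype(float)
accuracies = encodings @ rng.random(20) + 0.1 * rng.standard_normal(500)

surrogate = GradientBoostingRegressor().fit(encodings, accuracies)

def query(arch_encoding: np.ndarray) -> float:
    """Cheap benchmark query: a model prediction, not a table lookup."""
    return float(surrogate.predict(arch_encoding.reshape(1, -1))[0])

print(query(rng.integers(0, 2, size=20).astype(float)))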

Neural Ensemble Search for Uncertainty Estimation and Dataset Shift

1 code implementation • NeurIPS 2021 • Sheheryar Zaidi, Arber Zela, Thomas Elsken, Chris Holmes, Frank Hutter, Yee Whye Teh

On a variety of classification tasks and modern architecture search spaces, we show that the resulting ensembles outperform deep ensembles not only in terms of accuracy but also uncertainty calibration and robustness to dataset shift.

Image Classification · Neural Architecture Search
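
Uncertainty calibration is commonly scored with the expected calibration error (ECE): the gap between a model's confidence and its accuracy, averaged over confidence bins. A minimal NumPy implementation on random toy predictions (equal-width bins; the paper's evaluation protocol differs in detail):

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE with equal-width confidence bins; probs: (n, n_classes)."""
    conf = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    ece = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        mask = (conf > lo) & (conf <= lo + 1.0 / n_bins)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece

rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=1000)
labels = rng.integers(0, 10, size=1000)
print(expected_calibration_error(probs, labels))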

NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural Architecture Search

1 code implementation • ICLR 2020 • Arber Zela, Julien Siems, Frank Hutter

One-shot neural architecture search (NAS) has played a crucial role in making NAS methods computationally feasible in practice.

Benchmarking · Neural Architecture Search
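
The weight sharing that makes one-shot NAS computationally feasible fits in a few lines: all candidate operations live in one supernet, and a sampled architecture is just a per-layer choice evaluated with the shared weights. A toy PyTorch sketch, not the benchmark's code:

```python
import random
import torch
import torch.nn as nn

class MixedLayer(nn.Module):
    """One supernet layer holding every candidate operation at once."""
    def __init__(self, dim=16):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Linear(dim, dim), nn.Identity(),
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),
        ])

    def forward(self, x, choice: int):
        return self.ops[choice](x)

supernet = nn.ModuleList([MixedLayer() for _ in range(3)])
arch = [random.randrange(3) for _ in supernet]  # sample one subnetwork
x = torch.randn(2, 16)
for layer, choice in zip(supernet, arch):
    x = layer(x, choice)  # evaluated with the shared supernet weights
print(arch, x.shape)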

Understanding and Robustifying Differentiable Architecture Search

1 code implementation • ICLR 2020 • Arber Zela, Thomas Elsken, Tonmoy Saikia, Yassine Marrakchi, Thomas Brox, Frank Hutter

Differentiable Architecture Search (DARTS) has attracted a lot of attention due to its simplicity and small search costs achieved by a continuous relaxation and an approximation of the resulting bi-level optimization problem.

Disparity Estimation · Image Classification · +1
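
The continuous relaxation at the heart of DARTS replaces each categorical operation choice with a softmax-weighted mixture of all candidates, so the architecture parameters alpha can be trained by gradient descent alongside the network weights. A toy PyTorch sketch with linear stand-ins for the convolutional ops:

```python
import torch
import torch.nn as nn

class MixedOp(nn.Module):
    """Softmax-weighted sum of candidate operations on one edge."""
    def __init__(self, dim=16):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Linear(dim, dim),                            # "conv"-like op
            nn.Identity(),                                  # skip connection
            nn.Sequential(nn.Linear(dim, dim), nn.Tanh()),
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        w = torch.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

op = MixedOp()
print(op(torch.randn(2, 16)).shape)  # after search: keep argmax(alpha)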

AutoDispNet: Improving Disparity Estimation With AutoML

1 code implementation • ICCV 2019 • Tonmoy Saikia, Yassine Marrakchi, Arber Zela, Frank Hutter, Thomas Brox

In this work, we show how to use and extend existing AutoML techniques to efficiently optimize large-scale U-Net-like encoder-decoder architectures.

Bayesian Optimization · Disparity Estimation · +2
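
A toy sketch of the U-Net-like encoder-decoder skeleton such a search optimizes: downsampling encoder cells, upsampling decoder cells, and skip connections between matching resolutions. Plain conv blocks stand in for the searchable cells; all names and sizes are illustrative:

```python
import torch
import torch.nn as nn

def cell(c_in, c_out):
    # NAS would optimize this block's internals; a conv stands in here.
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU())

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1, self.enc2 = cell(3, 16), cell(16, 32)
        self.dec = cell(32 + 16, 16)        # skip connection concatenated
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2)
        self.head = nn.Conv2d(16, 1, 1)     # e.g. one disparity channel

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        d = self.dec(torch.cat([self.up(e2), e1], dim=1))
        return self.head(d)

print(TinyUNet()(torch.randn(1, 3, 64, 64)).shape)  # (1, 1, 64, 64)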

Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search

3 code implementations • 18 Jul 2018 • Arber Zela, Aaron Klein, Stefan Falkner, Frank Hutter

While existing work on neural architecture search (NAS) tunes hyperparameters in a separate post-processing step, we demonstrate that architectural choices and other hyperparameter settings interact in a way that can render this separation suboptimal.

Bayesian Optimization · Neural Architecture Search
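
The joint space the abstract argues for simply mixes architectural choices and training hyperparameters in one configuration so their interactions can be searched over together. A minimal sketch with illustrative ranges; the paper optimizes such a space with multi-fidelity Bayesian optimization (BOHB) rather than the uniform sampling shown here:

```python
import random

def sample_joint_config():
    return {
        # architectural choices
        "n_layers": random.choice([8, 12, 16]),
        "n_filters": random.choice([16, 32, 64]),
        # training hyperparameters that interact with the architecture
        "learning_rate": 10 ** random.uniform(-4, -1),
        "weight_decay": 10 ** random.uniform(-5, -2),
        "batch_size": random.choice([64, 128, 256]),
    }

print(sample_joint_config())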
