Search Results for author: Danny Stoll

Found 8 papers, 5 papers with code

Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars

2 code implementations · NeurIPS 2023 · Simon Schrodi, Danny Stoll, Binxin Ru, Rhea Sukthanker, Thomas Brox, Frank Hutter

In this work, we introduce a unifying search space design framework based on context-free grammars that can naturally and compactly generate expressive hierarchical search spaces that are hundreds of orders of magnitude larger than common spaces from the literature.

Bayesian Optimization · Neural Architecture Search
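The grammar-based idea can be sketched minimally: productions over architecture non-terminals recursively generate nested architecture terms, so even a tiny grammar yields a combinatorially large hierarchical space. The grammar, symbol names, and sampler below are hypothetical illustrations, not the paper's actual search spaces.

```python
import random

# Toy context-free grammar over architecture terms (hypothetical
# non-terminals/terminals; the paper's grammars are far richer).
GRAMMAR = {
    "ARCH": [["seq", "ARCH", "ARCH"], ["res", "ARCH"], ["OP"]],
    "OP": [["conv3x3"], ["conv1x1"], ["identity"]],
}

def sample(symbol="ARCH", depth=0, max_depth=4, rng=random):
    """Sample one derivation (an architecture term) from the grammar."""
    if symbol not in GRAMMAR:
        return symbol  # terminal: an operation or combinator name
    rules = GRAMMAR[symbol]
    if depth >= max_depth:
        # force termination by restricting to the shortest production
        rules = [min(rules, key=len)]
    rule = rng.choice(rules)
    return [sample(s, depth + 1, max_depth, rng) for s in rule]
```

Sampling repeatedly enumerates distinct nested terms, which is the sense in which a compact grammar generates an enormous hierarchical space.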

On the Importance of Hyperparameters and Data Augmentation for Self-Supervised Learning

no code implementations · 16 Jul 2022 · Diane Wagner, Fabio Ferreira, Danny Stoll, Robin Tibor Schirrmeister, Samuel Müller, Frank Hutter

Self-Supervised Learning (SSL) has become a very active area of Deep Learning research where it is heavily used as a pre-training method for classification and other tasks.

Bayesian Optimization · Data Augmentation +1

$\pi$BO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization

1 code implementation · 23 Apr 2022 · Carl Hvarfner, Danny Stoll, Artur Souza, Marius Lindauer, Frank Hutter, Luigi Nardi

To address this issue, we propose $\pi$BO, an acquisition function generalization which incorporates prior beliefs about the location of the optimum in the form of a probability distribution, provided by the user.

Bayesian Optimization · Hyperparameter Optimization
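The core mechanism of $\pi$BO can be sketched in a few lines: the acquisition function is multiplied by the user's prior over the optimum raised to a power that decays with the iteration count, so the prior guides early search but fades as data accumulates. The `acq` and `prior` callables below are toy stand-ins, not the paper's implementation.

```python
import numpy as np

def pi_bo_acquisition(acq, prior, beta, t):
    """Prior-weighted acquisition in the spirit of piBO:
    alpha_pi(x) = alpha(x) * pi(x)**(beta / t),
    where t is the iteration count, so the user prior's
    influence decays over time. (Illustrative sketch.)"""
    def weighted(x):
        return acq(x) * prior(x) ** (beta / t)
    return weighted

# Toy stand-ins: a smooth acquisition peaked at x=0.5 and a
# Gaussian-shaped user prior that believes the optimum is near x=0.3.
acq = lambda x: np.exp(-(x - 0.5) ** 2)
prior = lambda x: np.exp(-((x - 0.3) ** 2) / 0.02)
early = pi_bo_acquisition(acq, prior, beta=10.0, t=1)
late = pi_bo_acquisition(acq, prior, beta=10.0, t=100)
```

Early on, the maximizer of the weighted acquisition sits near the prior's mode; at later iterations it drifts back toward the unweighted acquisition's optimum.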

$\pi$BO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization

no code implementations · ICLR 2022 · Carl Hvarfner, Danny Stoll, Artur Souza, Luigi Nardi, Marius Lindauer, Frank Hutter

To address this issue, we propose $\pi$BO, an acquisition function generalization which incorporates prior beliefs about the location of the optimum in the form of a probability distribution, provided by the user.

Bayesian Optimization · Hyperparameter Optimization

Hyperparameter Transfer Across Developer Adjustments

1 code implementation · 25 Oct 2020 · Danny Stoll, Jörg K. H. Franke, Diane Wagner, Simon Selg, Frank Hutter

After developer adjustments to a machine learning (ML) algorithm, how can the results of an old hyperparameter optimization (HPO) automatically be used to speed up a new HPO?

Hyperparameter Optimization
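One simple transfer strategy in the spirit of the question above is to evaluate the old run's best configurations first in the new HPO, keeping only those still valid after the developer's adjustment to the search space. The helper below is a hypothetical sketch; the paper studies a range of richer transfer strategies.

```python
def warmstart_configs(old_results, space, k=3):
    """Pick the top-k configurations from an old HPO run that remain
    valid in the adjusted search space, to be evaluated first.
    old_results: list of (config_dict, loss); lower loss is better.
    space: dict mapping hyperparameter name -> set of allowed values.
    (Hypothetical helper, not the paper's actual method.)"""
    def still_valid(cfg):
        return all(key in space and cfg[key] in space[key] for key in cfg)
    ranked = sorted(old_results, key=lambda r: r[1])  # best loss first
    return [cfg for cfg, _ in ranked if still_valid(cfg)][:k]
```

A new HPO run would evaluate these configurations before falling back to its usual search, giving the old results a chance to shortcut the new search.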

Learning to Design RNA

5 code implementations · ICLR 2019 · Frederic Runge, Danny Stoll, Stefan Falkner, Frank Hutter

Designing RNA molecules has garnered recent interest in medicine, synthetic biology, biotechnology and bioinformatics since many functional RNA molecules were shown to be involved in regulatory processes for transcription, epigenetics and translation.

Meta-Learning
