no code implementations • 13 Apr 2023 • Susanne Dandl, Andreas Hofheinz, Martin Binder, Bernd Bischl, Giuseppe Casalicchio
Counterfactual explanation methods provide information on how feature values of individual observations must be changed to obtain a desired prediction.
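The idea above can be illustrated with a toy sketch (not the authors' method): given a classifier and an observation, search for a small change to one feature that flips the prediction to the desired class. The classifier, feature grid, and greedy search strategy here are all illustrative assumptions.

```python
def predict(x):
    # Hypothetical linear classifier: class 1 iff 2*x1 + x2 > 10.
    return 1 if 2 * x[0] + x[1] > 10 else 0

def counterfactual(x, desired=1, feature=0, step=0.5, max_steps=100):
    """Greedily increase one feature until the prediction flips.

    Returns the changed observation, or None if the budget is exhausted.
    """
    x = list(x)
    for _ in range(max_steps):
        if predict(x) == desired:
            return x
        x[feature] += step
    return None

# Original observation is predicted as class 0; the counterfactual tells us
# how much x1 must grow to obtain the desired class-1 prediction.
cf = counterfactual([2.0, 1.0])
```

Real counterfactual methods additionally keep the change sparse and plausible rather than greedily pushing a single feature.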
no code implementations • 15 Jun 2022 • Florian Karl, Tobias Pielok, Julia Moosbauer, Florian Pfisterer, Stefan Coors, Martin Binder, Lennart Schneider, Janek Thomas, Jakob Richter, Michel Lang, Eduardo C. Garrido-Merchán, Juergen Branke, Bernd Bischl
Hyperparameter optimization constitutes a large part of typical modern machine learning workflows.
1 code implementation • 29 Nov 2021 • Julia Moosbauer, Martin Binder, Lennart Schneider, Florian Pfisterer, Marc Becker, Michel Lang, Lars Kotthoff, Bernd Bischl
Automated hyperparameter optimization (HPO) has gained great popularity and is an important ingredient of most automated machine learning frameworks.
no code implementations • 29 Sep 2021 • Hüseyin Anil Gündüz, Martin Binder, Xiao-Yin To, René Mreches, Philipp C. Münch, Alice C McHardy, Bernd Bischl, Mina Rezaei
We introduce Self-GenomeNet, a novel contrastive self-supervised learning method for nucleotide-level genomic data, which substantially improves the quality of the learned representations and performance compared to the current state-of-the-art deep learning frameworks.
1 code implementation • 8 Sep 2021 • Florian Pfisterer, Lennart Schneider, Julia Moosbauer, Martin Binder, Bernd Bischl
When developing and analyzing new hyperparameter optimization methods, it is vital to empirically evaluate and compare them on well-curated benchmark suites.
no code implementations • 13 Jul 2021 • Bernd Bischl, Martin Binder, Michel Lang, Tobias Pielok, Jakob Richter, Stefan Coors, Janek Thomas, Theresa Ullmann, Marc Becker, Anne-Laure Boulesteix, Difan Deng, Marius Lindauer
Most machine learning algorithms are configured by one or several hyperparameters that must be carefully chosen and often considerably impact performance.
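A minimal random-search sketch shows what "carefully choosing" hyperparameters means in practice (illustrative only, not any specific framework's API; the loss surface is a stand-in for an actual train-and-validate step):

```python
import random

def validation_loss(lr, depth):
    # Hypothetical loss surface with a minimum near lr=0.1, depth=5;
    # in practice this would train a model and measure validation error.
    return (lr - 0.1) ** 2 + 0.01 * (depth - 5) ** 2

def random_search(n_trials=50, seed=0):
    """Sample configurations at random and keep the best one seen."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        cfg = {"lr": rng.uniform(0.001, 1.0), "depth": rng.randint(1, 10)}
        loss = validation_loss(cfg["lr"], cfg["depth"])
        if best is None or loss < best[1]:
            best = (cfg, loss)
    return best

best_cfg, best_loss = random_search()
```

Random search is the usual baseline that model-based HPO methods aim to beat by spending evaluations more deliberately.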
no code implementations • ICML Workshop AutoML 2021 • Lennart Schneider, Florian Pfisterer, Martin Binder, Bernd Bischl
Neural architecture search (NAS) promises to make deep learning accessible to non-experts by automating architecture engineering of deep neural networks.
1 code implementation • 23 Apr 2020 • Susanne Dandl, Christoph Molnar, Martin Binder, Bernd Bischl
We show the usefulness of MOC (multi-objective counterfactuals) in concrete cases and compare our approach with state-of-the-art methods for counterfactual explanations.
no code implementations • 30 Dec 2019 • Martin Binder, Julia Moosbauer, Janek Thomas, Bernd Bischl
While model-based optimization needs fewer objective evaluations to achieve good performance, it incurs computational overhead compared to NSGA-II, so the preferred choice depends on how costly it is to evaluate a model on the given data.
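Both approaches compared above are multi-objective optimizers, so candidate solutions are ranked by Pareto dominance rather than a single score. A minimal sketch of that core comparison (both objectives minimized; the example objective values are made up):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (both objectives are minimized here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical objective pairs, e.g. (validation error, model complexity).
candidates = [(0.10, 3), (0.12, 2), (0.10, 4), (0.30, 1)]
front = pareto_front(candidates)
```

NSGA-II builds on exactly this dominance relation, adding non-dominated sorting and crowding-distance selection to evolve a well-spread front.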