no code implementations • 2 Feb 2024 • Emanuel Sommer, Lisa Wimmer, Theodore Papamarkou, Ludwig Bothmann, Bernd Bischl, David Rügamer
A major challenge in sample-based inference (SBI) for Bayesian neural networks is the size and structure of the networks' parameter space.
no code implementations • 30 Dec 2023 • Yusuf Sale, Paul Hofman, Lisa Wimmer, Eyke Hüllermeier, Thomas Nagler
Uncertainty quantification is a critical aspect of machine learning models, providing important insights into the reliability of predictions and aiding the decision-making process in real-world applications.
no code implementations • 5 Sep 2023 • Amirhossein Vahidi, Simon Schoßer, Lisa Wimmer, Yawei Li, Bernd Bischl, Eyke Hüllermeier, Mina Rezaei
In this paper, we propose a novel probabilistic self-supervised learning method via Scoring Rule Minimization (ProSMIN), which leverages the power of probabilistic models to enhance representation quality and mitigate representation collapse.
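To make the scoring-rule idea concrete, here is a minimal PyTorch sketch of a Monte-Carlo estimate of the energy score, a strictly proper scoring rule that can serve as a distributional loss; this is a generic illustration, not ProSMIN's actual training objective, and all names are illustrative.

```python
# Hedged sketch: Monte-Carlo estimate of the energy score
# ES(P, y) = E||X - y|| - 0.5 * E||X - X'||,  X, X' ~ P (lower is better).
import torch

def energy_score(samples: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """samples: (m, d) draws from the predicted distribution; target: (d,)."""
    m = samples.shape[0]
    # E||X - y||: mean distance from predictive samples to the observation
    term1 = torch.cdist(samples, target.unsqueeze(0)).mean()
    # 0.5 * E||X - X'||: mean pairwise distance between samples
    # (diagonal is zero, so dividing by m * (m - 1) averages off-diagonal pairs)
    term2 = 0.5 * torch.cdist(samples, samples).sum() / (m * (m - 1))
    return term1 - term2

# toy usage with illustrative shapes
samples = torch.randn(64, 16)
target = torch.zeros(16)
loss = energy_score(samples, target)
```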
no code implementations • 28 Aug 2023 • Amirhossein Vahidi, Lisa Wimmer, Hüseyin Anil Gündüz, Bernd Bischl, Eyke Hüllermeier, Mina Rezaei
Ensembling neural networks is a widely recognized approach for enhancing model performance, estimating uncertainty, and improving robustness in deep supervised learning.
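As a hedged sketch of the ensembling baseline (a generic deep ensemble, not the specific method proposed in the paper): several independently initialized members are trained, their softmax outputs are averaged to form the predictive distribution, and its entropy serves as an uncertainty estimate.

```python
# Minimal deep-ensemble sketch; architecture and sizes are illustrative.
import torch
import torch.nn as nn

def make_member() -> nn.Module:
    return nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))

# In practice, each member is trained independently (own seed/minibatch order).
members = [make_member() for _ in range(5)]

x = torch.randn(8, 10)
with torch.no_grad():
    # Ensemble predictive distribution: average of member softmax outputs
    probs = torch.stack([m(x).softmax(dim=-1) for m in members]).mean(dim=0)
    # Predictive entropy as a simple uncertainty estimate
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
```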
no code implementations • 6 Apr 2023 • Jonas Gregor Wiese, Lisa Wimmer, Theodore Papamarkou, Bernd Bischl, Stephan Günnemann, David Rügamer
Bayesian inference in deep neural networks is challenging due to the high-dimensional, strongly multi-modal parameter posterior density landscape.
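The difficulty can be seen in the posterior predictive integral (standard Bayesian model averaging notation, not a formula copied from the paper): a sample-based approximation must cover many well-separated modes of the posterior.

```latex
% Posterior predictive via Bayesian model averaging; Monte-Carlo
% approximation requires draws from all relevant posterior modes.
p(y \mid x, \mathcal{D})
  = \int p(y \mid x, \theta)\, p(\theta \mid \mathcal{D})\, d\theta
  \approx \frac{1}{S} \sum_{s=1}^{S} p\big(y \mid x, \theta^{(s)}\big),
  \qquad \theta^{(s)} \sim p(\theta \mid \mathcal{D})
```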
2 code implementations • 28 Mar 2023 • Ludwig Bothmann, Lisa Wimmer, Omid Charrakh, Tobias Weber, Hendrik Edelhoff, Wibke Peters, Hien Nguyen, Caryl Benjamin, Annette Menzel
We provide an active learning (AL) system that enables highly efficient training of deep learning models in terms of the number of human-labeled training images required.
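For intuition, a generic pool-based active-learning loop with entropy-based uncertainty sampling looks as follows; this is a hedged sketch only, the paper's AL system may select images differently, and `model` (scikit-learn-style `fit`/`predict_proba`), `oracle_label`, and all parameters are hypothetical.

```python
# Hedged sketch of pool-based active learning with uncertainty sampling.
import numpy as np

def active_learning_loop(model, X_pool, oracle_label, X_init, y_init,
                         rounds: int = 10, batch_size: int = 16):
    X_train, y_train = X_init, y_init
    for _ in range(rounds):
        model.fit(X_train, y_train)                              # retrain on labeled set
        probs = model.predict_proba(X_pool)                      # (n_pool, n_classes)
        entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)   # per-image uncertainty
        query = np.argsort(entropy)[-batch_size:]                # most uncertain images
        X_new, y_new = X_pool[query], oracle_label(X_pool[query])  # human labels
        X_train = np.concatenate([X_train, X_new])
        y_train = np.concatenate([y_train, y_new])
        X_pool = np.delete(X_pool, query, axis=0)                # remove queried items
    return model
```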
1 code implementation • 7 Sep 2022 • Lisa Wimmer, Yusuf Sale, Paul Hofman, Bernd Bischl, Eyke Hüllermeier
The quantification of aleatoric and epistemic uncertainty in terms of conditional entropy and mutual information, respectively, has recently become quite common in machine learning.
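The decomposition the abstract refers to is standardly written as follows (common notation; the paper scrutinizes whether these measures are appropriate):

```latex
% Total predictive uncertainty splits into an epistemic part (mutual
% information) and an aleatoric part (expected conditional entropy):
\underbrace{H\!\left(\mathbb{E}_{p(\theta \mid \mathcal{D})}\!\left[p(y \mid x, \theta)\right]\right)}_{\text{total}}
= \underbrace{I(Y; \Theta \mid x, \mathcal{D})}_{\text{epistemic}}
+ \underbrace{\mathbb{E}_{p(\theta \mid \mathcal{D})}\!\left[H\!\left(p(y \mid x, \theta)\right)\right]}_{\text{aleatoric}}
```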