1 code implementation • 28 Mar 2024 • Johann Haselberger, Bonifaz Stuhr, Bernhard Schick, Steffen Müller
Furthermore, we found that feature encoders pretrained on our dataset lead to more precise driving behavior modeling.
1 code implementation • 30 Nov 2023 • Bonifaz Stuhr
Unsupervised representation learning aims at finding methods that learn representations from data without annotation-based signals.
1 code implementation • 22 Sep 2023 • Bonifaz Stuhr, Jürgen Brauer, Bernhard Schick, Jordi Gonzàlez
In this work, we show that masking the inputs of a global discriminator for both domains with a content-based mask is sufficient to reduce content inconsistencies significantly.
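The masking idea can be sketched in a few lines: before the global discriminator sees an image from either domain, regions outside a content mask are zeroed out, so the discriminator can only judge (and thus the generator can only be penalized on) the shared content regions. This is a minimal illustration under stated assumptions; the function name and the origin of the mask (e.g. a segmentation of shared content) are hypothetical, not the paper's exact pipeline.

```python
import numpy as np

def mask_discriminator_input(image, content_mask):
    """Zero out non-content regions before the image reaches the discriminator.

    image:        (H, W, C) float array from either domain
    content_mask: (H, W) binary array, 1 where shared content is present

    Hypothetical helper for illustration; in practice the mask would be
    derived from the image content (e.g. a segmentation map).
    """
    return image * content_mask[..., None]

# Toy usage: only the top half of the image survives the mask.
img = np.ones((4, 4, 3))
mask = np.zeros((4, 4))
mask[:2, :] = 1.0
masked = mask_discriminator_input(img, mask)
```

Because both domains are masked the same way, domain-specific structure outside the content regions cannot leak into the adversarial signal.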
1 code implementation • 16 Jun 2022 • Julian Gebele, Bonifaz Stuhr, Johann Haselberger
Unsupervised Domain Adaptation demonstrates great potential to mitigate domain shifts by transferring models from labeled source domains to unlabeled target domains.
Ranked #1 on Domain Adaptation on MuLane.

1 code implementation • 4 Sep 2020 • Bonifaz Stuhr, Jürgen Brauer
We thereby reveal how the objective function mismatch across several pretext and target tasks depends on the pretext model's representation size, the target model's complexity, the pretext and target augmentations, and the pretext and target task types.
1 code implementation • 28 Jan 2020 • Bonifaz Stuhr, Jürgen Brauer
This work combines Convolutional Neural Networks (CNNs), clustering via Self-Organizing Maps (SOMs) and Hebbian Learning to propose the building blocks of Convolutional Self-Organizing Neural Networks (CSNNs), which learn representations in an unsupervised and Backpropagation-free manner.
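The core CSNN building block can be illustrated with a single competitive step: image patches compete for a set of SOM-like units, each patch activates its best-matching unit (BMU), and the winner is pulled toward the patches it won via a Hebbian-style update, with no gradients or backpropagation involved. This is a minimal sketch of the general SOM/Hebbian mechanism, not the paper's exact layer; the function name and the simple winner-take-all update rule are assumptions for illustration.

```python
import numpy as np

def som_competitive_step(patches, weights, lr=0.1):
    """One Backpropagation-free learning step over flattened image patches.

    patches: (N, D) flattened patches extracted from a feature map
    weights: (K, D) SOM unit vectors (the learned "filters")
    Returns the updated weights and each patch's best-matching unit index.

    Sketch only: a full SOM would also update the BMU's neighbors.
    """
    # Competition: squared distance from every patch to every unit.
    dists = ((patches[:, None, :] - weights[None, :, :]) ** 2).sum(axis=-1)
    bmu = dists.argmin(axis=1)

    # Hebbian-style update: move each winning unit toward the mean
    # of the patches it won (no gradient of any loss is computed).
    new_weights = weights.copy()
    for k in range(weights.shape[0]):
        won = patches[bmu == k]
        if len(won):
            new_weights[k] += lr * (won.mean(axis=0) - new_weights[k])
    return new_weights, bmu

# Toy usage: two units, patches clustered near the origin.
patches = np.full((10, 4), 1.0)
weights = np.array([[0.0, 0.0, 0.0, 0.0],
                    [5.0, 5.0, 5.0, 5.0]])
new_w, bmu = som_competitive_step(patches, weights, lr=0.5)
```

Applying this step to patches at every spatial location of a feature map yields a convolution-like layer whose filters self-organize from the data, which is the sense in which CSNNs learn representations without annotation signals or backpropagation.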