no code implementations • 15 May 2024 • Maximilian Schmidt, Andrea Bartezzaghi, Ngoc Thang Vu
With this motivation, we show that using large language models can improve Question Answering performance on various datasets in the few-shot setting compared to state-of-the-art approaches.
no code implementations • 2 Apr 2024 • Rudolf Herdt, Maximilian Schmidt, Daniel Otero Baguer, Peter Maaß
In this work, we investigate methods to reduce the noise in deep saliency maps coming from convolutional downsampling, with the purpose of explaining how a deep learning model detects tumors in scanned histological tissue samples.
no code implementations • 5 Mar 2024 • Tina Vartziotis, Ippolyti Dellatolas, George Dasoulas, Maximilian Schmidt, Florian Schneider, Tim Hoffmann, Sotirios Kotsopoulos, Michael Keckeisen
Within our methodology, to quantify the sustainability awareness of these AI models, we propose a definition of the code's "green capacity", based on certain sustainability metrics.
no code implementations • 4 Feb 2023 • Rudolf Herdt, Maximilian Schmidt, Daniel Otero Baguer, Jean Le'Clerc Arrastia, Peter Maass
In this work, we propose a fast and accurate method to reconstruct activations of classification and semantic segmentation networks by stitching them with a GAN generator utilizing a 1x1 convolution.
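A 1x1 convolution, as used here for stitching, is simply a per-pixel linear map across channels. A minimal NumPy sketch of that operation (the shapes, channel counts, and function name are illustrative, not taken from the paper):

```python
import numpy as np

def conv1x1(x, w):
    """Apply a 1x1 convolution: a per-pixel linear map across channels.

    x: feature maps of shape (C_in, H, W)
    w: weights of shape (C_out, C_in)
    Returns feature maps of shape (C_out, H, W).
    """
    return np.einsum("oc,chw->ohw", w, x)

# Illustrative stitching: map 64 classifier activations per pixel into
# the 128-channel space a hypothetical GAN generator layer expects.
rng = np.random.default_rng(0)
acts = rng.standard_normal((64, 16, 16))   # hypothetical activations
w = rng.standard_normal((128, 64)) * 0.1   # stitching weights (learned in practice)
stitched = conv1x1(acts, w)
print(stitched.shape)  # (128, 16, 16)
```

Because the map is identical at every spatial location, it can translate between the channel spaces of two networks without disturbing spatial structure.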
1 code implementation • 27 Nov 2022 • Maximilian Schmidt, Andrea Bartezzaghi, Jasmina Bogojeska, A. Cristiano I. Malossi, Thang Vu
Furthermore, they often yield very good performance but only in the domain they were trained on.
3 code implementations • 23 Nov 2021 • Riccardo Barbano, Johannes Leuschner, Maximilian Schmidt, Alexander Denker, Andreas Hauptmann, Peter Maaß, Bangti Jin
Deep image prior (DIP) was recently introduced as an effective unsupervised approach for image restoration tasks.
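The DIP idea is to parameterize the restored image by an untrained network and fit it to the corrupted observation, stopping early so that the architecture's inductive bias, rather than any training data, acts as the regularizer. A toy 1-D NumPy sketch of that optimization loop (the "network" here is just parameters passed through a fixed smoothing kernel, a stand-in for a CNN's bias toward smooth outputs; all names and constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
clean = np.sin(np.linspace(0, 4 * np.pi, n))    # ground truth (never seen)
noisy = clean + 0.3 * rng.standard_normal(n)    # observed, corrupted signal

# "Network": parameters theta mapped through a fixed smoothing kernel K.
kernel = np.ones(9) / 9.0
def forward(theta):
    return np.convolve(theta, kernel, mode="same")

theta = np.zeros(n)
lr = 1.0
losses = []
for step in range(500):                          # early stopping: few steps
    residual = forward(theta) - noisy
    losses.append(float(np.mean(residual ** 2)))
    # For loss ||K theta - y||^2 the gradient is K^T (K theta - y);
    # with a symmetric kernel, K^T is the same convolution.
    grad = np.convolve(residual, kernel[::-1], mode="same") / n
    theta -= lr * grad

restored = forward(theta)
```

Smooth components of the signal are fitted quickly while high-frequency noise is fitted slowly (or not at all), which is the mechanism behind early stopping in DIP.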
2 code implementations • 26 Oct 2021 • Alexander Denker, Maximilian Schmidt, Johannes Leuschner, Peter Maass
In recent years, deep learning methods have become an increasingly popular choice for solving tasks from the field of inverse problems.
no code implementations • 8 Feb 2021 • Henrik D. Mettler, Maximilian Schmidt, Walter Senn, Mihai A. Petrovici, Jakob Jordan
We formulate the search for phenomenological models of synaptic plasticity as an optimization problem.
2 code implementations • 28 May 2020 • Jakob Jordan, Maximilian Schmidt, Walter Senn, Mihai A. Petrovici
Continuous adaptation allows survival in an ever-changing world.
Neurons and Cognition
1 code implementation • ACL 2020 • Chia-Yu Li, Daniel Ortega, Dirk Väth, Florian Lux, Lindsey Vanderlyn, Maximilian Schmidt, Michael Neumann, Moritz Völkel, Pavel Denisov, Sabrina Jenne, Zorica Kacarevic, Ngoc Thang Vu
We present ADVISER - an open-source, multi-domain dialog system toolkit that enables the development of multi-modal (incorporating speech, text and vision), socially-engaged (e.g. emotion recognition, engagement level prediction and backchanneling) conversational agents.
1 code implementation • 10 Mar 2020 • Daniel Otero Baguer, Johannes Leuschner, Maximilian Schmidt
In this work, we investigate the application of deep learning methods for computed tomography in a low-data regime.
no code implementations • 10 Dec 2019 • Christian Etmann, Maximilian Schmidt, Jens Behrmann, Tobias Boskamp, Lena Hauberg-Lotte, Annette Peter, Rita Casadonte, Jörg Kriegsmann, Peter Maass
Neural networks have recently been established as a viable classification method for imaging mass spectrometry data for tumor typing.
1 code implementation • 1 Oct 2019 • Johannes Leuschner, Maximilian Schmidt, Daniel Otero Baguer, Peter Maaß
Deep learning approaches for solving inverse problems in imaging have become very effective and have proven competitive with established methods in the field.
no code implementations • ACL 2019 • Daniel Ortega, Dirk Väth, Gianna Weber, Lindsey Vanderlyn, Maximilian Schmidt, Moritz Völkel, Zorica Kacarevic, Ngoc Thang Vu
In this paper, we present ADVISER - an open source dialog system framework for education and research purposes.
no code implementations • 17 Jun 2019 • Maximilian Schmidt, Marko Simic
Flow-based deep generative models learn data distributions by transforming a simple base distribution into a complex distribution via a set of invertible transformations.
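The change-of-variables formula underlying such flows can be shown with the simplest possible invertible transformation, an affine map (a one-dimensional sketch for illustration; real flows stack many learned invertible layers):

```python
import numpy as np

def affine_flow_logpdf(x, a, b):
    """Log-density of x under the flow x = a*z + b with base z ~ N(0, 1).

    Change of variables: log p_x(x) = log p_z(z) - log|det da/dz|,
    where z = (x - b) / a and the Jacobian determinant is simply a.
    """
    z = (x - b) / a
    log_pz = -0.5 * (z ** 2 + np.log(2 * np.pi))
    return log_pz - np.log(np.abs(a))

# The flow x = 2z + 1 turns N(0, 1) into N(1, 4), so the flow's density
# must agree with the N(1, 4) density evaluated directly.
x = 3.0
lp_flow = affine_flow_logpdf(x, a=2.0, b=1.0)
lp_normal = -0.5 * ((x - 1.0) ** 2 / 4.0 + np.log(2 * np.pi * 4.0))
print(np.isclose(lp_flow, lp_normal))  # True
```

Stacking invertible layers multiplies the Jacobian determinants, i.e. sums their logs, which is what makes exact likelihood evaluation tractable in flow models.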