no code implementations • 28 Feb 2024 • Laura Manduchi, Kushagra Pandey, Robert Bamler, Ryan Cotterell, Sina Däubener, Sophie Fellenz, Asja Fischer, Thomas Gärtner, Matthias Kirchler, Marius Kloft, Yingzhen Li, Christoph Lippert, Gerard de Melo, Eric Nalisnick, Björn Ommer, Rajesh Ranganath, Maja Rudolph, Karen Ullrich, Guy Van Den Broeck, Julia E Vogt, Yixin Wang, Florian Wenzel, Frank Wood, Stephan Mandt, Vincent Fortuin
The field of deep generative modeling has grown rapidly and consistently over the years.
no code implementations • 25 Oct 2023 • Eshant English, Matthias Kirchler, Christoph Lippert
Normalising flows are statistical models that transform a complex density into a simpler one through bijective transformations, enabling both density estimation and data generation from a single model.
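The change-of-variables idea behind this can be shown in a minimal sketch: a single 1-D affine bijection with a standard-normal base density, giving exact log-densities and samples from the same model. The parameters `MU` and `SIGMA` are hypothetical, hand-picked values, not learned as in the paper.

```python
import numpy as np

# Minimal 1-D "flow": an affine bijection x = MU + SIGMA * z with a
# standard-normal base density. MU and SIGMA are illustrative constants,
# not fitted parameters.
MU, SIGMA = 2.0, 0.5

def log_density(x):
    """Change of variables: log p(x) = log N(z; 0, 1) + log |dz/dx|."""
    z = (x - MU) / SIGMA                          # inverse transform
    log_base = -0.5 * (z ** 2 + np.log(2 * np.pi))
    log_det = -np.log(SIGMA)                      # |dz/dx| = 1 / SIGMA
    return log_base + log_det

def sample(n, rng=np.random.default_rng(0)):
    """Generation: push base samples through the forward transform."""
    return MU + SIGMA * rng.standard_normal(n)

xs = sample(10_000)
# The same object yields densities (log_density) and samples (sample);
# the empirical mean/std of xs should match the analytic N(MU, SIGMA).
```

Stacking many such bijections, each with a tractable Jacobian, is what gives real flows their expressiveness while keeping both directions exact.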
no code implementations • 2 Aug 2023 • Masoumeh Javanbakhat, Christoph Lippert
Within this framework, we place a prior over the parameters of a self-supervised learning model and use cSGHMC to approximate the high-dimensional, multimodal posterior distribution over the embeddings.
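As a toy illustration of the sampling machinery, the sketch below runs a plain SGHMC-style update (hedged: not the cyclical cSGHMC variant from the paper) on a 1-D standard-normal posterior, with the exact gradient standing in for a minibatch estimate.

```python
import numpy as np

# Toy SGHMC-style sampler on a 1-D standard-normal posterior.
# Assumptions: plain SGHMC with friction (no cyclical schedule), and the
# exact gradient of the negative log-posterior instead of a noisy
# minibatch gradient.

rng = np.random.default_rng(0)

def grad_neg_log_post(theta):
    return theta                   # -d/dtheta log N(theta; 0, 1) = theta

eta, alpha = 0.01, 0.1             # step size and friction
theta, v = 0.0, 0.0
samples = []
for t in range(60_000):
    # Momentum update: gradient force, friction, and injected noise
    # balanced so the chain targets the posterior.
    v = (1.0 - alpha) * v - eta * grad_neg_log_post(theta) \
        + np.sqrt(2.0 * alpha * eta) * rng.standard_normal()
    theta += v
    if t >= 10_000:                # discard burn-in
        samples.append(theta)
samples = np.asarray(samples)
# For small step sizes, the sample mean and variance should approach the
# posterior's mean 0 and variance 1.
```

In the high-dimensional, multimodal setting of the paper, the cyclical step-size schedule is what lets the sampler escape one mode and visit others.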
no code implementations • 27 Jul 2023 • Eshant English, Matthias Kirchler, Christoph Lippert
Normalising Flows are non-parametric statistical models characterised by their dual capabilities of density estimation and generation.
1 code implementation • 27 Jun 2023 • Alexander Rakowski, Christoph Lippert
Canonical Correlation Analysis (CCA)-based methods have traditionally been used to identify shared variables; however, they were designed for multivariate targets and offer only trivial solutions for univariate cases.
1 code implementation • 24 Oct 2022 • Benjamin Bergner, Christoph Lippert, Aravindh Mahendran
We propose a simple method, Iterative Patch Selection (IPS), which decouples the memory usage from the input size and thus enables the processing of arbitrarily large images under tight hardware constraints.
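The core idea can be sketched with a streaming top-M selection: score patches one at a time with a cheap function, keep only the best M in a bounded buffer, and reserve any expensive aggregation for those M. Here `score_fn` is a stand-in for the model's patch scorer, not the paper's attention module.

```python
import numpy as np

# Hedged sketch of the idea behind Iterative Patch Selection (IPS):
# stream patches through a cheap scoring function and keep only the
# top-m in a running buffer, so memory is O(m) regardless of how many
# patches the input image produces. score_fn is a hypothetical stand-in.

def iterative_patch_selection(patches, score_fn, m):
    """Return the m highest-scoring patches using O(m) memory."""
    buffer, scores = [], []
    for p in patches:                # patches may be a generator (streamed)
        buffer.append(p)
        scores.append(score_fn(p))
        if len(buffer) > m:          # evict the current worst patch
            worst = int(np.argmin(scores))
            buffer.pop(worst)
            scores.pop(worst)
    return buffer

# 100 toy "patches" whose mean value equals their index.
patches = (np.full((4, 4), v, dtype=float) for v in range(100))
top = iterative_patch_selection(patches, score_fn=np.mean, m=3)
# The buffer never holds more than m + 1 patches at once, decoupling
# memory usage from the input size.
```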
1 code implementation • 29 Sep 2022 • Matthias Kirchler, Christoph Lippert, Marius Kloft
Normalizing flows are powerful non-parametric statistical models that function as a hybrid between density estimators and generative models.
1 code implementation • 2 Jul 2022 • Josafat-Mattias Burmeister, Marcel Fernandez Rosas, Johannes Hagemann, Jonas Kordt, Jasper Blum, Simon Shabo, Benjamin Bergner, Christoph Lippert
Since labeling medical image data is a costly and labor-intensive process, active learning has gained much popularity in the medical image segmentation domain in recent years.
1 code implementation • 17 Dec 2021 • Benjamin Bergner, Csaba Rohrer, Aiham Taleb, Martha Duchrau, Guilherme De Leon, Jonas Almeida Rodrigues, Falk Schwendicke, Joachim Krois, Christoph Lippert
We propose a simple and efficient image classification architecture based on deep multiple instance learning, and apply it to the challenging task of caries detection in dental radiographs.
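The multiple instance learning (MIL) view used here can be sketched in a few lines: an image is a "bag" of patch "instances", each instance gets a score, and a pooling operator turns instance scores into one image-level prediction. The linear scorer below is a hypothetical, untrained stand-in for a learned network.

```python
import numpy as np

# MIL sketch: bag = set of patch feature vectors; prediction = pooled
# per-instance scores. The weight vector w is a toy, untrained scorer.

rng = np.random.default_rng(1)
w = rng.standard_normal(16)                  # toy instance-scorer weights

def instance_scores(bag):
    """Per-patch probabilities via a sigmoid over a linear score."""
    return 1.0 / (1.0 + np.exp(-(bag @ w)))

def bag_prediction(bag):
    # Max pooling encodes the classic MIL assumption: a bag is positive
    # iff it contains at least one positive instance (e.g., one carious
    # region suffices to label the whole radiograph).
    return float(instance_scores(bag).max())

bag = rng.standard_normal((10, 16))          # 10 patches, 16-dim features
positive_bag = np.vstack([bag, 10.0 * w])    # add one clearly positive patch
```

A side benefit visible even in the sketch: the per-instance scores localize which patches drove the image-level decision.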
1 code implementation • CVPR 2022 • Aiham Taleb, Matthias Kirchler, Remo Monti, Christoph Lippert
High annotation costs are a substantial bottleneck in applying modern deep learning architectures to clinically relevant medical use cases, substantiating the need for novel algorithms to learn from unlabeled data.
no code implementations • 29 Sep 2021 • Yamen Ali, Aiham Taleb, Marina M.-C. Höhne, Christoph Lippert
Self-supervised learning methods can be used to learn meaningful representations from unlabeled data that can be transferred to supervised downstream tasks to reduce the need for labeled data.
2 code implementations • 16 Sep 2021 • Matthias Kirchler, Martin Graf, Marius Kloft, Christoph Lippert
When explaining the decisions of deep neural networks, simple stories are tempting but dangerous.
no code implementations • NeurIPS 2021 • Benjamin Bergner, Christoph Lippert
Deep neural networks are prone to overfitting, especially on small datasets.
1 code implementation • NeurIPS 2020 • Aiham Taleb, Winfried Loetzsch, Noel Danz, Julius Severin, Thomas Gaertner, Benjamin Bergner, Christoph Lippert
Self-supervised learning methods have witnessed a recent surge of interest after proving successful in multiple application fields.
1 code implementation • NeurIPS 2020 • Jakob Lindinger, David Reeb, Christoph Lippert, Barbara Rakitsch
Deep Gaussian Processes learn probabilistic data representations for supervised learning by cascading multiple Gaussian Processes.
no code implementations • 11 Dec 2019 • Aiham Taleb, Christoph Lippert, Tassilo Klein, Moin Nabi
We introduce the multimodal puzzle task, which facilitates rich representation learning from multiple image modalities.
1 code implementation • 14 Oct 2019 • Matthias Kirchler, Shahryar Khorasani, Marius Kloft, Christoph Lippert
We propose a two-sample testing procedure based on learned deep neural network representations.
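The shape of such a test can be sketched as follows: embed both samples with a fixed feature map, compare their mean embeddings, and calibrate the statistic with a permutation test. The random-projection `phi` below is a hypothetical stand-in for a trained network's representation, not the paper's learned features.

```python
import numpy as np

# Two-sample test on (pre-)learned representations, sketched with a
# fixed random feature map phi as a stand-in for a trained network.

rng = np.random.default_rng(0)
W = rng.standard_normal((2, 8))          # frozen "network" weights

def phi(x):
    return np.tanh(x @ W)                # feature map: R^2 -> R^8

def mean_embedding_stat(x, y):
    """Squared distance between the samples' mean embeddings."""
    d = phi(x).mean(axis=0) - phi(y).mean(axis=0)
    return float(d @ d)

def permutation_pvalue(x, y, n_perm=500):
    """Calibrate the statistic by reshuffling the pooled sample."""
    obs = mean_embedding_stat(x, y)
    pooled = np.vstack([x, y])
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        xp, yp = pooled[idx[:len(x)]], pooled[idx[len(x):]]
        count += mean_embedding_stat(xp, yp) >= obs
    return (count + 1) / (n_perm + 1)    # add-one smoothing for validity

x = rng.standard_normal((200, 2))        # sample from P
y = rng.standard_normal((200, 2)) + 1.5  # sample from a shifted Q
p_value = permutation_pvalue(x, y)
# A clear mean shift like this should produce a small p-value; identical
# samples give a statistic of exactly zero.
```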
no code implementations • 2 Dec 2018 • Stefan Konigorski, Shahryar Khorasani, Christoph Lippert
The results indicate that CNNs provide a fast, scalable and precise tool to derive quantitative AD traits and that new kernels integrating domain knowledge can yield higher power in association tests of very rare variants.
no code implementations • 16 Jul 2015 • Stephan Mandt, Florian Wenzel, Shinichi Nakajima, John P. Cunningham, Christoph Lippert, Marius Kloft
Formulated as models for linear regression, LMMs have been restricted to continuous phenotypes.
no code implementations • NeurIPS 2013 • Barbara Rakitsch, Christoph Lippert, Karsten Borgwardt, Oliver Stegle
Multi-task prediction models are widely used to couple regressors or classification models by sharing information across related tasks.
no code implementations • 3 May 2012 • Jennifer Listgarten, Christoph Lippert, Eun Yong Kang, Jing Xiang, Carl M. Kadie, David Heckerman
Until now, these approaches did not address confounding by family relatedness and population structure, a problem that is becoming more important as larger data sets are used to increase power.