Search Results for author: Paul Hagemann

Found 13 papers, 8 papers with code

Conditional Wasserstein Distances with Applications in Bayesian OT Flow Matching

no code implementations · 27 Mar 2024 · Jannis Chemseddine, Paul Hagemann, Christian Wald, Gabriele Steidl

In inverse problems, many conditional generative models approximate the posterior measure by minimizing a distance between the joint measure and its learned approximation.

Conditional Image Generation
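The joint-measure idea behind this line of work can be illustrated with a tiny discrete optimal-transport computation. The sketch below is illustrative only (the function name and sample sizes are made up, and the paper concerns conditional Wasserstein distances, not this plain joint OT cost): it matches samples of two empirical joint measures with `scipy.optimize.linear_sum_assignment` and reports the resulting transport cost, which vanishes exactly when the joints coincide.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def joint_ot_cost(y1, x1, y2, x2):
    """Exact OT cost (squared Euclidean, equal weights) between two
    empirical joint measures {(y1_i, x1_i)} and {(y2_j, x2_j)}."""
    a = np.concatenate([y1, x1], axis=1)  # samples of the first joint measure
    b = np.concatenate([y2, x2], axis=1)  # samples of the second joint measure
    # pairwise squared distances between joint samples
    cost = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    rows, cols = linear_sum_assignment(cost)  # optimal matching
    return cost[rows, cols].mean()

rng = np.random.default_rng(0)
y = rng.normal(size=(64, 1))            # observations
x = y + 0.1 * rng.normal(size=(64, 1))  # samples tied to their observations
# identical joints give zero cost; a perturbed "model" joint gives a positive one
print(joint_ot_cost(y, x, y, x))              # → 0.0
print(joint_ot_cost(y, x, y, x + 0.5) > 0.0)  # → True
```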

Mixed Noise and Posterior Estimation with Conditional DeepGEM

1 code implementation · 5 Feb 2024 · Paul Hagemann, Johannes Hertrich, Maren Casfor, Sebastian Heidenreich, Gabriele Steidl

Motivated by indirect measurements and applications from nanometrology with a mixed noise model, we develop a novel algorithm for jointly estimating the posterior and the noise parameters in Bayesian inverse problems.

Learning from small data sets: Patch-based regularizers in inverse problems for image reconstruction

no code implementations · 27 Dec 2023 · Moritz Piening, Fabian Altekrüger, Johannes Hertrich, Paul Hagemann, Andrea Walther, Gabriele Steidl

The solution of inverse problems is of fundamental interest in medical and astronomical imaging, geophysics as well as engineering and life sciences.

Geophysics · Image Reconstruction · +2

Y-Diagonal Couplings: Approximating Posteriors with Conditional Wasserstein Distances

no code implementations · 20 Oct 2023 · Jannis Chemseddine, Paul Hagemann, Christian Wald

In inverse problems, many conditional generative models approximate the posterior measure by minimizing a distance between the joint measure and its learned approximation.

Posterior Sampling Based on Gradient Flows of the MMD with Negative Distance Kernel

1 code implementation · 4 Oct 2023 · Paul Hagemann, Johannes Hertrich, Fabian Altekrüger, Robert Beinert, Jannis Chemseddine, Gabriele Steidl

We propose conditional flows of the maximum mean discrepancy (MMD) with the negative distance kernel for posterior sampling and conditional generative modeling.

Conditional Image Generation

Generative Sliced MMD Flows with Riesz Kernels

1 code implementation · 19 May 2023 · Johannes Hertrich, Christian Wald, Fabian Altekrüger, Paul Hagemann

We prove that the MMD of Riesz kernels, which is also known as energy distance, coincides with the MMD of their sliced version.

Image Generation
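The stated identity can be sanity-checked numerically. The sketch below is an illustrative check, not code from the paper: it computes the energy distance (the MMD of the negative distance kernel, i.e. the Riesz kernel with exponent 1) between two fixed 2D sample sets, and its sliced counterpart averaged over evenly spaced projection directions. Because the kernel is linear in distances, the two agree exactly up to the slicing constant, which is 2/π in 2D (the paper's normalization of the sliced version may differ).

```python
import numpy as np

def energy_distance(x, y):
    """MMD^2 with the negative distance kernel K(a, b) = -||a - b||,
    i.e. the energy distance 2 E||X-Y|| - E||X-X'|| - E||Y-Y'||."""
    def mean_dist(a, b):
        return np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1).mean()
    return 2 * mean_dist(x, y) - mean_dist(x, x) - mean_dist(y, y)

rng = np.random.default_rng(1)
x = rng.normal(size=(128, 2))               # samples from P
y = rng.normal(size=(128, 2)) + [2.0, 0.0]  # samples from Q (shifted)

full = energy_distance(x, y)

# slice: project onto evenly spaced directions on the circle and
# average the resulting 1D energy distances
thetas = np.pi * np.arange(360) / 360
dirs = np.stack([np.cos(thetas), np.sin(thetas)], axis=1)
sliced = np.mean([energy_distance(x @ d[:, None], y @ d[:, None]) for d in dirs])

# in 2D, averaging |<d, u>| over directions gives (2/pi) * ||u||,
# so the sliced value is (2/pi) times the full one
print(sliced / full, 2 / np.pi)
```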

Conditional Generative Models are Provably Robust: Pointwise Guarantees for Bayesian Inverse Problems

no code implementations · 28 Mar 2023 · Fabian Altekrüger, Paul Hagemann, Gabriele Steidl

Conditional generative models have become a powerful tool for sampling from the posteriors of Bayesian inverse problems.

Multilevel Diffusion: Infinite Dimensional Score-Based Diffusion Models for Image Generation

1 code implementation · 8 Mar 2023 · Paul Hagemann, Sophie Mildenberger, Lars Ruthotto, Gabriele Steidl, Nicole Tianjiao Yang

We intend to obtain diffusion models that generalize across different resolution levels and to improve the efficiency of the training process.

Image Generation

PatchNR: Learning from Very Few Images by Patch Normalizing Flow Regularization

1 code implementation · 24 May 2022 · Fabian Altekrüger, Alexander Denker, Paul Hagemann, Johannes Hertrich, Peter Maass, Gabriele Steidl

Training neural networks from only a small amount of available data is an important ongoing research topic with tremendous potential for applications.

Computed Tomography (CT)

Generalized Normalizing Flows via Markov Chains

1 code implementation · 24 Nov 2021 · Paul Hagemann, Johannes Hertrich, Gabriele Steidl

Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models.

Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint

1 code implementation · 23 Sep 2021 · Paul Hagemann, Johannes Hertrich, Gabriele Steidl

To overcome topological constraints and improve the expressiveness of normalizing flow architectures, Wu, Köhler and Noé introduced stochastic normalizing flows, which combine deterministic, learnable flow transformations with stochastic sampling methods.
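The combination of a deterministic invertible map with a stochastic sampling step can be sketched in a few lines. The example below is a minimal illustration, not the authors' architecture: an affine flow step moves latent Gaussian samples onto the scale of a 1D target density, and a few unadjusted Langevin iterations then refine them stochastically.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 2.0, 0.5  # target density N(mu, sigma^2)

def grad_log_target(x):
    return -(x - mu) / sigma**2

# start from the latent distribution N(0, 1)
x = rng.normal(size=20000)

# deterministic, invertible flow step: affine map onto the target's scale
x = sigma * x + mu

# stochastic step: a few unadjusted Langevin iterations targeting the
# same density (they leave N(mu, sigma^2) approximately invariant)
eps = 1e-3
for _ in range(50):
    x = x + eps * grad_log_target(x) + np.sqrt(2 * eps) * rng.normal(size=x.shape)

print(x.mean(), x.std())  # close to mu = 2.0 and sigma = 0.5
```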

Invertible Neural Networks versus MCMC for Posterior Reconstruction in Grazing Incidence X-Ray Fluorescence

no code implementations · 5 Feb 2021 · Anna Andrle, Nando Farchmin, Paul Hagemann, Sebastian Heidenreich, Victor Soltwisch, Gabriele Steidl

Grazing incidence X-ray fluorescence is a non-destructive technique for analyzing the geometry and compositional parameters of nanostructures appearing, e.g., in computer chips.

Stabilizing Invertible Neural Networks Using Mixture Models

1 code implementation · 7 Sep 2020 · Paul Hagemann, Sebastian Neumayer

In this paper, we analyze the properties of invertible neural networks, which provide a way of solving inverse problems.
