Search Results for author: Jakob Zech

Found 5 papers, 0 papers with code

Distribution learning via neural differential equations: a nonparametric statistical perspective

no code implementations • 3 Sep 2023 • Youssef Marzouk, Zhi Ren, Sven Wang, Jakob Zech

Ordinary differential equations (ODEs), via their induced flow maps, provide a powerful framework to parameterize invertible transformations for the purpose of representing complex probability distributions.

Density Estimation
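The abstract above describes representing a probability distribution through the flow map of an ODE: integrating a velocity field transports samples, and the log-determinant of the flow's Jacobian (the instantaneous change-of-variables formula) tracks how the density changes. A minimal 1-D sketch of this idea, using a toy velocity field and Euler integration (both assumptions for illustration, not the paper's construction):

```python
import math

def v(x):
    """Toy velocity field (hypothetical choice for illustration)."""
    return math.tanh(x)

def dv_dx(x):
    """Derivative of the velocity field, i.e. the 1-D Jacobian trace."""
    return 1.0 - math.tanh(x) ** 2

def flow_with_logdet(x0, steps=1000, t1=1.0):
    """Euler-integrate dx/dt = v(x) from x0 and accumulate log|det dT/dx|.

    In 1-D the instantaneous change-of-variables formula reduces to
    d/dt log|det| = dv/dx evaluated along the trajectory.
    """
    dt = t1 / steps
    x, logdet = x0, 0.0
    for _ in range(steps):
        logdet += dv_dx(x) * dt
        x += v(x) * dt
    return x, logdet

def log_pushforward_density(x0):
    """Log-density of T(Z) for Z ~ N(0,1), evaluated via the base point x0."""
    _, logdet = flow_with_logdet(x0)
    log_base = -0.5 * x0 ** 2 - 0.5 * math.log(2 * math.pi)
    return log_base - logdet
```

In a learned model the velocity field would be a neural network trained by maximum likelihood; here it is fixed so the transport and log-determinant bookkeeping are easy to inspect.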

Deep Operator Network Approximation Rates for Lipschitz Operators

no code implementations • 19 Jul 2023 • Christoph Schwab, Andreas Stein, Jakob Zech

We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or Hölder) continuous maps $\mathcal G:\mathcal X\to\mathcal Y$ between (subsets of) separable Hilbert spaces $\mathcal X$, $\mathcal Y$.
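For context, a Deep Operator Network in the DeepONet style approximates an operator as $\mathcal G(u)(y) \approx \sum_{k} b_k(u(x_1),\dots,u(x_m))\, t_k(y)$: a branch net encodes the input function at $m$ sensor points and a trunk net encodes the query point. A hypothetical minimal sketch with random, untrained single-layer nets (sensor count, latent dimension, and weights are all assumed for illustration; this is not the construction analyzed in the paper):

```python
import math
import random

random.seed(0)

def random_layer(n_in, n_out):
    """Random weights and biases for a single dense layer (untrained)."""
    W = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
    b = [random.uniform(-1, 1) for _ in range(n_out)]
    return W, b

def tanh_layer(x, layer):
    """Apply one dense layer with tanh activation."""
    W, b = layer
    return [math.tanh(sum(w * xi for w, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

M, P = 8, 4  # number of sensor points, latent feature dimension (assumed)
SENSORS = [i / (M - 1) for i in range(M)]  # equispaced sensors in [0, 1]
BRANCH = random_layer(M, P)
TRUNK = random_layer(1, P)

def deeponet(u, y):
    """Evaluate the surrogate G(u)(y) for an input function u and point y."""
    b = tanh_layer([u(x) for x in SENSORS], BRANCH)  # encode the function
    t = tanh_layer([y], TRUNK)                       # encode the query point
    return sum(bk * tk for bk, tk in zip(b, t))      # inner product of features
```

Training would fit the branch and trunk weights to input-output pairs of the target operator; the sketch only shows how the two nets combine.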

Neural and spectral operator surrogates: unified construction and expression rate bounds

no code implementations • 11 Jul 2022 • Lukas Herrmann, Christoph Schwab, Jakob Zech

Specifically, we study approximation rates for Deep Neural Operator and Generalized Polynomial Chaos (gpc) Operator surrogates for nonlinear, holomorphic maps between infinite-dimensional, separable Hilbert spaces.

De Rham compatible Deep Neural Network FEM

no code implementations • 14 Jan 2022 • Marcello Longo, Joost A. A. Opschoor, Nico Disch, Christoph Schwab, Jakob Zech

Our construction and DNN architecture generalize previous results in that no geometric restrictions on the regular simplicial partitions $\mathcal{T}$ of $\Omega$ are required for DNN emulation.

Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in $L^2(\mathbb{R}^d,\gamma_d)$

no code implementations • 13 Nov 2021 • Christoph Schwab, Jakob Zech

For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$ where $d\in {\mathbb{N}}\cup\{ \infty \}$.
