Search Results for author: Tim Roith

Found 7 papers, 5 papers with code

Resolution-Invariant Image Classification based on Fourier Neural Operators

1 code implementation • 2 Apr 2023 • Samira Kabri, Tim Roith, Daniel Tenbrinck, Martin Burger

In this paper we investigate the use of Fourier Neural Operators (FNOs) for image classification in comparison to standard Convolutional Neural Networks (CNNs).

Classification • Image Classification
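The core building block of an FNO-style classifier is a spectral convolution that acts only on a truncated set of Fourier modes, which is what makes the model independent of the input resolution. The PyTorch sketch below shows that layer in isolation; the class and parameter names are illustrative assumptions, not the authors' released architecture (a full classifier would additionally need pooling and a linear head).

```python
# Minimal sketch of a 2-D spectral-convolution (FNO-type) layer, assuming a
# PyTorch setup; illustrative only, not the authors' exact architecture.
import torch
import torch.nn as nn


class SpectralConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, modes):
        super().__init__()
        self.modes = modes  # number of retained low Fourier modes per dimension
        scale = 1.0 / (in_ch * out_ch)
        self.weight = nn.Parameter(
            scale * torch.randn(in_ch, out_ch, modes, modes, dtype=torch.cfloat)
        )

    def forward(self, x):                      # x: (batch, in_ch, H, W)
        x_ft = torch.fft.rfft2(x)              # FFT over the spatial dimensions
        out_ft = torch.zeros(
            x.size(0), self.weight.size(1), x.size(-2), x.size(-1) // 2 + 1,
            dtype=torch.cfloat, device=x.device
        )
        m = self.modes
        # multiply only the lowest frequencies; higher modes are discarded,
        # so the layer acts the same way regardless of input resolution
        out_ft[:, :, :m, :m] = torch.einsum(
            "bixy,ioxy->boxy", x_ft[:, :, :m, :m], self.weight
        )
        return torch.fft.irfft2(out_ft, s=x.shape[-2:])
```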

Uniform Convergence Rates for Lipschitz Learning on Graphs

1 code implementation • 24 Nov 2021 • Leon Bungert, Jeff Calder, Tim Roith

In this work we prove uniform convergence rates for solutions of the graph infinity Laplace equation as the number of vertices grows to infinity.
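For illustration, on an unweighted graph the infinity Laplace equation can be solved with the classical fixed-point iteration u(x) ← (max_{y~x} u(y) + min_{y~x} u(y)) / 2, keeping the labeled vertices fixed. The sketch below uses that simple scheme on a toy graph; it is not the weighted setting analyzed in the paper and not the authors' code.

```python
# Illustrative sketch: infinity-harmonic extension on an unweighted graph via
# the midpoint fixed-point iteration, with labeled vertices held fixed.
import numpy as np


def infinity_harmonic_extension(adj, labels, n_iter=2000):
    """adj: list of neighbor-index arrays; labels: dict {vertex: value}."""
    n = len(adj)
    u = np.zeros(n)
    for i, v in labels.items():
        u[i] = v
    for _ in range(n_iter):
        for i in range(n):
            if i in labels:
                continue                      # boundary/label values stay fixed
            nb = u[adj[i]]
            u[i] = 0.5 * (nb.max() + nb.min())  # midpoint of extreme neighbors
    return u


# toy example: path graph 0-1-2-3 with labels at the endpoints
adj = [np.array([1]), np.array([0, 2]), np.array([1, 3]), np.array([2])]
print(infinity_harmonic_extension(adj, {0: 0.0, 3: 1.0}))  # ~[0, 1/3, 2/3, 1]
```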

Neural Architecture Search via Bregman Iterations

1 code implementation • 4 Jun 2021 • Leon Bungert, Tim Roith, Daniel Tenbrinck, Martin Burger

We propose a novel strategy for Neural Architecture Search (NAS) based on Bregman iterations.

Deblurring • Denoising • +1
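One way to picture the approach: give each candidate module a parameter group, drive the groups with a Bregman-type update under a group-sparsity penalty, and read off the architecture as the set of modules whose groups have become nonzero. The numpy sketch below is purely illustrative; the function names, penalty, and step sizes are assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): Bregman-style update with a
# group-sparsity penalty over candidate modules. A module counts as part of
# the searched architecture once its parameter group leaves zero.
import numpy as np


def group_shrink(v, lam):
    """Proximal map of lam * ||.||_2 applied to one parameter group."""
    norm = np.linalg.norm(v)
    return np.zeros_like(v) if norm <= lam else (1.0 - lam / norm) * v


def bregman_nas_step(groups, subgrads, grads, tau=0.1, lam=0.5):
    """groups/subgrads/grads: lists of arrays, one entry per candidate module."""
    active = []
    for i, (g, v, dg) in enumerate(zip(groups, subgrads, grads)):
        v[:] = v - tau * dg            # Bregman step on the subgradient variable
        g[:] = group_shrink(v, lam)    # group soft-thresholding
        if np.any(g != 0):
            active.append(i)           # module i has entered the architecture
    return active
```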

A Bregman Learning Framework for Sparse Neural Networks

1 code implementation • 10 May 2021 • Leon Bungert, Tim Roith, Daniel Tenbrinck, Martin Burger

In contrast to established methods for sparse training, the proposed family of algorithms constitutes a regrowth strategy for neural networks that is solely optimization-based, without additional heuristics.

Denoising • Image Classification
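The underlying update can be sketched as a linearized Bregman iteration with an ℓ1 term: a gradient step is taken on an auxiliary subgradient variable, and the sparse parameters are recovered by soft thresholding, so zeroed weights can re-enter once the auxiliary variable grows large enough, which is what makes the regrowth purely optimization-based. The numpy snippet below shows one common form of that update; the constants and names are illustrative, not the authors' code.

```python
# Hedged sketch of a linearized Bregman step with an l1 sparsity term; the
# parameters (tau, delta, lam) and exact update are illustrative assumptions.
import numpy as np


def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)


def linbreg_step(theta, v, grad, tau=0.1, delta=1.0, lam=0.05):
    """One step: gradient update on the subgradient variable v, then map back
    to the sparse parameters theta via the l1 proximal operator."""
    v = v - tau * grad                       # step on the subgradient variable
    theta = delta * soft_threshold(v, lam)   # sparse parameters via soft thresholding
    return theta, v
```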

CLIP: Cheap Lipschitz Training of Neural Networks

1 code implementation • 23 Mar 2021 • Leon Bungert, René Raab, Tim Roith, Leo Schwinn, Daniel Tenbrinck

Despite the large success of deep neural networks (DNNs) in recent years, most neural networks still lack mathematical guarantees in terms of stability.
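A rough way to implement such a stability penalty is to estimate the Lipschitz constant of the network by the largest difference quotient |f(x) − f(y)| / |x − y| over a set of input pairs and add that estimate to the training loss. The PyTorch sketch below shows only this penalty; how the pairs are chosen or updated (e.g., by ascent steps) is omitted, and it is not the authors' released implementation.

```python
# Hedged sketch: penalize an estimated Lipschitz constant during training by
# evaluating the difference quotient of the network on input pairs (x, y).
import torch


def lipschitz_penalty(model, x, y, eps=1e-9):
    """Largest difference quotient of the network over the given pairs."""
    num = (model(x) - model(y)).flatten(1).norm(dim=1)
    den = (x - y).flatten(1).norm(dim=1) + eps
    return (num / den).max()


def clip_style_loss(model, inputs, targets, pair_x, pair_y, lam=0.1):
    """Task loss plus a weighted Lipschitz-estimate penalty (illustrative)."""
    ce = torch.nn.functional.cross_entropy(model(inputs), targets)
    return ce + lam * lipschitz_penalty(model, pair_x, pair_y)
```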

Continuum Limit of Lipschitz Learning on Graphs

no code implementations • 7 Dec 2020 • Tim Roith, Leon Bungert

In particular, we define a sequence of functionals which approximate the largest local Lipschitz constant of a graph function and prove $\Gamma$-convergence in the $L^\infty$-topology to the supremum norm of the gradient as the graph becomes denser.
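As a concrete discrete analogue, one can compute the largest difference quotient of a graph function over edges of a geometric graph at length scale h, which plays the role of such an approximating functional as the point cloud becomes denser. The numpy sketch below uses assumed notation and is not the paper's exact definition.

```python
# Illustrative sketch (assumed notation): discrete Lipschitz constant of a
# graph function on a geometric graph with connectivity radius h.
import numpy as np


def discrete_lipschitz_constant(points, u, h):
    """points: (n, d) vertex coordinates, u: (n,) function values, h: scale."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    mask = (dists > 0) & (dists <= h)                  # edges of the graph
    quotients = np.abs(u[:, None] - u[None, :])[mask] / dists[mask]
    return quotients.max() if quotients.size else 0.0  # approximates ||grad u||_inf
```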
