no code implementations • 5 Dec 2023 • Tjeerd Jan Heeringa, Tim Roith, Christoph Brune, Martin Burger
This paper presents a method for finding a sparse representation of Barron functions.
1 code implementation • 2 Apr 2023 • Samira Kabri, Tim Roith, Daniel Tenbrinck, Martin Burger
In this paper we investigate the use of Fourier Neural Operators (FNOs) for image classification in comparison to standard Convolutional Neural Networks (CNNs).
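The core operation that distinguishes an FNO layer from a CNN's spatial convolution is a spectral convolution: transform to Fourier space, multiply a fixed number of low-frequency modes by a learned complex filter, and transform back. A simplified single-channel sketch (the function name, the square mode truncation, and the toy usage are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def spectral_conv(x, weights, modes):
    """Simplified FNO-style spectral convolution on a single channel.

    Keeps only the lowest `modes` x `modes` block of Fourier coefficients,
    multiplies it by a learned complex filter, and transforms back.
    Real FNO layers act per channel and keep both positive- and
    negative-frequency blocks; this sketch keeps one corner for brevity.
    """
    x_ft = np.fft.rfft2(x)                    # shape (h, w // 2 + 1), complex
    out_ft = np.zeros_like(x_ft)
    out_ft[:modes, :modes] = x_ft[:modes, :modes] * weights
    return np.fft.irfft2(out_ft, s=x.shape)   # back to the spatial domain

# toy usage: an 8x8 "image" filtered with 4 retained modes
x = np.random.default_rng(0).standard_normal((8, 8))
w = np.ones((4, 4), dtype=complex)            # identity filter on the kept modes
y = spectral_conv(x, w, modes=4)
```

Because the discarded modes are simply zeroed, the layer acts as a learned low-pass filter here; the full architecture adds a pointwise linear path and nonlinearity around each such layer.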
1 code implementation • 24 Nov 2021 • Leon Bungert, Jeff Calder, Tim Roith
In this work we prove uniform convergence rates for solutions of the graph infinity Laplace equation as the number of vertices grows to infinity.
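The graph infinity Laplacian in question is, at each vertex, the sum of the largest and smallest weighted differences to its neighbors; solutions of the equation set this to zero at unlabeled vertices. A minimal sketch of the operator (the function name and dense weight-matrix convention are assumptions, not the paper's code):

```python
import numpy as np

def graph_infinity_laplacian(u, W):
    # u: (n,) vertex values, W: (n, n) nonnegative symmetric edge weights
    diff = W * (u[None, :] - u[:, None])     # w_ij * (u_j - u_i)
    masked = np.where(W > 0, diff, np.nan)   # restrict max/min to actual edges
    return np.nanmax(masked, axis=1) + np.nanmin(masked, axis=1)

# path graph 0 - 1 - 2 with unit weights; linear data is infinity-harmonic,
# so the operator vanishes at the interior vertex
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
u = np.array([0.0, 1.0, 2.0])
lap = graph_infinity_laplacian(u, W)
```

The convergence result concerns how solutions of this discrete equation approach the continuum infinity-harmonic function as the number of vertices grows.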
1 code implementation • 4 Jun 2021 • Leon Bungert, Tim Roith, Daniel Tenbrinck, Martin Burger
We propose a novel strategy for Neural Architecture Search (NAS) based on Bregman iterations.
1 code implementation • 10 May 2021 • Leon Bungert, Tim Roith, Daniel Tenbrinck, Martin Burger
In contrast to established methods for sparse training, the proposed family of algorithms constitutes a regrowth strategy for neural networks that is purely optimization-based, without additional heuristics.
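The optimization-based regrowth idea can be illustrated with a linearized Bregman iteration under ℓ1 regularization: parameters start at zero, and a weight only "regrows" to a nonzero value once its accumulated (sub)gradient variable exceeds the shrinkage threshold. This is a hedged sketch with assumed names and hyperparameters, not the authors' implementation:

```python
import numpy as np

def soft_shrink(v, lam):
    # proximal map of lam * ||.||_1: sets small entries exactly to zero
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linearized_bregman(grad, theta0, lam=0.1, tau=0.05, steps=500):
    # v accumulates gradient information; theta_i stays exactly zero
    # ("pruned") until |v_i| exceeds lam, at which point it regrows.
    v = theta0.copy()
    theta = soft_shrink(v, lam)
    for _ in range(steps):
        v -= tau * grad(theta)
        theta = soft_shrink(v, lam)
    return theta

# toy usage: sparse recovery of b = (1, 0) from the loss 0.5 * ||theta - b||^2
b = np.array([1.0, 0.0])
theta = linearized_bregman(lambda t: t - b, np.zeros(2))
```

The second coordinate receives no gradient signal and remains exactly zero throughout, which is the sparsity-preserving behavior the abstract alludes to.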
1 code implementation • 23 Mar 2021 • Leon Bungert, René Raab, Tim Roith, Leo Schwinn, Daniel Tenbrinck
Despite the great success of deep neural networks (DNNs) in recent years, most neural networks still lack mathematical guarantees in terms of stability.
no code implementations • 7 Dec 2020 • Tim Roith, Leon Bungert
In particular, we define a sequence of functionals which approximate the largest local Lipschitz constant of a graph function and prove $\Gamma$-convergence in the $L^\infty$-topology to the supremum norm of the gradient as the graph becomes denser.
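On a finite point cloud, the largest local Lipschitz constant being approximated is a maximum of difference quotients over pairs of points within the graph's length scale. A brute-force sketch (the function name and the O(n²) pair loop are illustrative only):

```python
import numpy as np

def local_lipschitz_constant(points, values, eps):
    # discrete analogue of the largest local Lipschitz constant:
    # maximum difference quotient over pairs closer than eps
    best = 0.0
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(points[i] - points[j])
            if 0 < d < eps:
                best = max(best, abs(values[i] - values[j]) / d)
    return best

# points on [0, 1] with u(x) = 2x: every difference quotient equals 2,
# matching the supremum norm of the gradient
pts = np.linspace(0.0, 1.0, 11)[:, None]
lip = local_lipschitz_constant(pts, 2.0 * pts[:, 0], eps=0.25)
```

As the point cloud densifies and eps shrinks at a suitable rate, the Γ-convergence result identifies the limit of such functionals with the supremum norm of the gradient.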