Search Results for author: Tillmann Miltzow

Found 4 papers, 0 papers with code

Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete

no code implementations • NeurIPS 2023 • Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber

We consider the problem of finding weights and biases for a two-layer fully connected neural network to fit a given set of data points as well as possible, also known as empirical risk minimization.
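
As a rough illustration of the optimization problem described above, the sketch below sets up the total squared error of a tiny two-layer fully connected network and hands it to a generic local optimizer. The ReLU activation, scalar output, and squared-error loss are assumptions made for this example, not details taken from the paper.

```python
# Minimal sketch (not the paper's construction) of empirical risk minimization
# for a two-layer fully connected network.
# Assumed for illustration: ReLU activations, scalar output, squared error.
import numpy as np
from scipy.optimize import minimize

IN_DIM, HIDDEN = 2, 2  # toy dimensions

def total_error(params, X, y):
    """Total squared error of a two-layer ReLU network with the given weights."""
    # Unpack the flat parameter vector into weights and biases.
    i = 0
    W1 = params[i:i + IN_DIM * HIDDEN].reshape(IN_DIM, HIDDEN); i += IN_DIM * HIDDEN
    b1 = params[i:i + HIDDEN];                                  i += HIDDEN
    W2 = params[i:i + HIDDEN];                                  i += HIDDEN
    b2 = params[i]
    hidden = np.maximum(0.0, X @ W1 + b1)   # ReLU hidden layer
    pred = hidden @ W2 + b2                 # linear output neuron
    return float(np.sum((pred - y) ** 2))

rng = np.random.default_rng(0)
X = rng.standard_normal((4, IN_DIM))        # 4 data points in the plane
y = rng.standard_normal(4)
n_params = IN_DIM * HIDDEN + HIDDEN + HIDDEN + 1

# ERM asks for parameters minimizing the total error; a local optimizer gives
# no guarantee of reaching the global optimum.
res = minimize(total_error, rng.standard_normal(n_params), args=(X, y),
               method="Nelder-Mead")
print("best total squared error found:", res.fun)
```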

On Classifying Continuous Constraint Satisfaction Problems

no code implementations • 4 Jun 2021 • Tillmann Miltzow, Reinier F. Schmiermann

In an instance of this problem we are given a sentence of the form $\exists x_1, \ldots, x_n \in \mathbb{R} : \Phi(x_1, \ldots, x_n)$, where $\Phi$ is a well-formed quantifier-free formula over the symbols $\{0, 1, +, \cdot, \geq, >, \wedge, \vee, \neg\}$; the goal is to decide whether this sentence is true.
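
To make the problem concrete, here is a small sketch (not from the paper) that encodes one such sentence for the Z3 SMT solver, whose nonlinear real arithmetic engine can decide formulas of this kind; the constant 2 below abbreviates $1 + 1$.

```python
# Hypothetical illustration using the Z3 SMT solver (Z3 is not part of the
# paper): decide the sentence
#     exists x in R :  x*x >= 1+1  and  not(x*x > 1+1),
# which asserts that some real number has square exactly 2.
from z3 import Real, Solver, And, Not, sat

x = Real("x")
solver = Solver()
solver.add(And(x * x >= 2, Not(x * x > 2)))   # 2 abbreviates 1+1

if solver.check() == sat:
    # Z3 reports sat and exhibits x = sqrt(2) as an algebraic number.
    print("sentence is true; witness:", solver.model())
else:
    print("sentence is false over the reals")
```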

Training Neural Networks is $\exists\mathbb R$-complete

no code implementations • NeurIPS 2021 • Mikkel Abrahamsen, Linda Kleist, Tillmann Miltzow

Given a neural network, training data, and a threshold, it was already known to be NP-hard to find weights for the network such that the total error is below the threshold.

On the VC-dimension of half-spaces with respect to convex sets

no code implementations • 2 Jul 2019 • Nicolas Grelier, Saeed Gh. Ilchi, Tillmann Miltzow, Shakhar Smorodinsky

A family $S$ of convex sets in the plane defines a hypergraph $H = (S, E)$ as follows.

Computational Geometry · Discrete Mathematics · Data Structures and Algorithms · Combinatorics
