Search Results for author: Dmytro Perekrestenko

Found 7 papers, 1 paper with code

Constructive universal distribution generation through deep ReLU networks

no code implementations ICML 2020 Dmytro Perekrestenko, Stephan Müller, Helmut Bölcskei

We present an explicit deep network construction that transforms uniformly distributed one-dimensional noise into an arbitrarily close approximation of any two-dimensional target distribution of finite differential entropy and Lipschitz-continuous pdf.
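
The basic building block behind such constructions can be illustrated with a piecewise-linear inverse-CDF transport: every continuous piecewise-linear map is exactly representable by a shallow ReLU network. The sketch below illustrates this principle only, not the paper's construction; the standard-normal target and the grid size are arbitrary demo choices.

```python
# Illustrative sketch only: push 1-D uniform noise through a piecewise-linear
# approximation of an inverse CDF. Any continuous piecewise-linear map like this
# is exactly representable by a ReLU network, which is the basic building block
# behind histogram-style distribution generation. The standard-normal target and
# the grid size are arbitrary demo choices, not the paper's construction.
import numpy as np
from scipy.stats import norm

def piecewise_linear_inverse_cdf(u, n_pieces=64):
    """Evaluate a piecewise-linear (hence ReLU-representable) approximation of
    the standard-normal inverse CDF at the uniform samples u."""
    grid = np.linspace(0.001, 0.999, n_pieces)   # breakpoints in (0, 1)
    values = norm.ppf(grid)                      # exact inverse CDF at the breakpoints
    return np.interp(u, grid, values)            # linear interpolation in between

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)                    # one-dimensional uniform input noise
x = piecewise_linear_inverse_cdf(u)              # approximately standard-normal output
print(f"sample mean {x.mean():.3f}, sample std {x.std():.3f}")  # close to 0 and 1
```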

High-Dimensional Distribution Generation Through Deep Neural Networks

no code implementations 26 Jul 2021 Dmytro Perekrestenko, Léandre Eberhard, Helmut Bölcskei

We show that every $d$-dimensional probability distribution of bounded support can be generated through deep ReLU networks out of a $1$-dimensional uniform input distribution.
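
A toy numerical illustration of the underlying information-theoretic principle (not of the paper's constructive ReLU-network realization): the binary digits of a single uniform sample can be demultiplexed into $d$ streams that behave like $d$ independent uniform coordinates.

```python
# Toy illustration: split the binary digits of one uniform [0,1) sample into d
# interleaved streams to obtain d (approximately) independent uniform coordinates.
# This demonstrates why 1-D uniform noise suffices for d-dimensional generation;
# the paper's constructive ReLU-network realization is not shown here.
import numpy as np

def split_uniform(u, d=3, bits_per_dim=20):
    """Demultiplex the binary expansion of u into d coordinates."""
    coords = np.zeros(d)
    for k in range(d * bits_per_dim):
        u *= 2.0
        bit = int(u)                                  # next binary digit of the sample
        u -= bit
        coords[k % d] += bit * 0.5 ** (k // d + 1)    # route digit to coordinate k mod d
    return coords

rng = np.random.default_rng(1)
samples = np.array([split_uniform(x) for x in rng.uniform(size=50_000)])
print(samples.mean(axis=0))               # each coordinate ~ Uniform[0,1): means near 0.5
print(np.corrcoef(samples.T).round(3))    # near-identity correlation matrix
```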

Quantization

Constructive Universal High-Dimensional Distribution Generation through Deep ReLU Networks

no code implementations 30 Jun 2020 Dmytro Perekrestenko, Stephan Müller, Helmut Bölcskei

We present an explicit deep neural network construction that transforms uniformly distributed one-dimensional noise into an arbitrarily close approximation of any two-dimensional Lipschitz-continuous target distribution.


Deep Neural Network Approximation Theory

no code implementations 8 Jan 2019 Dennis Elbrächter, Dmytro Perekrestenko, Philipp Grohs, Helmut Bölcskei

This paper develops fundamental limits of deep neural network learning by characterizing what is possible if no constraints are imposed on the learning algorithm and on the amount of training data.

Handwritten Digit Recognition, Image Classification +1

The universal approximation power of finite-width deep ReLU networks

no code implementations ICLR 2019 Dmytro Perekrestenko, Philipp Grohs, Dennis Elbrächter, Helmut Bölcskei

We show that finite-width deep ReLU neural networks yield rate-distortion optimal approximation (Bölcskei et al., 2018) of polynomials, windowed sinusoidal functions, one-dimensional oscillatory textures, and the Weierstrass function, a fractal function which is continuous but nowhere differentiable.
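
For intuition only, the snippet below approximates a truncated Weierstrass series with a naive piecewise-linear interpolant on a uniform grid; piecewise-linear functions are exactly realizable by finite-width ReLU networks, and the printed sup-norm errors show how many linear pieces this naive approach needs, in contrast to the far more efficient constructions established in the paper. The parameters a = 0.5, b = 3, and the truncation length are arbitrary demo choices.

```python
# Illustration only: approximate a truncated Weierstrass series by piecewise-linear
# interpolation on a uniform grid. Piecewise-linear interpolants are exactly
# realizable by finite-width ReLU networks; the paper's point is that far more
# efficient (rate-distortion optimal) ReLU constructions exist than this naive one.
import numpy as np

def weierstrass(x, a=0.5, b=3, n_terms=30):
    """Truncated Weierstrass series W(x) = sum_k a^k cos(b^k * pi * x)."""
    k = np.arange(n_terms)
    return np.sum((a ** k)[:, None] * np.cos((b ** k)[:, None] * np.pi * x), axis=0)

x_dense = np.linspace(0.0, 1.0, 200_001)              # fine evaluation grid
w_dense = weierstrass(x_dense)

for n_pieces in (64, 512, 4096):
    knots = np.linspace(0.0, 1.0, n_pieces + 1)       # breakpoints of the interpolant
    approx = np.interp(x_dense, knots, weierstrass(knots))
    print(f"{n_pieces:5d} pieces: sup error {np.max(np.abs(w_dense - approx)):.4f}")
```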

Convolutional Recurrent Neural Networks for Electrocardiogram Classification

1 code implementation 17 Oct 2017 Martin Zihlmann, Dmytro Perekrestenko, Michael Tschannen

We propose two deep neural network architectures for classification of arbitrary-length electrocardiogram (ECG) recordings and evaluate them on the atrial fibrillation (AF) classification data set provided by the PhysioNet/CinC Challenge 2017.
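
A minimal PyTorch sketch of the convolutional-recurrent idea: 1-D convolutions extract local waveform features and a recurrent layer aggregates them over the arbitrary-length recording before a small classification head. Layer sizes, kernel widths, and the use of an LSTM here are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal CNN + RNN sketch for variable-length 1-D signals (e.g. single-lead ECG).
# Layer sizes, kernel widths, and the LSTM aggregator are illustrative assumptions,
# not the architecture from the paper.
import torch
import torch.nn as nn

class CRNN(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.cnn = nn.Sequential(                     # local waveform feature extractor
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.rnn = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)          # e.g. normal / AF / other / noisy

    def forward(self, x):                             # x: (batch, 1, length), length may vary
        h = self.cnn(x)                               # (batch, 64, roughly length / 16)
        h = h.transpose(1, 2)                         # (batch, time, features) for the LSTM
        _, (h_last, _) = self.rnn(h)                  # last hidden state summarizes the recording
        return self.head(h_last[-1])                  # (batch, n_classes) logits

model = CRNN()
logits = model(torch.randn(2, 1, 9000))               # e.g. 30 s of ECG at 300 Hz
print(logits.shape)                                   # torch.Size([2, 4])
```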

Classification, Data Augmentation +1

Faster Coordinate Descent via Adaptive Importance Sampling

no code implementations 7 Mar 2017 Dmytro Perekrestenko, Volkan Cevher, Martin Jaggi

Coordinate descent methods employ random partial updates of decision variables in order to solve huge-scale convex optimization problems.
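
As a rough sketch of the idea, the snippet below runs coordinate descent on ridge-regularized least squares and samples the coordinate to update with probability proportional to the current per-coordinate gradient magnitude; this is one common adaptive importance-sampling rule and not necessarily the exact scheme proposed in the paper.

```python
# Rough sketch: coordinate descent on ridge-regularized least squares, where the
# coordinate to update is sampled with probability proportional to the current
# per-coordinate gradient magnitude. This is one common adaptive importance-sampling
# rule, not necessarily the scheme proposed in the paper. Recomputing the full
# gradient keeps the sketch short; an efficient implementation would maintain
# residuals incrementally.
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 200, 50, 0.1
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
x = np.zeros(d)
L = (A ** 2).sum(axis=0) + lam                        # per-coordinate curvature (Lipschitz constants)

for it in range(5000):
    grad = A.T @ (A @ x - b) + lam * x                # full gradient of the ridge objective
    total = np.abs(grad).sum()
    if total < 1e-12:                                 # already at the minimizer
        break
    p = np.abs(grad) / total                          # adaptive sampling distribution
    j = rng.choice(d, p=p)                            # pick a coordinate by importance
    x[j] -= grad[j] / L[j]                            # exact coordinate-wise minimization step

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.5 * lam * np.linalg.norm(x) ** 2
print(f"objective after coordinate updates: {obj:.4f}")
```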
