Search Results for author: Roman Vershynin

Found 13 papers, 0 papers with code

Online Differentially Private Synthetic Data Generation

no code implementations • 12 Feb 2024 • Yiyun He, Roman Vershynin, Yizhe Zhu

We present a polynomial-time algorithm for online differentially private synthetic data generation.

Synthetic Data Generation
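As rough context for the paper above, here is a minimal sketch of a generic differentially private synthetic-data mechanism (a Laplace-noised histogram that is then resampled). It is not the online, polynomial-time algorithm from the paper; the function name, the 1-D binning scheme, and the parameter `epsilon` are placeholders for illustration only.

```python
# Generic perturbed-histogram sketch of DP synthetic data (not the paper's algorithm).
import numpy as np

def make_private_synthetic_data(data, epsilon=1.0, bins=20, n_synthetic=1000, rng=None):
    """Release synthetic 1-D samples from a Laplace-noised histogram."""
    rng = np.random.default_rng() if rng is None else rng
    counts, edges = np.histogram(data, bins=bins)
    # Adding or removing one record changes one bin count by 1, so the L1
    # sensitivity of the histogram is 1; Laplace noise of scale 1/epsilon
    # gives epsilon-differential privacy.
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    probs = np.clip(noisy, 0, None)
    probs = probs / probs.sum() if probs.sum() > 0 else np.full(bins, 1.0 / bins)
    # Sample synthetic points uniformly within each chosen bin.
    chosen = rng.choice(bins, size=n_synthetic, p=probs)
    return rng.uniform(edges[chosen], edges[chosen + 1])

synthetic = make_private_synthetic_data(np.random.default_rng(0).normal(size=500))
```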

An Algorithm for Streaming Differentially Private Data

no code implementations • 26 Jan 2024 • Girish Kumar, Thomas Strohmer, Roman Vershynin

Much of the research in differential privacy has focused on offline applications with the assumption that all data is available at once.

Synthetic Data Generation

Differentially Private Low-dimensional Synthetic Data from High-dimensional Datasets

no code implementations • 26 May 2023 • Yiyun He, Thomas Strohmer, Roman Vershynin, Yizhe Zhu

Differentially private synthetic data provide a powerful mechanism to enable data analysis while protecting sensitive information about individuals.

AVIDA: Alternating method for Visualizing and Integrating Data

no code implementations • 31 May 2022 • Kathryn Dover, Zixuan Cang, Anna Ma, Qing Nie, Roman Vershynin

In general applications, other methods can be used for the alignment and dimension reduction modules.

Dimensionality Reduction

The Quarks of Attention

no code implementations • 15 Feb 2022 • Pierre Baldi, Roman Vershynin

The gating mechanisms correspond to multiplicative extensions of the standard model and are used across all current attention-based deep learning architectures.
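The canonical example of such a multiplicative (gating) interaction is scaled dot-product attention, where normalized query-key products multiply the values. The numpy sketch below is a generic illustration of that mechanism, not the paper's formal model; shapes and names are arbitrary.

```python
# Minimal scaled dot-product attention: a multiplicative gating mechanism.
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Q: (n, d), K: (m, d), V: (m, d_v) -> (n, d_v)."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # multiplicative pairing of queries and keys
    weights = softmax(scores, axis=-1)        # normalized gates, one row per query
    return weights @ V                        # gates multiply (and mix) the values

rng = np.random.default_rng(0)
out = attention(rng.normal(size=(4, 8)), rng.normal(size=(6, 8)), rng.normal(size=(6, 8)))
print(out.shape)  # (4, 8)
```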

A theory of capacity and sparse neural encoding

no code implementations • 19 Feb 2021 • Pierre Baldi, Roman Vershynin

Motivated by biological considerations, we study sparse neural maps from an input layer to a target layer with sparse activity, and specifically the problem of storing $K$ input-target associations $(x, y)$, or memories, when the target vectors $y$ are sparse.
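To make the storage problem concrete, here is a classical Willshaw-style outer-product memory for sparse binary patterns. It is only a textbook illustration of storing sparse input-target associations; it is not the construction or the capacity analysis from the paper, and the sizes below are arbitrary.

```python
# Willshaw-style associative memory for K sparse binary input-target pairs.
import numpy as np

rng = np.random.default_rng(0)
n, m, K = 200, 200, 30          # input size, target size, number of memories
k_x, k_y = 10, 5                # active units per pattern (sparse activity)

def sparse_patterns(num, dim, k):
    P = np.zeros((num, dim), dtype=int)
    for p in P:
        p[rng.choice(dim, size=k, replace=False)] = 1
    return P

X, Y = sparse_patterns(K, n, k_x), sparse_patterns(K, m, k_y)

# Store: clip the sum of outer products to binary synaptic weights.
W = (Y.T @ X > 0).astype(int)

# Recall: a target unit fires if all k_x active inputs of x connect to it.
recalled = (W @ X.T >= k_x).astype(int).T
print("exact recalls:", int((recalled == Y).all(axis=1).sum()), "of", K)
```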

Memory capacity of neural networks with threshold and ReLU activations

no code implementations • 20 Jan 2020 • Roman Vershynin

Overwhelming theoretical and empirical evidence shows that mildly overparametrized neural networks -- those with more connections than the size of the training data -- are often able to memorize the training data with $100\%$ accuracy.

Open-Ended Question Answering
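As a toy illustration of the memorization phenomenon described above (not the construction analyzed in the paper), the following numpy sketch fits random labels with a one-hidden-layer network of frozen random ReLU features whose width exceeds the number of training points; the least-squares readout then generically interpolates, i.e. reaches 100% training accuracy.

```python
# Overparametrized random-feature network memorizing random labels.
import numpy as np

rng = np.random.default_rng(0)
n_samples, dim, width = 100, 20, 200      # width > n_samples: overparametrized

X = rng.normal(size=(n_samples, dim))
y = rng.choice([-1.0, 1.0], size=n_samples)          # random labels

W = rng.normal(size=(dim, width))                     # random, frozen first layer
H = np.maximum(X @ W, 0.0)                            # ReLU hidden features
v, *_ = np.linalg.lstsq(H, y, rcond=None)             # least-squares readout

train_acc = np.mean(np.sign(H @ v) == y)
print(f"training accuracy: {train_acc:.0%}")          # typically 100%
```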

Online Stochastic Gradient Descent with Arbitrary Initialization Solves Non-smooth, Non-convex Phase Retrieval

no code implementations • 28 Oct 2019 • Yan Shuo Tan, Roman Vershynin

In recent literature, a general two-step procedure has been formulated for solving the problem of phase retrieval.

Retrieval
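The usual two-step recipe is a spectral initialization followed by local refinement. The real-valued numpy sketch below illustrates only the first step (the refinement step is sketched under the randomized Kaczmarz paper further down); it is an illustration of the standard recipe, not the exact variant analyzed in the paper.

```python
# Spectral initialization for real-valued phase retrieval.
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 400
x_true = rng.normal(size=n)
x_true /= np.linalg.norm(x_true)

A = rng.normal(size=(m, n))                  # Gaussian measurement vectors a_i
b = np.abs(A @ x_true)                       # phaseless measurements |<a_i, x>|

# The top eigenvector of (1/m) * sum_i b_i^2 a_i a_i^T concentrates near +/- x_true.
S = (A * (b ** 2)[:, None]).T @ A / m
eigvals, eigvecs = np.linalg.eigh(S)
x0 = eigvecs[:, -1]                          # leading eigenvector (unit norm)

print("correlation with truth:", abs(x0 @ x_true))   # approaches 1 as m/n grows
```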

The capacity of feedforward neural networks

no code implementations • 2 Jan 2019 • Pierre Baldi, Roman Vershynin

Here we define the capacity of an architecture by the binary logarithm of the number of functions it can compute, as the synaptic weights are varied.
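To make the definition concrete, the brute-force sketch below estimates the capacity of the simplest architecture, a single linear threshold neuron on three binary inputs, by sampling random weights and counting the distinct Boolean functions realized. Sampling only lower-bounds the count, though it typically recovers all 104 threshold functions of three variables, i.e. about 6.7 bits. This is an illustrative estimate, not the paper's counting argument.

```python
# Capacity = log2(number of computable functions), estimated for one threshold neuron.
import itertools
import numpy as np

rng = np.random.default_rng(0)
inputs = np.array(list(itertools.product([0, 1], repeat=3)))   # all 8 binary inputs

functions = set()
for _ in range(200_000):
    w = rng.normal(size=3)
    bias = rng.normal()
    truth_table = tuple((inputs @ w + bias > 0).astype(int))    # the function computed
    functions.add(truth_table)

print(f"distinct functions: {len(functions)}, capacity ~ {np.log2(len(functions)):.2f} bits")
```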

On Neuronal Capacity

no code implementations • NeurIPS 2018 • Pierre Baldi, Roman Vershynin

We define the capacity of a learning machine to be the logarithm of the number (or volume) of the functions it can implement.

Phase Retrieval via Randomized Kaczmarz: Theoretical Guarantees

no code implementations • 30 Jun 2017 • Yan Shuo Tan, Roman Vershynin

We consider the problem of phase retrieval, i.e., that of solving systems of quadratic equations.

Retrieval
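Below is a real-valued numpy sketch of a randomized Kaczmarz iteration for phase retrieval, started near the truth (the theory requires a good initialization, e.g. a spectral one like the sketch above). The details are simplified relative to the paper's setting and analysis.

```python
# Randomized Kaczmarz refinement for real-valued phase retrieval.
import numpy as np

rng = np.random.default_rng(1)
n, m = 50, 500
x_true = rng.normal(size=n)
A = rng.normal(size=(m, n))
b = np.abs(A @ x_true)                        # phaseless measurements |<a_i, x>|

x = x_true + 0.1 * rng.normal(size=n)         # warm start standing in for a spectral init
for _ in range(20_000):
    i = rng.integers(m)
    a = A[i]
    r = a @ x
    # Kaczmarz step: project onto the hyperplane <a_i, x> = sign(<a_i, x>) * b_i
    x += (np.sign(r) * b[i] - r) / (a @ a) * a

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```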

Polynomial Time and Sample Complexity for Non-Gaussian Component Analysis: Spectral Methods

no code implementations • 4 Apr 2017 • Yan Shuo Tan, Roman Vershynin

The problem of Non-Gaussian Component Analysis (NGCA) is about finding a maximal low-dimensional subspace $E$ in $\mathbb{R}^n$ so that data points projected onto $E$ follow a non-Gaussian distribution.
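One simple spectral heuristic for NGCA on whitened data, shown in the numpy sketch below, compares the reweighted second-moment matrix to its Gaussian value: for a standard Gaussian $x$ in $\mathbb{R}^d$, $\mathbb{E}[\|x\|^2 x x^T] = (d+2)I$, so eigenvalue deviations from $d+2$ flag non-Gaussian directions. This is a generic illustration, not necessarily the estimator analyzed in the paper.

```python
# Spectral heuristic for NGCA: eigenvalue deviations of (1/N) sum ||x||^2 x x^T from (d+2) I.
import numpy as np

rng = np.random.default_rng(0)
d, N = 10, 200_000

X = rng.normal(size=(N, d))
X[:, 0] = rng.uniform(-np.sqrt(3), np.sqrt(3), size=N)   # planted non-Gaussian coordinate, unit variance

M = (X * (X ** 2).sum(axis=1, keepdims=True)).T @ X / N  # (1/N) sum ||x||^2 x x^T
eigvals, eigvecs = np.linalg.eigh(M - (d + 2) * np.eye(d))

# The eigenvector with the largest |eigenvalue| should align with the planted direction e_1.
top = eigvecs[:, np.argmax(np.abs(eigvals))]
print("alignment with the planted direction:", abs(top[0]))
```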

Optimization via Low-rank Approximation for Community Detection in Networks

no code implementations • 31 May 2014 • Can M. Le, Elizaveta Levina, Roman Vershynin

Community detection is one of the fundamental problems of network analysis, for which a number of methods have been proposed.

Community Detection
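A generic rank-2 spectral sketch on a small two-block stochastic block model gives a feel for the low-rank idea: cluster nodes by the sign of the second leading eigenvector of the adjacency matrix. This illustrates the low-rank approximation viewpoint only, not the specific optimization scheme proposed in the paper.

```python
# Spectral community detection on a two-block stochastic block model.
import numpy as np

rng = np.random.default_rng(0)
n, p_in, p_out = 200, 0.10, 0.02
labels = np.repeat([0, 1], n // 2)                       # two planted communities

P = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                              # symmetric adjacency, no self-loops

eigvals, eigvecs = np.linalg.eigh(A)
v2 = eigvecs[:, -2]                                      # second leading eigenvector
pred = (v2 > 0).astype(int)

acc = max(np.mean(pred == labels), np.mean(pred != labels))  # communities are defined up to a swap
print(f"recovered {acc:.0%} of nodes correctly")
```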
