Search Results for author: Marina Danilova

Found 8 papers, 3 papers with code

Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise

1 code implementation • 2 Jun 2022 • Eduard Gorbunov, Marina Danilova, David Dobre, Pavel Dvurechensky, Alexander Gasnikov, Gauthier Gidel

In this work, we prove the first high-probability complexity results with logarithmic dependence on the confidence level for stochastic methods for solving monotone and structured non-monotone VIPs with non-sub-Gaussian (heavy-tailed) noise and unbounded domains.
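The methods studied in this line of work combine extragradient-type updates for variational inequalities with clipping of the stochastic operator estimates. The sketch below is only an illustration of that idea under assumed names, not the authors' exact algorithm: `oracle`, `gamma`, `lam`, and `rng` stand in for a hypothetical noisy estimate of the VI operator, the step size, the clipping level, and a random generator.

```python
import numpy as np

def clip(g, lam):
    """Rescale g so that its Euclidean norm does not exceed lam."""
    norm = np.linalg.norm(g)
    return g if norm <= lam else (lam / norm) * g

def clipped_extragradient_step(x, oracle, gamma, lam, rng):
    """One extragradient-type step with clipped stochastic operator estimates.

    Illustrative sketch only: `oracle(x, rng)` is a hypothetical stochastic
    estimate of the VI operator F(x); gamma is the step size, lam the clipping level.
    """
    x_half = x - gamma * clip(oracle(x, rng), lam)      # extrapolation point
    return x - gamma * clip(oracle(x_half, rng), lam)   # update from the original point
```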

Distributed Methods with Absolute Compression and Error Compensation

no code implementations • 4 Mar 2022 • Marina Danilova, Eduard Gorbunov

Communication compression is a powerful approach to alleviating the communication bottleneck in distributed optimization, and methods with biased compression and error compensation, in particular, are extremely popular due to their practical efficiency.

Distributed Optimization

Near-Optimal High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise

1 code implementation • 10 Jun 2021 • Eduard Gorbunov, Marina Danilova, Innokentiy Shibaev, Pavel Dvurechensky, Alexander Gasnikov

In our paper, we resolve this issue and derive the first high-probability convergence results with logarithmic dependence on the confidence level for non-smooth convex stochastic optimization problems with non-sub-Gaussian (heavy-tailed) noise.

Stochastic Optimization

Gradient Clipping Helps in Non-Smooth Stochastic Optimization with Heavy-Tailed Noise

no code implementations • NeurIPS 2021 • Eduard Gorbunov, Marina Danilova, Innokentiy Andreevich Shibaev, Pavel Dvurechensky, Alexander Gasnikov

In our paper, we resolve this issue and derive the first high-probability convergence results with logarithmic dependence on the confidence level for non-smooth convex stochastic optimization problems with non-sub-Gaussian (heavy-tailed) noise.

Stochastic Optimization

Recent Theoretical Advances in Non-Convex Optimization

no code implementations • 11 Dec 2020 • Marina Danilova, Pavel Dvurechensky, Alexander Gasnikov, Eduard Gorbunov, Sergey Guminov, Dmitry Kamzolov, Innokentiy Shibaev

For this setting, we first present known results for the convergence rates of deterministic first-order methods, which are then followed by a general theoretical analysis of optimal stochastic and randomized gradient schemes, and an overview of the stochastic first-order methods.

Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping

1 code implementation • NeurIPS 2020 • Eduard Gorbunov, Marina Danilova, Alexander Gasnikov

In this paper, we propose a new accelerated stochastic first-order method, called clipped-SSTM, for smooth convex stochastic optimization with heavy-tailed noise in the stochastic gradients, and we derive the first high-probability complexity bounds for this method, closing the gap in the theory of stochastic optimization with heavy-tailed noise.

Stochastic Optimization
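At the core of clipped-SSTM is clipping of the stochastic gradient before it is used in an accelerated update. The snippet below is a minimal, hedged sketch of the clipping step alone applied in a plain gradient step, not the full accelerated method; `stochastic_grad`, `gamma`, `lam`, and `rng` are assumed, illustrative names.

```python
import numpy as np

def clipped_sgd_step(x, stochastic_grad, gamma, lam, rng):
    """One gradient step with a clipped stochastic gradient (illustrative only).

    `stochastic_grad(x, rng)` is a hypothetical noisy gradient oracle (e.g. with
    heavy-tailed noise); gamma is the step size and lam is the clipping level.
    clipped-SSTM applies this kind of clipping inside an accelerated scheme,
    which is not reproduced here.
    """
    g = stochastic_grad(x, rng)
    norm = np.linalg.norm(g)
    g_clipped = g if norm <= lam else (lam / norm) * g  # rescale overly large gradients
    return x - gamma * g_clipped
```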
