no code implementations • 5 Sep 2023 • Nikita Doikov, Geovani Nunes Grapiglia
In this work, we develop first-order (Hessian-free) and zero-order (derivative-free) implementations of the Cubically regularized Newton method for solving general non-convex optimization problems.
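As a rough illustration of the first-order (Hessian-free) variant, the sketch below approximates the Hessian by finite differences of gradients and solves the cubic subproblem by bisection on the step radius. The names (fd_hessian, cubic_subproblem_step) are illustrative, the solver assumes a convex model, and the zero-order variant would additionally replace gradients by finite differences of function values; this is not the paper's actual implementation.

import numpy as np

def fd_hessian(grad, x, h=1e-6):
    """Finite-difference Hessian built from gradient calls only (Hessian-free)."""
    d, g0 = x.size, grad(x)
    H = np.empty((d, d))
    for i in range(d):
        e = np.zeros(d); e[i] = h
        H[:, i] = (grad(x + e) - g0) / h
    return 0.5 * (H + H.T)                       # symmetrize the approximation

def cubic_subproblem_step(g, H, M):
    """Minimize <g,s> + 0.5<Hs,s> + (M/6)||s||^3 via the stationarity condition
    (H + (M/2) r I) s = -g with r = ||s||, found by bisection on r.
    Assumes H is positive semidefinite (convex model)."""
    d = g.size
    s_of = lambda r: np.linalg.solve(H + 0.5 * M * r * np.eye(d), -g)
    hi = 1.0
    while np.linalg.norm(s_of(hi)) > hi:          # grow until the radius brackets the root
        hi *= 2.0
    lo = 0.0
    for _ in range(60):                           # bisection on the radius
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if np.linalg.norm(s_of(mid)) > mid else (lo, mid)
    return s_of(hi)

def cubic_newton_step_fd(grad, x, M):
    """One Hessian-free cubic Newton step: second derivatives are replaced
    by finite differences of gradients."""
    return x + cubic_subproblem_step(grad(x), fd_hessian(grad, x), M)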
no code implementations • 28 Aug 2023 • Nikita Doikov
In this paper, we show that for minimizing Quasi-Self-Concordant functions we can use instead the basic Newton Method with Gradient Regularization.
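A minimal sketch of such a gradient-regularized Newton iteration, assuming the regularizer is simply a multiple of the current gradient norm (the precise scaling in terms of the quasi-self-concordance parameter is an assumption here):

import numpy as np

def gradient_regularized_newton(grad, hess, x0, M=1.0, tol=1e-8, max_iter=100):
    """Basic Newton method with gradient regularization: each step adds a
    multiple of the identity, proportional to the current gradient norm,
    to the Hessian.  M stands in for the quasi-self-concordance parameter."""
    x = x0.copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        lam = M * np.linalg.norm(g)              # regularizer ~ gradient norm (assumed scaling)
        x = x + np.linalg.solve(hess(x) + lam * np.eye(x.size), -g)
    return x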
no code implementations • 30 May 2023 • Anastasia Koloskova, Nikita Doikov, Sebastian U. Stich, Martin Jaggi
In machine learning and neural network optimization, algorithms such as incremental gradient and shuffle SGD are popular because they minimize the number of cache misses and exhibit good practical convergence behavior.
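For concreteness, a minimal shuffle-SGD (random-reshuffling) loop; grad_i is an assumed per-sample gradient oracle, and the single permutation per epoch followed by a sequential pass is what makes the data access pattern cache-friendly.

import numpy as np

def shuffle_sgd(grad_i, x0, n, lr=0.1, epochs=10, seed=0):
    """Random-reshuffling SGD sketch: permute the data order once per epoch,
    then traverse it sequentially.  grad_i(x, i) returns the gradient of the
    i-th component function."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(epochs):
        for i in rng.permutation(n):             # one shuffle per epoch
            x = x - lr * grad_i(x, i)
    return x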
no code implementations • 24 Feb 2023 • Maria-Luiza Vladarean, Nikita Doikov, Martin Jaggi, Nicolas Flammarion
This paper studies first-order algorithms for solving fully composite optimization problems over convex and compact sets.
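As a generic illustration of a first-order method over a compact convex set (a plain conditional-gradient template, not the specific linearization algorithm of the paper), assuming access to a linear minimization oracle lmo over the feasible set:

import numpy as np

def frank_wolfe(grad, lmo, x0, iters=100):
    """Generic conditional-gradient loop over a compact convex set:
    lmo(g) is assumed to return argmin over the set of <g, v>."""
    x = x0.copy()
    for t in range(iters):
        v = lmo(grad(x))                         # linear minimization oracle call
        gamma = 2.0 / (t + 2.0)                  # standard open-loop step size
        x = (1 - gamma) * x + gamma * v
    return x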
no code implementations • 23 Feb 2023 • El Mahdi Chayti, Nikita Doikov, Martin Jaggi
Our helper framework offers the algorithm designer high flexibility for constructing and analyzing stochastic Cubic Newton methods, allowing batches of arbitrary size and the use of noisy and possibly biased estimates of the gradients and Hessians, and incorporating both variance reduction and lazy Hessian updates.
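A hedged sketch of how such a loop could be assembled: mini-batch gradient and Hessian estimators of independent sizes, with the Hessian refreshed only periodically (lazy updates). The per-sample oracles grad_i and hess_i are assumptions, variance reduction is omitted, and cubic_subproblem_step refers to the convex-model solver sketched earlier.

import numpy as np

def stochastic_cubic_newton(grad_i, hess_i, x0, n, M, batch_g=32, batch_h=8,
                            lazy_period=5, iters=100, seed=0):
    """Sketch: a fresh mini-batch gradient every iteration, a mini-batch
    Hessian only every `lazy_period` iterations; the two batch sizes are
    independent free parameters."""
    rng = np.random.default_rng(seed)
    x, H = x0.copy(), None
    for t in range(iters):
        idx = rng.choice(n, size=batch_g, replace=False)
        g = np.mean([grad_i(x, i) for i in idx], axis=0)
        if H is None or t % lazy_period == 0:    # lazy Hessian refresh
            idx = rng.choice(n, size=batch_h, replace=False)
            H = np.mean([hess_i(x, i) for i in idx], axis=0)
        x = x + cubic_subproblem_step(g, H, M)   # solver from the sketch above
    return x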
no code implementations • 30 Jan 2023 • Nikita Doikov, Anton Rodomanov
We study first-order methods with preconditioning for solving structured nonlinear convex optimization problems.
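To fix notation, a generic preconditioned gradient step $x \gets x - \alpha P^{-1}\nabla f(x)$ with a fixed symmetric positive-definite matrix $P$; this is a sketch of the general template only, not the paper's specific preconditioners.

import numpy as np

def preconditioned_gradient_descent(grad, x0, P, lr=1.0, iters=100, tol=1e-8):
    """Gradient descent with a fixed preconditioner P: each step solves
    P d = grad f(x) and moves along -d."""
    x = x0.copy()
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        x = x - lr * np.linalg.solve(P, g)       # preconditioned direction
    return x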
no code implementations • 1 Dec 2022 • Nikita Doikov, El Mahdi Chayti, Martin Jaggi
This provably improves the total arithmetical complexity of second-order algorithms by a factor of $\sqrt{d}$.
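The lazy-update pattern itself is easy to sketch: recompute the Hessian once every $m$ iterations and reuse it in between, with a fresh gradient at every step (choosing $m$ on the order of $d$, to amortize a Hessian evaluation that costs roughly $d$ times a gradient, is an assumption here). The cubic model is solved with the routine sketched above.

import numpy as np

def lazy_cubic_newton(grad, hess, x0, M, m, iters=100):
    """Cubic Newton with lazy Hessians: one Hessian evaluation per m
    iterations, a fresh gradient every iteration (sketch only)."""
    x, H = x0.copy(), None
    for t in range(iters):
        if t % m == 0:
            H = hess(x)                          # expensive call, done lazily
        x = x + cubic_subproblem_step(grad(x), H, M)
    return x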
no code implementations • 11 Aug 2022 • Nikita Doikov, Konstantin Mishchenko, Yurii Nesterov
We analyze the performance of a variant of the Newton method with quadratic regularization for solving composite convex minimization problems.
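For intuition, a basic version of such a regularized Newton iteration on the smooth part, with the damping proportional to the square root of the gradient norm, a standard choice in this line of work; the constant H_reg and the omission of the nonsmooth composite term (which would require an extra proximal step) are simplifications here.

import numpy as np

def quad_regularized_newton(grad, hess, x0, H_reg=1.0, tol=1e-8, iters=100):
    """Newton step damped by lambda ~ sqrt(H_reg * ||grad||); the composite
    (nonsmooth) part of the objective is not handled in this sketch."""
    x = x0.copy()
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        lam = np.sqrt(H_reg * np.linalg.norm(g))
        x = x - np.linalg.solve(hess(x) + lam * np.eye(x.size), g)
    return x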
no code implementations • ICML 2020 • Filip Hanzely, Nikita Doikov, Peter Richtárik, Yurii Nesterov
In this paper, we propose a new randomized second-order optimization algorithm, Stochastic Subspace Cubic Newton (SSCN), for minimizing a high-dimensional convex function $f$.
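A sketch of one SSCN-style step over a random coordinate block: restrict the gradient and Hessian to the sampled coordinates and take a cubic-regularized step there. The sampling scheme and the call to cubic_subproblem_step (from the earlier sketch) are illustrative; in practice only the restricted derivatives would be computed, which is the point of the method.

import numpy as np

def sscn_like_step(grad, hess, x, M, tau, rng):
    """One subspace cubic Newton step over a random coordinate block of size
    tau (sketch; a real implementation would evaluate only the restricted
    block of derivatives instead of slicing the full ones)."""
    d = x.size
    S = rng.choice(d, size=tau, replace=False)   # random coordinate block
    g = grad(x)[S]
    H = hess(x)[np.ix_(S, S)]
    s = cubic_subproblem_step(g, H, M)           # convex-model solver sketched above
    x_new = x.copy()
    x_new[S] += s
    return x_new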
1 code implementation • ICML 2020 • Nikita Doikov, Yurii Nesterov
In this paper, we study inexact high-order Tensor Methods for solving convex optimization problems with a composite objective.
no code implementations • ICML 2018 • Nikita Doikov, Peter Richtárik
To this end, we propose and analyze a randomized block cubic Newton (RBCN) method, which in each iteration builds a model of the objective function formed as the sum of the natural models of its three components: a linear model with a quadratic regularizer for the differentiable term, a quadratic model with a cubic regularizer for the twice-differentiable term, and a perfect (proximal) model for the nonsmooth term.
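In symbols, writing the objective as $F(x) = g(x) + \phi(x) + \psi(x)$ with $g$ differentiable, $\phi$ twice differentiable and $\psi$ nonsmooth but proximable, the per-iteration model takes roughly the form below (restriction to the sampled block omitted; $L$ and $M$ stand for the corresponding Lipschitz constants):

$$m_x(h) \,=\, g(x) + \langle \nabla g(x), h \rangle + \tfrac{L}{2}\|h\|^2 \,+\, \phi(x) + \langle \nabla \phi(x), h \rangle + \tfrac{1}{2}\langle \nabla^2 \phi(x) h, h \rangle + \tfrac{M}{6}\|h\|^3 \,+\, \psi(x+h).$$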