1 code implementation • 12 Jul 2023 • Benjamin Grimmer
This work establishes new convergence guarantees for gradient descent in smooth convex optimization via a computer-assisted analysis technique.
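To make the setting concrete, here is a minimal sketch of gradient descent on a smooth convex quadratic with a repeating step-size pattern. The particular pattern below is invented for illustration; the certified long-step sequences (some steps exceeding the classical 2/L limit) come from the paper's computer-assisted analysis, not from this code.

```python
import numpy as np

# Minimal sketch: gradient descent with a cyclic step-size pattern on a
# smooth convex quadratic. The pattern is illustrative only.

def gradient_descent(grad, x0, L, pattern, num_iters):
    x = x0.copy()
    for k in range(num_iters):
        h = pattern[k % len(pattern)] / L   # cycle through normalized steps
        x = x - h * grad(x)
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M.T @ M                                  # f(x) = 0.5 x^T A x, L = lambda_max(A)
L = np.linalg.eigvalsh(A).max()
f = lambda x: 0.5 * x @ A @ x

x0 = rng.standard_normal(20)
# Hypothetical pattern mixing short and long steps (the last exceeds 2/L).
x = gradient_descent(lambda x: A @ x, x0, L, [1.5, 1.5, 1.5, 2.9], 400)
print("f(x0) =", f(x0), "->", "f(x) =", f(x))
```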
no code implementations • 27 May 2023 • Benjamin Grimmer, Danlin Li
We consider (stochastic) subgradient methods for strongly convex but potentially nonsmooth, non-Lipschitz optimization.
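As a concrete instance, the sketch below runs the classical subgradient method with the standard 2/(mu*(k+1)) step sizes on a strongly convex, nonsmooth objective whose subgradients grow unboundedly (hence non-Lipschitz). The objective and parameters are illustrative; the paper's refined guarantees are not reproduced here.

```python
import numpy as np

# Sketch: subgradient method on f(x) = (mu/2)||x||^2 + ||x||_1,
# which is strongly convex, nonsmooth, and non-Lipschitz.

mu = 1.0
rng = np.random.default_rng(0)
x = 10.0 * rng.standard_normal(50)           # start far from the minimizer

def subgrad(x):
    return mu * x + np.sign(x)               # one subgradient of f at x

for k in range(1, 5001):
    x = x - (2.0 / (mu * (k + 1))) * subgrad(x)

print("final f(x) =", 0.5 * mu * x @ x + np.abs(x).sum())   # minimum is 0 at x = 0
```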
no code implementations • 9 Mar 2023 • Ning Liu, Benjamin Grimmer
We consider feasibility and constrained optimization problems defined over smooth and/or strongly convex sets.
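For intuition, here is a toy Frank-Wolfe run over a Euclidean ball, a canonical strongly convex set whose linear minimization oracle has a closed form. This is a generic illustration of optimizing over such sets, not the paper's algorithms or rates.

```python
import numpy as np

# Sketch: Frank-Wolfe over the ball {x : ||x|| <= r}, a strongly convex set.
# Strong convexity of the feasible region is known to accelerate
# Frank-Wolfe-type methods; this run only illustrates the setup.

rng = np.random.default_rng(0)
n, r = 30, 1.0
b = rng.standard_normal(n)                   # ||b|| > r, so the solution is on the boundary
f = lambda x: 0.5 * np.sum((x - b) ** 2)     # smooth objective
grad = lambda x: x - b

x = np.zeros(n)
for k in range(200):
    g = grad(x)
    s = -r * g / np.linalg.norm(g)           # linear minimization oracle on the ball
    gamma = 2.0 / (k + 2)                    # standard open-loop step size
    x = (1 - gamma) * x + gamma * s

print("f(x) =", f(x), " ||x|| =", np.linalg.norm(x))
```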
no code implementations • 20 Oct 2020 • Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni
Unlike nonconvex minimization, where gradient descent typically converges to a local minimizer, algorithms for nonconvex-nonconcave minimax optimization can have topologically different solution paths: sometimes converging to a solution, sometimes never converging and instead following a limit cycle, and sometimes diverging.
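A hand-built toy run can reproduce all three behaviors on a simple quadratic family. The objective, step size, and parameters below are invented for illustration; the paper's analysis, which centers on the proximal point method, is far more general.

```python
import numpy as np

# Simultaneous gradient descent-ascent (GDA) on
#   f(x, y) = (a/2) x^2 + x y - (a/2) y^2,
# minimizing over x and maximizing over y. Varying a changes the
# qualitative behavior of the iterates.

def gda(a, h=0.05, iters=2000):
    x, y = 1.0, 1.0
    for _ in range(iters):
        gx = a * x + y                        # df/dx
        gy = x - a * y                        # df/dy
        x, y = x - h * gx, y + h * gy         # descend in x, ascend in y
    return np.hypot(x, y)

print("a = +0.5 (converges):  |z| =", gda(+0.5))
# For a = 0 the continuous-time flow cycles exactly; discrete GDA
# slowly spirals outward, a discretization artifact.
print("a =  0.0 (cycles):     |z| =", gda(0.0))
print("a = -0.5 (diverges):   |z| =", gda(-0.5))
```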
no code implementations • 15 Jun 2020 • Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni
Critically, we show that this envelope not only smooths the objective but can also convexify and concavify it, depending on the level of interaction between the minimizing and maximizing variables.
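The convexification effect can be checked numerically by brute force. The sketch below evaluates a Moreau-type saddle envelope of an invented nonconvex-nonconcave quadratic with a strong interaction term and prints second differences along x, which come out positive. The function, regularization level, and grid are illustrative choices, not the paper's construction.

```python
import numpy as np

# Brute-force saddle envelope on a grid:
#   F(x, y) = min_u max_v  f(u, v) + (1/(2*lam))(x-u)^2 - (1/(2*lam))(y-v)^2
# for f(u, v) = -u^2/2 + 3 u v + v^2/2, which is concave in the
# minimizing variable and convex in the maximizing one, but has a
# strong interaction term 3 u v.

lam = 0.25
grid = np.linspace(-5, 5, 201)
U, V = np.meshgrid(grid, grid, indexing="ij")
f = -0.5 * U**2 + 3.0 * U * V + 0.5 * V**2

xs = np.linspace(-2, 2, 17)
y = 0.0
Fx = []
for x in xs:
    inner = f + (x - U) ** 2 / (2 * lam) - (y - V) ** 2 / (2 * lam)
    Fx.append(inner.max(axis=1).min())       # min over u of max over v

# Positive second differences indicate the envelope is convex in x here.
print("second differences in x:", np.round(np.diff(Fx, 2), 4))
```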
no code implementations • 11 Nov 2019 • Lijun Ding, Benjamin Grimmer
In this paper, we show that the bundle method can solve semidefinite programming problems with low-rank solutions without ever constructing a full matrix.
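For flavor, here is a minimal Kelley cutting-plane loop, a simplified relative of the bundle method, applied to a generic piecewise-linear convex function over a box. It only illustrates how a model built from value/subgradient pairs drives the iteration; the paper's low-rank treatment of semidefinite programs is not attempted.

```python
import numpy as np
from scipy.optimize import linprog

# Kelley cutting-plane sketch for f(x) = max_i (a_i . x + b_i) over a box.

rng = np.random.default_rng(0)
n, m = 5, 12
A, b = rng.standard_normal((m, n)), rng.standard_normal(m)

def oracle(x):
    """Return f(x) and one subgradient (gradient of an active piece)."""
    vals = A @ x + b
    i = int(np.argmax(vals))
    return vals[i], A[i]

x = np.zeros(n)
cuts = []                                     # pairs (g_j, g_j . x_j - f(x_j))
for it in range(30):
    fx, g = oracle(x)
    cuts.append((g, g @ x - fx))
    # LP over z = (x, t): minimize t subject to g_j . x - t <= g_j . x_j - f(x_j)
    A_ub = np.array([np.append(gj, -1.0) for gj, _ in cuts])
    b_ub = np.array([rhs for _, rhs in cuts])
    res = linprog(np.append(np.zeros(n), 1.0), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(-1, 1)] * n + [(None, None)])
    x = res.x[:n]                             # next iterate: model minimizer

print("model value:", res.x[-1], " true f(x):", oracle(x)[0])
```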
no code implementations • 12 Dec 2017 • Benjamin Grimmer
We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions.
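The sketch below runs a normalized subgradient method on a convex objective that is not globally Lipschitz, one classical safeguard in this regime. The objective, normalization, and step sizes are illustrative; the paper's actual assumptions, step-size rules, and rates are not reproduced by this toy run.

```python
import numpy as np

# Normalized subgradient method on f(x) = ||x||^3 / 3 + ||x||_1,
# a convex function whose subgradients grow like ||x||^2 (non-Lipschitz).

rng = np.random.default_rng(0)
x = 3.0 * rng.standard_normal(20)

def subgrad(x):
    return np.linalg.norm(x) * x + np.sign(x)

for k in range(1, 5001):
    g = subgrad(x)
    gnorm = np.linalg.norm(g)
    if gnorm == 0:                            # already at the minimizer x = 0
        break
    x = x - (0.5 / np.sqrt(k)) * g / gnorm    # unit-length step direction

print("f(x) =", np.linalg.norm(x) ** 3 / 3 + np.abs(x).sum())
```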
no code implementations • 12 Jul 2017 • Damek Davis, Benjamin Grimmer
In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions, a wide class that includes the additive and convex composite classes.
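A minimal sketch of a stochastic projected subgradient method on one such weakly convex instance, robust phase retrieval, is below. The projection set (a Euclidean ball), step sizes, and near-signal initialization are illustrative choices; this is not the paper's proximally guided scheme.

```python
import numpy as np

# Stochastic projected subgradient method on robust phase retrieval,
#   f(x) = (1/m) * sum_i | (a_i . x)^2 - b_i |,
# a member of the convex-composite (hence weakly convex) class.

rng = np.random.default_rng(0)
n, m, R = 10, 200, 10.0
x_true = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = (A @ x_true) ** 2

def stoch_subgrad(x):
    i = rng.integers(m)                        # sample one measurement
    r = (A[i] @ x) ** 2 - b[i]
    return np.sign(r) * 2.0 * (A[i] @ x) * A[i]

x = x_true + 0.3 * rng.standard_normal(n)      # start near the signal
for k in range(1, 20001):
    x = x - (0.05 / np.sqrt(k)) * stoch_subgrad(x)
    nrm = np.linalg.norm(x)
    if nrm > R:                                # project onto the ball of radius R
        x *= R / nrm

print("f(x) =", np.mean(np.abs((A @ x) ** 2 - b)))
```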