Search Results for author: Benjamin Grimmer

Found 8 papers, 1 paper with code

Provably Faster Gradient Descent via Long Steps

1 code implementation · 12 Jul 2023 · Benjamin Grimmer

This work establishes new convergence guarantees for gradient descent in smooth convex optimization via a computer-assisted analysis technique.
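The idea can be illustrated with a toy sketch: gradient descent on a smooth convex quadratic, cycling a "long" step above the classical 2/L threshold with a compensating short step. The alternating pattern below is illustrative only, not one of the paper's computer-verified stepsize sequences.

```python
# Toy sketch of the long-step idea (illustrative pattern, NOT one of the
# paper's certified sequences): gradient descent on a diagonal convex
# quadratic f(x) = 0.5 * sum(lam_i * x_i**2) with curvatures in (0, L].

def gradient_descent(lams, x0, stepsizes, iters):
    """Run gradient descent, cycling through the given stepsize pattern."""
    x = list(x0)
    for k in range(iters):
        h = stepsizes[k % len(stepsizes)]
        x = [xi - h * lam * xi for lam, xi in zip(lams, x)]  # grad f_i = lam_i * x_i
    return x

L = 1.0                      # smoothness constant (largest curvature)
lams = [0.1, 0.5, 1.0]       # curvatures, all in (0, L]
x0 = [1.0, 1.0, 1.0]

f = lambda x: 0.5 * sum(lam * xi * xi for lam, xi in zip(lams, x))

short = gradient_descent(lams, x0, [1.0 / L], 200)            # textbook constant 1/L
cyclic = gradient_descent(lams, x0, [2.5 / L, 0.5 / L], 200)  # long/short cycle

print(f(short), f(cyclic))   # both decay toward the minimum value 0
```

For this spectrum the long/short cycle still contracts every eigendirection (the per-cycle factor (1 - 2.5λ/L)(1 - 0.5λ/L) has magnitude below 1 for all λ in (0, L]), even though the long step alone would diverge on the stiffest direction.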

Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization

no code implementations · 27 May 2023 · Benjamin Grimmer, Danlin Li

We consider (stochastic) subgradient methods for strongly convex but potentially nonsmooth non-Lipschitz optimization.
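A minimal sketch of the setting (not the paper's primal-dual analysis): a deterministic subgradient method with the classical strongly convex stepsize rule α_k = 2/(μ(k+1)), applied to a one-dimensional function that is strongly convex, nonsmooth, and non-Lipschitz.

```python
# Minimal sketch (standard method, not the paper's primal-dual machinery):
# subgradient descent on f(x) = 0.5 * x**2 + |x|, which is 1-strongly convex,
# nonsmooth at 0, and non-Lipschitz on R (subgradients x + sign(x) are unbounded).

def subgradient_method(x0, iters, mu=1.0):
    x = x0
    for k in range(iters):
        g = x + (1.0 if x > 0 else -1.0 if x < 0 else 0.0)  # a subgradient of f
        alpha = 2.0 / (mu * (k + 1))   # classical strongly convex stepsize rule
        x -= alpha * g
    return x

x = subgradient_method(x0=5.0, iters=10000)
print(abs(x))   # approaches the unique minimizer x* = 0
```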

Gauges and Accelerated Optimization over Smooth and/or Strongly Convex Sets

no code implementations · 9 Mar 2023 · Ning Liu, Benjamin Grimmer

We consider feasibility and constrained optimization problems defined over smooth and/or strongly convex sets.

Limiting Behaviors of Nonconvex-Nonconcave Minimax Optimization via Continuous-Time Systems

no code implementations · 20 Oct 2020 · Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni

Unlike nonconvex optimization, where gradient descent is guaranteed to converge to a local optimizer, algorithms for nonconvex-nonconcave minimax optimization can have topologically different solution paths: sometimes converging to a solution, sometimes never converging and instead following a limit cycle, and sometimes diverging.
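Non-convergent minimax dynamics are easy to exhibit even on a bilinear toy, which is convex-concave and thus far simpler than the paper's nonconvex-nonconcave setting: simultaneous gradient descent-ascent on f(x, y) = x·y spirals away from the saddle point at the origin (the continuous-time flow orbits it in a perfect cycle).

```python
# Standard bilinear demo (not an example from the paper): simultaneous
# gradient descent-ascent on f(x, y) = x * y moves away from the saddle
# at the origin; each step multiplies the radius by sqrt(1 + eta**2).

def gda(x, y, eta, iters):
    for _ in range(iters):
        gx, gy = y, x                       # grad_x f = y, grad_y f = x
        x, y = x - eta * gx, y + eta * gy   # descent in x, ascent in y
    return x, y

x, y = gda(1.0, 0.0, eta=0.1, iters=100)
r = (x * x + y * y) ** 0.5
print(r)   # radius has grown from 1, so the iterates spiral outward
```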

The Landscape of the Proximal Point Method for Nonconvex-Nonconcave Minimax Optimization

no code implementations · 15 Jun 2020 · Benjamin Grimmer, Haihao Lu, Pratik Worah, Vahab Mirrokni

Critically, we show this envelope not only smooths the objective but can convexify and concavify it based on the level of interaction present between the minimizing and maximizing variables.
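The stabilizing effect of proximal regularization can be sketched on a classical bilinear example (again convex-concave, not the paper's nonconvex-nonconcave setting): for min_x max_y x·y, the exact proximal point update is the implicit system x⁺ = x − η·y⁺, y⁺ = y + η·x⁺, whose closed form contracts the iterates toward the saddle, whereas the explicit gradient descent-ascent update spirals outward.

```python
# Classical bilinear sketch (not the paper's envelope analysis): the exact
# proximal point step for min_x max_y x * y solves the implicit system
#   x+ = x - eta * y+,   y+ = y + eta * x+,
# which has the closed form below and shrinks the radius by 1/sqrt(1 + eta**2).

def proximal_point(x, y, eta, iters):
    for _ in range(iters):
        d = 1.0 + eta * eta
        # closed-form solution of the implicit update (both use old x, y)
        x, y = (x - eta * y) / d, (y + eta * x) / d
    return x, y

x, y = proximal_point(1.0, 0.0, eta=0.1, iters=100)
r = (x * x + y * y) ** 0.5
print(r)   # radius has shrunk from 1: the iterates contract to the saddle
```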

Bundle Method Sketching for Low Rank Semidefinite Programming

no code implementations · 11 Nov 2019 · Lijun Ding, Benjamin Grimmer

In this paper, we show that the bundle method can be applied to solve semidefinite programming problems with a low rank solution without ever constructing a full matrix.

Convergence Rates for Deterministic and Stochastic Subgradient Methods Without Lipschitz Continuity

no code implementations · 12 Dec 2017 · Benjamin Grimmer

We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions.

Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

no code implementations · 12 Jul 2017 · Damek Davis, Benjamin Grimmer

In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions, a wide class that includes the additive and convex composite classes.
