no code implementations • 31 Mar 2024 • Itai Kreisler, Maor Ivgi, Oliver Hinder, Yair Carmon
We propose a method that achieves near-optimal rates for smooth stochastic convex optimization and requires essentially no prior knowledge of problem parameters.
no code implementations • 22 May 2023 • Itai Kreisler, Mor Shpigel Nacson, Daniel Soudry, Yair Carmon
Using this result, we characterize settings where GD provably converges to the Edge of Stability (EoS) in scalar networks.