no code implementations • 11 Mar 2024 • Xufeng Cai, Jelena Diakonikolas
We further obtain generalizations of our results to weighted averaging of the iterates with increasing weights, which can be seen as interpolating between the last-iterate and average-iterate guarantees.
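A minimal sketch of such weighted averaging, assuming (as an illustration only; the abstract does not specify the scheme) weights proportional to a power $t^p$ of the iteration count: $p = 0$ recovers the uniform average, while large $p$ concentrates the weight on later iterates and approaches the last iterate.

```python
import numpy as np

def weighted_average(iterates, p):
    """Average iterates x_1..x_T with increasing weights w_t = t**p.

    Hypothetical weighting for illustration: p = 0 gives the uniform
    average; as p grows, the result approaches the last iterate.
    """
    T = len(iterates)
    weights = np.array([(t + 1) ** p for t in range(T)], dtype=float)
    weights /= weights.sum()
    return sum(w * x for w, x in zip(weights, np.asarray(iterates, dtype=float)))

xs = np.arange(1.0, 6.0)           # toy iterates 1, 2, 3, 4, 5
print(weighted_average(xs, 0))     # uniform average -> 3.0
print(weighted_average(xs, 50))    # weight concentrates near the last iterate, 5
```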
no code implementations • 4 Oct 2023 • Xufeng Cai, Ahmet Alacaoglu, Jelena Diakonikolas
Our main contributions are variants of the classical Halpern iteration that employ variance reduction to obtain improved complexity guarantees in settings where the $n$ component operators in the finite sum are "on average" either cocoercive or Lipschitz continuous and monotone, with parameter $L$.
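For context, the classical (deterministic) Halpern iteration that these variants build on anchors each step to the initial point, $x_{k+1} = \beta_k x_0 + (1-\beta_k)\, T(x_k)$ with, e.g., $\beta_k = 1/(k+2)$. A minimal sketch on a toy operator (the paper's variance-reduced finite-sum variants are not reproduced here):

```python
import numpy as np

def halpern(T, x0, iters=2000):
    """Classical Halpern iteration with anchoring weights beta_k = 1/(k+2):
    x_{k+1} = beta_k * x0 + (1 - beta_k) * T(x_k)."""
    x = np.asarray(x0, dtype=float)
    anchor = x.copy()
    for k in range(iters):
        beta = 1.0 / (k + 2)
        x = beta * anchor + (1.0 - beta) * T(x)
    return x

# Toy operator: T(x) = 0.5 x + 1, a contraction with fixed point x* = 2.
T = lambda x: 0.5 * x + 1.0
print(halpern(T, np.array([10.0])))  # converges toward [2.]
```

The anchoring term is what distinguishes Halpern iteration from the plain fixed-point iteration $x_{k+1} = T(x_k)$ and is the source of its last-iterate guarantees for nonexpansive operators.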
no code implementations • 21 Jun 2023 • Xufeng Cai, Cheuk Yin Lin, Jelena Diakonikolas
Contrary to the empirical practice of sampling from datasets without replacement, with (possible) reshuffling at each epoch, theoretical analyses of SGD usually rely on the assumption of sampling with replacement.
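The gap between the two sampling models is easy to see in code. With replacement, an epoch of $n$ draws can repeat some data points and miss others; random reshuffling visits every point exactly once per epoch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # number of data points

# Sampling WITH replacement (the common theoretical model): indices may
# repeat within an epoch, and some points may be skipped entirely.
with_repl = rng.integers(0, n, size=n)

# Random reshuffling (the common empirical practice): each epoch is a
# fresh random permutation, so every point is visited exactly once.
without_repl = rng.permutation(n)

print(sorted(with_repl))     # typically has repeats and gaps
print(sorted(without_repl))  # always [0, 1, ..., 7]
```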
no code implementations • 9 Dec 2022 • Xufeng Cai, Chaobing Song, Stephen J. Wright, Jelena Diakonikolas
Our convergence analysis is based on a gradient Lipschitz condition with respect to a Mahalanobis norm, inspired by recent progress on cyclic block coordinate methods.
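A minimal sketch of the cyclic block coordinate template on a least-squares toy problem, assuming fixed-order sweeps with a gradient step restricted to one coordinate block at a time (the Mahalanobis-norm analysis itself is not shown):

```python
import numpy as np

def cyclic_bcd(A, b, x0, blocks, step, iters=500):
    """Cyclic block coordinate gradient descent on f(x) = 0.5*||A x - b||^2.

    Each pass sweeps the coordinate blocks in a fixed cyclic order and
    takes a gradient step only on the current block.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        for blk in blocks:                 # fixed cyclic order over blocks
            grad = A.T @ (A @ x - b)       # gradient at the current point
            x[blk] -= step * grad[blk]     # update only this block
    return x

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])
b = np.array([3.0, 3.0, 3.0])
blocks = [np.array([0, 1]), np.array([2])]  # two coordinate blocks
x = cyclic_bcd(A, b, np.zeros(3), blocks, step=0.05)
print(x)  # approaches the solution [1., 1., 1.]
```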
1 code implementation • 17 Mar 2022 • Xufeng Cai, Chaobing Song, Cristóbal Guzmán, Jelena Diakonikolas
We study stochastic monotone inclusion problems, which appear widely in machine learning applications, including robust regression and adversarial learning.