no code implementations • 12 Jan 2024 • Lei Wang, Zihao Ren, Deming Yuan, Guodong Shi
We then employ this compressed consensus flow as a fundamental consensus subroutine to develop distributed continuous-time and discrete-time solvers for network linear equations, and prove their exponential convergence under scalar node communications.
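The idea of alternating neighbor averaging with enforcement of each node's local equation can be sketched as follows. This is a minimal illustrative sketch of a consensus-based solver for network linear equations, assuming each node knows one row of a solvable system; the compression to scalar communications is omitted, and all names, the graph, and the step size are assumptions for the example, not taken from the paper.

```python
import numpy as np

# Toy setup (assumed): each of 3 nodes i knows one row (a_i, b_i) of a
# solvable system A x = b; nodes sit on a line graph 0 - 1 - 2.
rng = np.random.default_rng(0)
x_star = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((3, 3))
b = A @ x_star
neighbors = {0: [1], 1: [0, 2], 2: [1]}


def project(x, a, beta):
    """Orthogonal projection of x onto the affine set {y : a^T y = beta}."""
    return x - a * (a @ x - beta) / (a @ a)


x = [np.zeros(3) for _ in range(3)]  # each node's local estimate
alpha = 0.3  # consensus gain (assumed)
for _ in range(5000):
    # Synchronous update: average with neighbors, then project onto the
    # node's own equation (the comprehension reads the old states).
    x = [
        project(x[i] + alpha * sum(x[j] - x[i] for j in neighbors[i]),
                A[i], b[i])
        for i in range(3)
    ]

print(max(np.linalg.norm(xi - x_star) for xi in x))  # all estimates near x*
```

The combination of a consensus step (which drives estimates together) and a local projection (which keeps each estimate feasible for that node's equation) converges geometrically here, consistent with the exponential rates the abstract describes for the full compressed scheme.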
no code implementations • 29 Sep 2021 • Deming Yuan, Lei Wang, Alexandre Proutiere, Guodong Shi
Zeroth-order optimization has become increasingly important in complex optimization and machine learning when cost functions cannot be described in closed analytical form.
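A standard way to optimize such a black-box cost is the randomized two-point finite-difference gradient surrogate, sketched below. This is an illustrative baseline for the zeroth-order setting, not the paper's algorithm; the cost function, step sizes, and names are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(2)


def f(x):
    """Black-box cost: only function evaluations are available."""
    return np.sum((x - 1.0) ** 2)


def zo_gradient(f, x, delta=1e-4):
    """Two-point zeroth-order gradient estimate along a random direction."""
    u = rng.standard_normal(x.shape)
    return (f(x + delta * u) - f(x - delta * u)) / (2 * delta) * u


# Plain gradient descent driven by the zeroth-order estimate.
x = np.zeros(3)
for _ in range(3000):
    x -= 0.01 * zo_gradient(f, x)

print(f(x))  # decreases toward the minimum at x = 1
```

Only function values of `f` are queried, never its gradient, which is the defining feature of the zeroth-order setting.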
no code implementations • 29 Sep 2021 • Lei Wang, Deming Yuan, Guodong Shi
In this paper, we study dataset processing mechanisms generated by linear queries in the presence of manifold data dependency.
no code implementations • 20 Dec 2019 • Deming Yuan, Alexandre Proutiere, Guodong Shi
When the loss functions are strongly convex, we establish improved regret and constraint violation upper bounds of $\mathcal{O}(\log(T))$ and $\mathcal{O}(\sqrt{T\log(T)})$, respectively.
no code implementations • 13 Feb 2019 • Deming Yuan, Alexandre Proutiere, Guodong Shi
We propose simple and natural distributed regression algorithms in which, at each node and in each round, a local gradient descent step is followed by a communication-and-averaging step where nodes align their predictors with those of their neighbors.
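The two-step structure described above — a local gradient step plus neighbor averaging — can be sketched for a two-node least-squares problem. This is a minimal hedged sketch under assumed data, step size, and network (two fully connected nodes); it illustrates the algorithmic pattern, not the paper's exact method or analysis.

```python
import numpy as np

# Toy problem (assumed): two nodes each hold noisy linear-regression data
# generated from a common ground-truth weight vector.
rng = np.random.default_rng(1)
w_true = np.array([2.0, -1.0])
data = []
for _ in range(2):
    X = rng.standard_normal((50, 2))
    y = X @ w_true + 0.01 * rng.standard_normal(50)
    data.append((X, y))

w = [np.zeros(2), np.zeros(2)]  # each node's predictor
eta = 0.05  # local gradient step size (assumed)
for _ in range(300):
    # Communication-and-averaging step: align with the neighbor's predictor.
    avg = (w[0] + w[1]) / 2
    # Local gradient descent step on each node's least-squares loss.
    w = [
        avg - eta * data[i][0].T @ (data[i][0] @ avg - data[i][1]) / 50
        for i in range(2)
    ]

print(w[0])  # both nodes' predictors approach w_true
```

Averaging pulls the predictors toward consensus while each local gradient step fits the node's own data, so both nodes recover the shared regression vector without ever pooling their datasets.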