Search Results for author: Dewei Zhang

Found 2 papers, 0 papers with code

Achieving Linear Speedup in Non-IID Federated Bilevel Learning

no code implementations • 10 Feb 2023 • Minhui Huang, Dewei Zhang, Kaiyi Ji

However, several important properties of federated learning, such as partial client participation and linear speedup for convergence (i.e., the convergence rate and complexity improve linearly with the number of sampled clients) in the presence of non-i.i.d. datasets, still remain open.
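As a generic illustration of the linear-speedup notion (a standard form from the federated-optimization literature, not necessarily this paper's exact rate): with $n$ sampled clients and $T$ communication rounds, a linear-speedup guarantee typically looks like $\min_{t \le T} \mathbb{E}\,\|\nabla F(x_t)\|^2 = O\!\left(1/\sqrt{nT}\right)$, so the number of rounds needed to reach accuracy $\epsilon$ scales as $O\!\left(1/(n\epsilon^2)\right)$ and shrinks linearly as more clients are sampled.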

Bilevel Optimization • Federated Learning

Riemannian Stochastic Gradient Method for Nested Composition Optimization

no code implementations • 19 Jul 2022 • Dewei Zhang, Sam Davanloo Tajbakhsh

For two-level composition optimization, we present a Riemannian Stochastic Composition Gradient Descent (R-SCGD) method that finds an approximate stationary point, with expected squared norm of the Riemannian gradient smaller than $\epsilon$, in $O(\epsilon^{-2})$ calls to the stochastic gradient oracle of the outer function and to the stochastic function and gradient oracles of the inner function.
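For context, the two-level problem the abstract refers to is usually written as follows (a generic formulation from the composition-optimization literature, not necessarily the paper's exact notation): $\min_{x \in \mathcal{M}} F(x) = f\big(g(x)\big)$, with $g(x) = \mathbb{E}_{\zeta}[g_{\zeta}(x)]$ and $f(y) = \mathbb{E}_{\xi}[f_{\xi}(y)]$, where $\mathcal{M}$ is a Riemannian manifold and only stochastic oracles of $f$ and $g$ are available; an approximate stationary point then satisfies $\mathbb{E}\,\|\mathrm{grad}\,F(x)\|^2 \le \epsilon$.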

Meta-Learning • reinforcement-learning • +1
