Search Results for author: Aixiang

Found 1 paper, 0 papers with code

MSTGD: A Memory Stochastic sTratified Gradient Descent Method with an Exponential Convergence Rate

no code implementations · 21 Feb 2022 · Aixiang Chen, Jinting Zhang, Zanbo Zhang, Zhihong Li

Current mainstream gradient optimization algorithms neglect or conflate the fluctuation of gradient expectation and variance caused by parameter updates between consecutive iterations. Exploiting this fluctuation effect, combined with a stratified sampling strategy, this paper designs a novel Memory Stochastic sTratified Gradient Descent (MSTGD) algorithm with an exponential convergence rate.
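The listing gives only this high-level description of MSTGD, so the sketch below is a minimal illustration of the two ingredients the abstract mentions: stratified mini-batch sampling and a memory of past per-stratum gradients used to damp the iteration-to-iteration fluctuation. The function name, memory rule, and hyperparameters (`lr`, `beta`, `batch_per_stratum`) are illustrative assumptions, not the authors' actual update.

```python
import numpy as np

def stratified_memory_sgd_step(w, grad_fn, strata, memory,
                               lr=0.1, batch_per_stratum=8, beta=0.9):
    """One hypothetical update combining stratified sampling with a gradient memory.

    w        : current parameter vector
    grad_fn  : grad_fn(w, batch_indices) -> gradient estimate on a mini-batch
    strata   : list of index arrays, one per stratum of the training set
    memory   : list of remembered gradients, one per stratum (updated in place)
    """
    total = sum(len(idx) for idx in strata)
    agg = np.zeros_like(w)
    for k, idx in enumerate(strata):
        # Draw a mini-batch from each stratum (stratified sampling).
        size = min(batch_per_stratum, len(idx))
        batch = np.random.choice(idx, size=size, replace=False)
        g = grad_fn(w, batch)
        # Blend the fresh stratum gradient with the remembered one to damp
        # the fluctuation of its expectation and variance across iterations.
        memory[k] = beta * memory[k] + (1.0 - beta) * g
        # Weight each stratum by its share of the data.
        agg += (len(idx) / total) * memory[k]
    return w - lr * agg


# Toy usage: least squares on data split into two strata.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 3)), rng.normal(size=100)
strata = [np.arange(0, 50), np.arange(50, 100)]
grad_fn = lambda w, b: 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
w = np.zeros(3)
memory = [np.zeros(3) for _ in strata]
for _ in range(200):
    w = stratified_memory_sgd_step(w, grad_fn, strata, memory)
```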
