27 Mar 2020 • Naeimeh Omidvar, Mohammad Ali Maddah-Ali, Hamed Mahdavi
In this paper, we propose a distributed stochastic gradient descent (SGD) method that achieves low communication load and low computational complexity while retaining fast convergence.
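For orientation, below is a minimal single-process sketch of plain data-parallel SGD, the baseline setting this line of work improves upon: each worker computes a stochastic gradient on its data shard and a server averages the full gradients. This is not the paper's low-communication scheme; all names, dimensions, and parameters here are illustrative assumptions.

```python
# Generic data-parallel SGD on a synthetic least-squares problem,
# simulated in one process. Illustrates the baseline only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data, split evenly across workers.
n_workers, n_samples, dim = 4, 400, 10
X = rng.normal(size=(n_samples, dim))
w_true = rng.normal(size=dim)
y = X @ w_true + 0.01 * rng.normal(size=n_samples)
shards = np.array_split(np.arange(n_samples), n_workers)

def local_gradient(w, idx, batch=32):
    """Stochastic gradient of 0.5*||Xw - y||^2 on one worker's shard."""
    sample = rng.choice(idx, size=batch, replace=False)
    Xb, yb = X[sample], y[sample]
    return Xb.T @ (Xb @ w - yb) / batch

w = np.zeros(dim)
lr = 0.1
for step in range(200):
    # Each worker sends its full (dense) gradient; the server averages.
    # The communication cost per step is n_workers * dim floats, which
    # is exactly what low-communication variants aim to reduce.
    grads = [local_gradient(w, idx) for idx in shards]
    w -= lr * np.mean(grads, axis=0)

print("final parameter error:", np.linalg.norm(w - w_true))
```

In this baseline, every worker transmits a full `dim`-dimensional gradient each iteration; schemes with low communication load typically compress, quantize, or code these messages before aggregation.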