no code implementations • 27 Mar 2020 • Naeimeh Omidvar, Mohammad Ali Maddah-Ali, Hamed Mahdavi
In this paper, we propose a distributed stochastic gradient descent (SGD) method with low communication load and low computational complexity that still converges quickly.
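A generic illustration of the setting (not the paper's specific method): in communication-efficient distributed SGD, each worker computes a local gradient on its data shard and transmits only a compressed version, e.g. its top-k largest-magnitude coordinates, which the server averages before updating the model. The linear-model objective, shard sizes, and `top_k` compression below are all illustrative assumptions.

```python
import random

# Sketch of communication-efficient distributed SGD (generic, not the
# paper's method): workers send top-k sparsified gradients; the server
# averages the sparse updates and takes a gradient step.

def grad(w, data):
    # gradient of mean squared error for a linear model y = <w, x>
    g = [0.0] * len(w)
    for x, y in data:
        err = sum(wi * xi for wi, xi in zip(w, x)) - y
        for j, xj in enumerate(x):
            g[j] += 2 * err * xj / len(data)
    return g

def top_k(g, k):
    # keep only the k largest-magnitude entries (reduces communication)
    idx = sorted(range(len(g)), key=lambda j: -abs(g[j]))[:k]
    return {j: g[j] for j in idx}

def distributed_sgd_step(w, shards, k, lr):
    # each worker sends a sparse gradient; the server averages them
    agg = [0.0] * len(w)
    for shard in shards:
        for j, gj in top_k(grad(w, shard), k).items():
            agg[j] += gj / len(shards)
    return [wi - lr * gi for wi, gi in zip(w, agg)]

# toy run: recover true weights [1, 2, 0] with 2 workers sending
# only their top-2 gradient coordinates each round
random.seed(0)
true_w = [1.0, 2.0, 0.0]
data = []
for _ in range(40):
    x = [random.gauss(0, 1) for _ in range(3)]
    data.append((x, sum(t * xi for t, xi in zip(true_w, x))))
shards = [data[:20], data[20:]]

w = [0.0, 0.0, 0.0]
for _ in range(200):
    w = distributed_sgd_step(w, shards, k=2, lr=0.1)
```

Top-k sparsification is just one of several compression schemes (quantization and sketching are common alternatives); the communication saving per round here is one dropped coordinate per worker.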
no code implementations • 5 Oct 2019 • Ahana Ghosh, Sebastian Tschiatschek, Hamed Mahdavi, Adish Singla
In the test phase, the AI agent has to interact with a user of unknown type.
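One standard way (a hedged sketch, not necessarily this paper's approach) to handle a user of unknown type is to maintain a Bayesian posterior over a set of candidate types and update it from the user's observed actions during interaction. The two user types and their action distributions below are hypothetical.

```python
# Hypothetical user types: each type is a distribution over actions.
USER_TYPES = {
    "cautious": {"left": 0.8, "right": 0.2},
    "risky":    {"left": 0.2, "right": 0.8},
}

def update_posterior(posterior, observed_action):
    # Bayes rule: P(type | action) proportional to P(action | type) * P(type)
    unnorm = {t: posterior[t] * USER_TYPES[t][observed_action]
              for t in posterior}
    z = sum(unnorm.values())
    return {t: p / z for t, p in unnorm.items()}

# start from a uniform prior and observe four user actions
posterior = {"cautious": 0.5, "risky": 0.5}
for action in ["right", "right", "left", "right"]:
    posterior = update_posterior(posterior, action)
```

After mostly "right" actions, the posterior concentrates on the "risky" type, and the agent can condition its policy on this belief.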