1 code implementation • NeurIPS 2021 • Hang Xu, Kelly Kostopoulou, Aritra Dutta, Xin Li, Alexandros Ntoulas, Panos Kalnis
DeepReduce is orthogonal to existing gradient sparsifiers and can be applied in conjunction with them, transparently to the end-user, to significantly lower the communication overhead.
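To illustrate what a gradient sparsifier does (the component DeepReduce composes with), here is a minimal sketch of Top-k sparsification, which keeps only the k largest-magnitude gradient entries. The function name and pure-Python style are illustrative assumptions, not the authors' implementation.

```python
def topk_sparsify(grad, k):
    """Hypothetical Top-k sparsifier: return (indices, values) of the k
    largest-magnitude entries of a flat gradient; all others are dropped.
    A framework like DeepReduce could then compress the surviving pairs."""
    # Rank positions by absolute gradient magnitude, descending.
    ranked = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)
    kept = sorted(ranked[:k])  # keep positions in ascending index order
    return kept, [grad[i] for i in kept]

indices, values = topk_sparsify([0.1, -3.0, 0.02, 2.5, -0.4], k=2)
# → indices [1, 3], values [-3.0, 2.5]
```

In practice the sparsifier runs on each worker's local gradient before communication, so only the surviving index/value pairs cross the network.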
1 code implementation • NeurIPS 2021 • Kelly Kostopoulou, Hang Xu, Aritra Dutta, Xin Li, Alexandros Ntoulas, Panos Kalnis
This paper introduces DeepReduce, a versatile framework for the compressed communication of sparse tensors, tailored for distributed deep learning.
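A sparse tensor is typically transmitted as sorted indices plus values, and the index stream itself is compressible. As a hedged sketch (one common index compressor, not necessarily the scheme DeepReduce uses), delta-encoding sorted indices turns them into small gaps that standard entropy coders handle well:

```python
def delta_encode(sorted_indices):
    """Replace each sorted index with its gap from the previous one."""
    prev, out = 0, []
    for i in sorted_indices:
        out.append(i - prev)
        prev = i
    return out

def delta_decode(deltas):
    """Invert delta_encode by running a cumulative sum over the gaps."""
    total, out = 0, []
    for d in deltas:
        total += d
        out.append(total)
    return out

deltas = delta_encode([3, 7, 8, 42])  # → [3, 4, 1, 34]
assert delta_decode(deltas) == [3, 7, 8, 42]
```

Because indices and values are handled as separate streams, a compressor specialized for integer gaps can be paired with a different one for floating-point values.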
no code implementations • 13 Jul 2013 • Alexandros Ntoulas, Omar Alonso, Vasilis Kandylas
As the number of applications that use machine learning algorithms grows, so does the need for labeled data to train those algorithms.