Interlocking Backpropagation: Improving depthwise model-parallelism

8 Oct 2020 · Aidan N. Gomez, Oscar Key, Stephen Gou, Nick Frosst, Jeff Dean, Yarin Gal

The number of parameters in state-of-the-art neural networks has drastically increased in recent years. This surge of interest in large-scale neural networks has motivated the development of new distributed training strategies enabling such models...
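
As context for the abstract, here is a minimal sketch of depthwise model-parallelism, the setting named in the title: consecutive groups of layers ("stages") are placed on different devices, with activations handed off between them. The stage sizes, device placement, and model shape are illustrative assumptions, not the paper's setup, and this shows plain pipeline-style placement rather than the interlocking backpropagation method itself.

```python
import torch
import torch.nn as nn

class TwoStageModel(nn.Module):
    """Depthwise model-parallel toy model: stage0 and stage1 can live on
    different devices. Devices default to CPU so the sketch runs anywhere;
    pass e.g. "cuda:0" / "cuda:1" to actually split across GPUs."""

    def __init__(self, dev0="cpu", dev1="cpu"):
        super().__init__()
        self.dev0, self.dev1 = dev0, dev1
        # Illustrative stage boundary: first half of the depth on dev0,
        # second half on dev1.
        self.stage0 = nn.Sequential(nn.Linear(32, 64), nn.ReLU()).to(dev0)
        self.stage1 = nn.Sequential(nn.Linear(64, 10)).to(dev1)

    def forward(self, x):
        h = self.stage0(x.to(self.dev0))
        # Activation handoff across the stage boundary.
        return self.stage1(h.to(self.dev1))

model = TwoStageModel()
loss = model(torch.randn(8, 32)).sum()
loss.backward()  # gradients flow back across the stage boundary
```

In this global-backprop form, each stage sits idle while the other computes, which is the resource-utilisation problem that depthwise model-parallel training strategies such as the one proposed here aim to address.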
