no code implementations • ICML Workshop AutoML 2021 • Julien Niklas Siems, Aaron Klein, Cedric Archambeau, Maren Mahsereci
Dynamic sparsity pruning removes this limitation, allowing the structure of the sparse neural network to adapt during training.
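A common realization of dynamic sparsity is a prune-and-regrow cycle: periodically drop the smallest-magnitude active weights and reactivate an equal number of inactive connections, so the sparse connectivity pattern evolves during training. The sketch below is a minimal, hedged illustration of that idea (not the paper's method); the function name, the random-regrowth criterion, and the regrow initialization value are assumptions for demonstration only.

```python
import numpy as np

def prune_and_regrow(weights, regrow_frac=0.1, rng=None):
    """One illustrative dynamic-sparsity step (hypothetical helper):
    zero out the smallest-magnitude active weights, then reactivate
    the same number of inactive positions, so overall sparsity is
    preserved while the connectivity pattern changes."""
    rng = np.random.default_rng() if rng is None else rng
    w = weights.copy()
    mag = np.abs(w).ravel()
    active_idx = np.flatnonzero(mag > 0)
    n_drop = int(regrow_frac * active_idx.size)
    # Prune: remove the n_drop weakest active connections.
    drop = active_idx[np.argsort(mag[active_idx])[:n_drop]]
    w.ravel()[drop] = 0.0
    # Regrow: reactivate n_drop inactive positions chosen at random
    # (gradient-based criteria are another common choice).
    zero_idx = np.flatnonzero(np.abs(w).ravel() == 0)
    grow = rng.choice(zero_idx, size=n_drop, replace=False)
    w.ravel()[grow] = 1e-3  # small init for regrown connections
    return w
```

Because each step prunes and regrows the same number of connections, the total parameter budget stays fixed while the network explores different sparse topologies.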
1 code implementation • 1 Jan 2021 • Michael Ruchte, Arber Zela, Julien Niklas Siems, Josif Grabocka, Frank Hutter
Neural Architecture Search (NAS) is one of the focal points of the Deep Learning community, but reproducing NAS methods is extremely challenging due to numerous low-level implementation details.