To Share or Not To Share: A Comprehensive Appraisal of Weight-Sharing

Weight-sharing (WS) has recently emerged as a paradigm to accelerate the automated search for efficient neural architectures, a process dubbed Neural Architecture Search (NAS). Although very appealing, this framework is not without drawbacks, and several works have started to question its capabilities on small hand-crafted benchmarks...


Methods used in the Paper


METHOD               TYPE
Random Search        Hyperparameter Search
Sigmoid Activation   Activation Functions
Tanh Activation      Activation Functions
Softmax              Output Functions
LSTM                 Recurrent Neural Networks
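To make the weight-sharing idea concrete, the sketch below pairs the two central ingredients from the table above: a shared parameter table and random search over architectures. All names (`SharedWeights`, `random_search_ws`, the toy scoring function) are hypothetical illustrations, not the paper's implementation; a real WS-NAS system would train and evaluate each sampled subnetwork on data.

```python
import random

# Candidate operations per layer; every sampled architecture picks one
# op per layer, and all architectures reuse the same weight table.
OPS = ["sigmoid", "tanh", "identity"]
N_LAYERS = 4


class SharedWeights:
    """One parameter per (layer, op) pair, shared by every sampled net."""

    def __init__(self, n_layers, seed=0):
        rng = random.Random(seed)
        self.w = {(layer, op): rng.uniform(-1.0, 1.0)
                  for layer in range(n_layers) for op in OPS}


def sample_architecture(n_layers, rng):
    # Random search: draw one op per layer uniformly at random.
    return [rng.choice(OPS) for _ in range(n_layers)]


def evaluate(arch, shared):
    # Toy stand-in for a validation score: sum the shared weights the
    # architecture touches (a real system would run the subnetwork).
    return sum(shared.w[(layer, op)] for layer, op in enumerate(arch))


def random_search_ws(n_layers=N_LAYERS, n_samples=100, seed=0):
    rng = random.Random(seed)
    shared = SharedWeights(n_layers, seed)
    best = max((sample_architecture(n_layers, rng) for _ in range(n_samples)),
               key=lambda arch: evaluate(arch, shared))
    return best, evaluate(best, shared)


best_arch, score = random_search_ws()
print(best_arch, round(score, 3))
```

The key point the paper scrutinizes is visible even in this toy: because all candidates read from the same `SharedWeights` table, evaluating one architecture costs almost nothing extra, but the shared parameters may not rank candidates the way stand-alone training would.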