To Share or Not To Share: A Comprehensive Appraisal of Weight-Sharing

11 Feb 2020  ·  Aloïs Pourchot, Alexis Ducarouge, Olivier Sigaud

Weight-sharing (WS) has recently emerged as a paradigm to accelerate the automated search for efficient neural architectures, a process dubbed Neural Architecture Search (NAS). Although very appealing, this framework is not without drawbacks, and several works have started to question its capabilities on small hand-crafted benchmarks. In this paper, we take advantage of the NAS-Bench-101 dataset to challenge the efficiency of WS on a representative search space. By comparing a state-of-the-art WS approach to a plain random search, we show that, despite decent correlations between weight-sharing evaluations and standalone ones, WS only rarely provides a significant advantage to NAS. In particular, we highlight the impact of the search space itself on this benefit.
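As a rough illustration of the comparison described in the abstract, the Python sketch below (not the authors' code) contrasts a plain random-search baseline with a weight-sharing proxy on a toy search space and computes the rank correlation between proxy and standalone evaluations. The architecture encoding and the `standalone` / `proxy` scores are hypothetical stand-ins for querying a tabular benchmark such as NAS-Bench-101 and for evaluating candidates with weights inherited from a one-shot super-network.

```python
# Sketch only: illustrates the protocol implied by the abstract, not the
# authors' implementation. Architectures, standalone accuracies, and the
# weight-sharing proxy scores below are all hypothetical placeholders.
import random
from scipy.stats import kendalltau

rng = random.Random(0)

def sample_architecture():
    # Hypothetical encoding: a fixed-length sequence of operation choices.
    return tuple(rng.choice(["conv3x3", "conv1x1", "maxpool"]) for _ in range(5))

# Draw a pool of candidate architectures (deduplicated).
archs = list({sample_architecture() for _ in range(200)})

# Hypothetical evaluations. In practice `standalone` would come from a tabular
# benchmark such as NAS-Bench-101 (architectures trained from scratch), and
# `proxy` from evaluating each architecture with shared super-network weights.
standalone = {a: rng.random() for a in archs}
proxy = {a: standalone[a] + rng.gauss(0.0, 0.1) for a in archs}  # noisy proxy

# Plain random search baseline: pick the best candidate by standalone accuracy.
best_rs = max(archs, key=standalone.get)

# Weight-sharing search: rank candidates by the proxy score instead.
best_ws = max(archs, key=proxy.get)

# Rank correlation between proxy and standalone evaluations, the quantity the
# paper finds decent yet often insufficient for WS to beat random search.
tau, _ = kendalltau([proxy[a] for a in archs], [standalone[a] for a in archs])

print(f"Kendall tau (proxy vs. standalone): {tau:.3f}")
print(f"Standalone accuracy of WS pick: {standalone[best_ws]:.3f}")
print(f"Standalone accuracy of RS pick: {standalone[best_rs]:.3f}")
```

In this toy setup, the noisier the proxy scores are relative to the standalone accuracies, the lower the rank correlation and the less often the WS pick outperforms the random-search pick, which mirrors the kind of trade-off the paper examines.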
