NASLib: A Modular and Flexible Neural Architecture Search Library

1 Jan 2021  ·  Michael Ruchte, Arber Zela, Julien Niklas Siems, Josif Grabocka, Frank Hutter

Neural Architecture Search (NAS) is one of the focal points of the Deep Learning community, but reproducing NAS methods is extremely challenging due to numerous low-level implementation details. To alleviate this problem we introduce NASLib, a NAS library built upon PyTorch. The framework offers high-level abstractions for designing and reusing search spaces, as well as interfaces to benchmarks and evaluation pipelines, enabling the implementation and extension of state-of-the-art NAS methods in a few lines of code. The modular nature of NASLib allows researchers to easily innovate on individual components (e.g., define a new search space while reusing an existing optimizer and evaluation pipeline, or propose a new optimizer on top of existing search spaces). As a result, NASLib has the potential to facilitate NAS research by allowing fast advances and evaluations that are by design free of confounding factors. To demonstrate that NASLib is a sound library, we implement and achieve state-of-the-art results with one-shot NAS optimizers (DARTS and GDAS) on the DARTS search space and the popular NAS-Bench-201 benchmark. Last but not least, we showcase how easily novel approaches can be coded in NASLib by training DARTS on a hierarchical search space.
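The sketch below illustrates the modular workflow the abstract describes: pick a search space, pick an optimizer, and hand both to a shared evaluation pipeline, so that swapping either component is a one-line change. The class names and configuration handling loosely follow the public NASLib repository (https://github.com/automl/NASLib); exact module paths and signatures may differ between versions and should be treated as assumptions here, not as the definitive API.

```python
# Minimal sketch of composing NASLib components; names follow the public
# repository but may vary across versions (treat them as illustrative).
from naslib.defaults.trainer import Trainer
from naslib.optimizers import DARTSOptimizer
from naslib.search_spaces import NasBench201SearchSpace
from naslib.utils import utils

# Parse the YAML/CLI configuration (dataset, epochs, seed, output dir, ...).
config = utils.get_config_from_args()
utils.set_seed(config.seed)

# Swapping the search space or the optimizer is a one-line change, e.g.
# GDASOptimizer with the same search space, or DARTSOptimizer with a
# different (even hierarchical) search space.
search_space = NasBench201SearchSpace()
optimizer = DARTSOptimizer(config)
optimizer.adapt_search_space(search_space)

# The Trainer provides the shared search and evaluation pipeline.
trainer = Trainer(optimizer, config)
trainer.search()    # run the one-shot architecture search
trainer.evaluate()  # retrain/evaluate the discovered architecture
```

Because the optimizer only interacts with the search space through the `adapt_search_space` interface and the `Trainer` handles training and evaluation uniformly, comparisons between methods run under the same pipeline and are less likely to be skewed by implementation differences.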
