BARS: Joint Search of Cell Topology and Layout for Accurate and Efficient Binary ARchitectures

Binary Neural Networks (BNNs) have received significant attention due to their promising efficiency. Currently, most BNN studies directly adopt widely-used CNN architectures, which can be suboptimal for BNNs. This paper proposes a novel Binary ARchitecture Search (BARS) flow to discover superior binary architectures in a large design space. Specifically, we analyze the information bottlenecks related to both the topology and layout architecture design choices, and we propose to automatically search for the optimal information flow. To achieve that, we design a two-level (macro & micro) search space tailored for BNNs and apply differentiable neural architecture search (NAS) to explore this search space efficiently. The macro-level search space includes width and depth decisions, which are required for better balancing model performance and complexity. We also design the micro-level search space to strengthen the information flow for BNNs. A notable challenge of BNN architecture search lies in the fact that binary operations exacerbate the "collapse" problem of differentiable NAS, for which we incorporate various search and derivation strategies to stabilize the search process. On CIFAR-10, BARS achieves 1.5% higher accuracy with 2/3 of the binary operations and 1/10 of the floating-point operations compared with existing BNN NAS studies. On ImageNet, with similar resource consumption, the BARS-discovered architecture achieves a 6% accuracy gain over hand-crafted binary ResNet-18 architectures and outperforms other binary architectures while fully binarizing the architecture backbone.
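To make the differentiable-NAS relaxation mentioned above concrete, the following is a minimal, hypothetical sketch of a DARTS-style "mixed operation": candidate operations (here, a skip connection and a toy binarized op) are blended by a softmax over learnable architecture parameters. The class and operation choices are illustrative assumptions, not the actual BARS search space or its collapse-mitigation strategies.

```python
import numpy as np

def softmax(a):
    # numerically stable softmax over architecture parameters
    e = np.exp(a - a.max())
    return e / e.sum()

def binarize(w):
    # sign binarization commonly used in BNNs (scaling factor omitted for brevity)
    return np.sign(w)

class MixedOp:
    """Hypothetical DARTS-style mixed operation: a softmax-weighted sum of
    candidate ops, making the discrete choice differentiable in alpha."""
    def __init__(self, n_ops):
        self.alpha = np.zeros(n_ops)  # learnable architecture parameters

    def forward(self, x, ops):
        w = softmax(self.alpha)
        return sum(wi * op(x) for wi, op in zip(w, ops))

# Toy candidate set: identity (skip) and a 1x1 "binary conv" modeled as
# a binarized scalar weight applied to binarized activations.
weight = np.array(0.7)
ops = [
    lambda x: x,                                  # skip connection
    lambda x: binarize(weight) * np.sign(x),      # binary weight x binary activation
]

mixed = MixedOp(n_ops=2)
x = np.array([0.5, -1.2, 2.0])
y = mixed.forward(x, ops)  # with alpha = 0, both ops get weight 0.5
```

In a real search, `alpha` would be optimized jointly with the network weights, and the final architecture would be derived by selecting the highest-weight operation per edge.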
