Learnable Embedding Space for Efficient Neural Architecture Compression

We propose a method to incrementally learn an embedding space over the domain of network architectures, to enable the careful selection of architectures for evaluation during compressed architecture search. Given a teacher network, we search for a compressed network architecture by using Bayesian Optimization (BO) with a kernel function defined over our proposed embedding space to select architectures for evaluation...
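The abstract describes selecting candidate architectures with Bayesian Optimization, where the kernel is computed over learned architecture embeddings rather than raw architecture encodings. A minimal sketch of that selection step is below, assuming architectures have already been mapped to embedding vectors; the RBF kernel, Gaussian-process surrogate, and UCB acquisition here are illustrative stand-ins, not the paper's exact kernel or acquisition function.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential kernel over embedding vectors (illustrative
    # choice; the paper defines its own kernel over the learned space).
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X_train, y_train, X_cand, noise=1e-6):
    # Standard GP regression posterior mean and variance at candidates.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_cand)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(X_cand, X_cand)) - np.sum(v**2, axis=0)
    return mu, np.maximum(var, 1e-12)

def select_next(X_train, y_train, X_cand):
    # Upper-confidence-bound acquisition: evaluate the candidate
    # architecture whose embedding has the best optimistic score.
    mu, var = gp_posterior(X_train, y_train, X_cand)
    return int(np.argmax(mu + 2.0 * np.sqrt(var)))
```

In the search loop, the index returned by `select_next` names the next compressed architecture to train and score; its (embedding, score) pair is then appended to the training set and the surrogate is refit.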

ICLR 2019 (PDF, Abstract)


Methods used in the Paper


METHOD                         TYPE
Random Search                  Hyperparameter Search
1x1 Convolution                Convolutions
Grouped Convolution            Convolutions
ReLU                           Activation Functions
Batch Normalization            Normalization
Depthwise Convolution          Convolutions
Pointwise Convolution          Convolutions
Residual Connection            Skip Connections
Convolution                    Convolutions
Average Pooling                Pooling Operations
Channel Shuffle                Miscellaneous Components
Groupwise Point Convolution    Convolutions
ShuffleNet Block               Image Model Blocks
Global Average Pooling         Pooling Operations
Dense Connections              Feedforward Networks
Max Pooling                    Pooling Operations
Softmax                        Output Functions
ShuffleNet                     Convolutional Neural Networks