Convolutional Neural Networks

Big-Little Net is a convolutional neural network architecture for learning multi-scale feature representations. It uses a multi-branch network in which branches of different computational complexity operate at different image resolutions. By frequently merging features from branches at these distinct scales, the model obtains multi-scale features at a lower computational cost.

It consists of Big-Little Modules, each of which has two branches: one corresponding to a block of a deep model and one to a shallower counterpart. The two branches are merged by linear combination with unit weights. They are known as the Big-Branch (more layers and channels, operating at low resolution) and the Little-Branch (fewer layers and channels, operating at high resolution).
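The module can be sketched in a few lines of PyTorch. The sketch below is a minimal illustration under simplifying assumptions, not the authors' reference implementation: plain conv stacks stand in for the paper's ResNet blocks, a single 2x resolution gap separates the branches, and bilinear upsampling restores the Big-Branch resolution before the unit-weight sum. The names BigLittleModule and conv_block are hypothetical.

# Minimal sketch of a Big-Little Module (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(in_ch, out_ch, num_layers):
    # Stack of 3x3 conv + BN + ReLU layers (stand-in for a ResNet block).
    layers = []
    for i in range(num_layers):
        layers += [
            nn.Conv2d(in_ch if i == 0 else out_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        ]
    return nn.Sequential(*layers)

class BigLittleModule(nn.Module):
    # Big-Branch:    more layers/channels, run on a 2x downsampled input.
    # Little-Branch: fewer layers/channels, run at full resolution.
    def __init__(self, in_ch, out_ch, big_layers=3, little_layers=1, alpha=2):
        super().__init__()
        self.big = conv_block(in_ch, out_ch, big_layers)                  # wide and deep
        self.little = conv_block(in_ch, out_ch // alpha, little_layers)   # narrow and shallow
        # 1x1 conv so the Little-Branch matches the Big-Branch channel count
        self.little_proj = nn.Conv2d(out_ch // alpha, out_ch, 1, bias=False)

    def forward(self, x):
        # Big-Branch: operate at half resolution, then upsample back
        big = self.big(F.avg_pool2d(x, 2))
        big = F.interpolate(big, size=x.shape[-2:], mode="bilinear", align_corners=False)
        # Little-Branch: keep full resolution
        little = self.little_proj(self.little(x))
        # Merge: linear combination with unit weights, i.e. an element-wise sum
        return F.relu(big + little)

# Usage: multi-scale features for a 56x56 feature map
x = torch.randn(1, 64, 56, 56)
y = BigLittleModule(64, 128)(x)
print(y.shape)  # torch.Size([1, 128, 56, 56])

The merge is the point of the module: because the weights are fixed at one, fusing the two scales adds no extra parameters, while the expensive Big-Branch runs on a feature map with a quarter of the spatial positions.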

Source: Big-Little Net: An Efficient Multi-Scale Feature Representation for Visual and Speech Recognition

Tasks


Task                  Papers  Share
Image Classification  1       33.33%
Object Recognition    1       33.33%
Speech Recognition    1       33.33%
