Extended Batch Normalization

12 Mar 2020  ·  Chunjie Luo, Jianfeng Zhan, Lei Wang, Wanling Gao ·

Batch normalization (BN) has become a standard technique for training modern deep networks. However, its effectiveness diminishes as the batch size shrinks, since the batch statistics become inaccurate to estimate. This hinders batch normalization's usage in two settings: 1) training larger models, which requires small batches due to memory constraints, and 2) training on mobile or embedded devices, whose memory resources are limited. In this paper, we propose a simple but effective method called extended batch normalization (EBN). For NCHW-format feature maps, extended batch normalization computes the mean along the (N, H, W) dimensions, the same as batch normalization, to preserve batch normalization's advantage. To alleviate the problem caused by small batch sizes, extended batch normalization computes the standard deviation along the (N, C, H, W) dimensions, thus enlarging the number of samples from which the standard deviation is estimated. We compare extended batch normalization with batch normalization and group normalization on the MNIST, CIFAR-10/100, STL-10, and ImageNet datasets. The experiments show that extended batch normalization alleviates the small-batch problem of batch normalization while achieving performance close to that of batch normalization with large batch sizes.
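The computation described above can be sketched in NumPy as follows. The per-channel mean over (N, H, W) and the single standard deviation over all of (N, C, H, W) follow the abstract; the exact centering used inside the std (subtracting the per-channel mean before pooling squared deviations across channels) and the epsilon placement are assumptions of this sketch, not details confirmed by the abstract.

```python
import numpy as np

def extended_batch_norm(x, gamma, beta, eps=1e-5):
    """Sketch of extended batch normalization (EBN) for NCHW feature maps.

    As in standard BN, a per-channel mean is computed over (N, H, W).
    Unlike BN, a single standard deviation is computed over all of
    (N, C, H, W), enlarging the sample count behind the std estimate
    when the batch size is small.
    """
    # Per-channel mean over (N, H, W), shape (1, C, 1, 1) -- same as BN.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    # One scalar std over all (N, C, H, W) elements -- the EBN change.
    # (Centering by the per-channel mean here is an assumption.)
    std = np.sqrt(((x - mean) ** 2).mean() + eps)
    x_hat = (x - mean) / std
    # Learnable per-channel scale and shift, as in BN.
    return gamma.reshape(1, -1, 1, 1) * x_hat + beta.reshape(1, -1, 1, 1)
```

After normalization each channel has zero mean, while the variance is unit only globally across channels, not per channel; that is the trade-off EBN accepts in exchange for a more stable std estimate at small batch sizes.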


Results from the Paper


Model names indicate the normalization layer and the training batch size.

Task                 | Dataset | Model             | Metric Name        | Metric Value | Global Rank
Image Classification | STL-10  | ResNet18 (EBN, 4)   | Percentage correct | 76.49        | # 74
Image Classification | STL-10  | ResNet18 (GN, 4)    | Percentage correct | 79.3         | # 64
Image Classification | STL-10  | ResNet18 (BN, 4)    | Percentage correct | 81.04        | # 62
Image Classification | STL-10  | ResNet18 (GN, 128)  | Percentage correct | 72.66        | # 85
Image Classification | STL-10  | ResNet18 (EBN, 128) | Percentage correct | 75.57        | # 76
Image Classification | STL-10  | ResNet18 (BN, 128)  | Percentage correct | 78.65        | # 66
