Normalization

In-Place Activated Batch Normalization

Introduced by Bulò et al. in In-Place Activated BatchNorm for Memory-Optimized Training of DNNs

In-Place Activated Batch Normalization, or InPlace-ABN, replaces the conventional succession of BatchNorm + Activation layers with a single plug-in layer, avoiding invasive framework surgery while remaining straightforward to apply in existing deep learning frameworks. It roughly halves the memory required to train modern deep networks: intermediate activation buffers are discarded during the forward pass and the quantities needed for gradient computation are recovered during the backward pass by inverting the stored results, which requires an invertible activation function such as leaky ReLU.

Source: In-Place Activated BatchNorm for Memory-Optimized Training of DNNs
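The memory saving therefore comes from recomputation rather than storage: the forward pass keeps only the post-activation output, and the backward pass inverts the activation and the affine part of the normalization to reconstruct what it needs. The following PyTorch sketch illustrates that idea under simplifying assumptions (training mode only, no running statistics, 2-D input, leaky-ReLU activation, non-zero gamma); the class and variable names are illustrative and do not reflect the API of the official inplace_abn package.

import torch


class InPlaceABNFunction(torch.autograd.Function):
    """Fused BatchNorm + leaky ReLU that stores only its output for backward."""

    @staticmethod
    def forward(ctx, x, gamma, beta, eps=1e-5, slope=0.01):
        # Training-mode batch statistics over the batch dimension.
        mu = x.mean(dim=0)
        var = x.var(dim=0, unbiased=False)
        inv_std = (var + eps).rsqrt()
        x_hat = (x - mu) * inv_std            # normalized input
        y = gamma * x_hat + beta              # affine transform
        z = torch.where(y > 0, y, slope * y)  # leaky ReLU (invertible for slope > 0)
        # Save only the output and per-channel statistics; x, x_hat and y are dropped.
        ctx.save_for_backward(z, gamma, beta, inv_std)
        ctx.slope = slope
        return z

    @staticmethod
    def backward(ctx, grad_z):
        z, gamma, beta, inv_std = ctx.saved_tensors
        slope = ctx.slope
        # Recover the pre-activation and the normalized input by inversion.
        y = torch.where(z > 0, z, z / slope)
        x_hat = (y - beta) / gamma            # assumes gamma != 0
        # Gradient through the leaky ReLU.
        grad_y = torch.where(z > 0, grad_z, slope * grad_z)
        # Standard BatchNorm backward, expressed in terms of x_hat only.
        grad_gamma = (grad_y * x_hat).sum(dim=0)
        grad_beta = grad_y.sum(dim=0)
        grad_x_hat = grad_y * gamma
        n = z.shape[0]
        grad_x = inv_std / n * (
            n * grad_x_hat
            - grad_x_hat.sum(dim=0)
            - x_hat * (grad_x_hat * x_hat).sum(dim=0)
        )
        return grad_x, grad_gamma, grad_beta, None, None


# Usage: behaves like BatchNorm1d followed by LeakyReLU, but keeps a single buffer.
x = torch.randn(32, 64, requires_grad=True)
gamma = torch.ones(64, requires_grad=True)
beta = torch.zeros(64, requires_grad=True)
out = InPlaceABNFunction.apply(x, gamma, beta)
out.sum().backward()

The implementation released with the paper (the inplace_abn package) realizes the same idea with fused CUDA kernels and also provides a synchronized multi-GPU variant.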
