Normalization

Group Normalization

Introduced by Wu et al. in Group Normalization

Group Normalization is a normalization layer that divides channels into groups and normalizes the features within each group. GN does not exploit the batch dimension, and its computation is independent of batch size. When each group contains a single channel ($G = C$), GN is equivalent to Instance Normalization; when all channels form a single group ($G = 1$), it is equivalent to Layer Normalization.

As motivation for the method, many classical features like SIFT and HOG had group-wise features and involved group-wise normalization. For example, a HOG vector is the outcome of several spatial cells where each cell is represented by a normalized orientation histogram.

Formally, Group Normalization is defined as:

$$ \mu_{i} = \frac{1}{m}\sum_{k\in\mathcal{S}_{i}}x_{k} $$

$$ \sigma^{2}_{i} = \frac{1}{m}\sum_{k\in\mathcal{S}_{i}}\left(x_{k}-\mu_{i}\right)^{2} $$

$$ \hat{x}_{i} = \frac{x_{i} - \mu_{i}}{\sqrt{\sigma^{2}_{i}+\epsilon}} $$

Here $x$ is the feature computed by a layer, $i = (i_{N}, i_{C}, i_{H}, i_{W})$ is an index over the (batch, channel, height, width) axes, and $m$ is the size of the set $\mathcal{S}_{i}$. A Group Norm layer computes $\mu$ and $\sigma$ over a set $\mathcal{S}_{i}$ defined as:

$$ \mathcal{S}_{i} = \left\{k \mid k_{N} = i_{N},\ \left\lfloor\frac{k_{C}}{C/G}\right\rfloor = \left\lfloor\frac{i_{C}}{C/G}\right\rfloor\right\} $$

Here $G$ is the number of groups, a pre-defined hyper-parameter ($G = 32$ by default), and $C/G$ is the number of channels per group. $\lfloor\cdot\rfloor$ is the floor operation, and the second condition means that the indexes $i$ and $k$ belong to the same group of channels, assuming each group of channels is stored sequentially along the $C$ axis.
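The definition above can be sketched directly with NumPy: reshape the channel axis into $(G, C/G)$ and normalize over each group's channels and spatial positions. This is a minimal illustration only, omitting the learned per-channel scale and shift ($\gamma$, $\beta$) that the full layer applies after normalization; the function name and defaults are choices made here, not a reference implementation.

```python
import numpy as np

def group_norm(x, G=32, eps=1e-5):
    """Normalize x of shape (N, C, H, W) over G channel groups.

    G must divide C. Mean and variance are computed per sample,
    per group, over the group's channels and all spatial positions,
    matching the set S_i in the definition above.
    """
    N, C, H, W = x.shape
    assert C % G == 0, "number of channels must be divisible by G"
    # Split the channel axis into (groups, channels-per-group).
    x = x.reshape(N, G, C // G, H, W)
    # Statistics over each group: channels-per-group, H, and W axes.
    mu = x.mean(axis=(2, 3, 4), keepdims=True)
    var = x.var(axis=(2, 3, 4), keepdims=True)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return x_hat.reshape(N, C, H, W)
```

After normalization, each group of each sample has zero mean and (up to $\epsilon$) unit variance, independent of the batch size $N$.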

Source: Group Normalization
