
Spectral Normalization

Introduced by Miyato et al. in Spectral Normalization for Generative Adversarial Networks

Spectral Normalization is a normalization technique for generative adversarial networks, used to stabilize training of the discriminator. It has the convenient property that the Lipschitz constant is the only hyper-parameter to be tuned.

It controls the Lipschitz constant of the discriminator $f$ by constraining the spectral norm of each layer $g : \textbf{h}_{in} \rightarrow \textbf{h}_{out}$. The Lipschitz norm $\Vert{g}\Vert_{\text{Lip}}$ is equal to $\sup_{\textbf{h}}\sigma\left(\nabla{g}\left(\textbf{h}\right)\right)$, where $\sigma\left(A\right)$ is the spectral norm of the matrix $A$ (the $L_{2}$ matrix norm of $A$):

$$ \sigma\left(A\right) = \max_{\textbf{h}:\textbf{h}\neq{0}}\frac{\Vert{A\textbf{h}}\Vert_{2}}{\Vert\textbf{h}\Vert_{2}} = \max_{\Vert\textbf{h}\Vert_{2}\leq{1}}{\Vert{A\textbf{h}}\Vert_{2}} $$

which is equivalent to the largest singular value of $A$. Therefore, for a linear layer $g\left(\textbf{h}\right) = W\textbf{h}$, the gradient is $\nabla{g}\left(\textbf{h}\right) = W$, so the norm is given by $\Vert{g}\Vert_{\text{Lip}} = \sup_{\textbf{h}}\sigma\left(\nabla{g}\left(\textbf{h}\right)\right) = \sup_{\textbf{h}}\sigma\left(W\right) = \sigma\left(W\right)$. Spectral normalization divides the weight matrix $W$ by its spectral norm so that the normalized weight satisfies the Lipschitz constraint $\sigma\left(\bar{W}_{\text{SN}}\right) = 1$:

$$ \bar{W}_{\text{SN}}\left(W\right) = W / \sigma\left(W\right) $$
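
In practice, $\sigma\left(W\right)$ is not recomputed exactly at every training step; the paper estimates it cheaply with power iteration, reusing the singular-vector estimate across updates. The snippet below is a minimal NumPy sketch of that idea (function and variable names are illustrative, not taken from the paper's code):

```python
import numpy as np

def spectral_normalize(W, u=None, n_power_iterations=1, eps=1e-12):
    """Estimate sigma(W) by power iteration and return W / sigma(W).

    `u` is a running estimate of the first left-singular vector; during
    training it would be kept and reused across updates.
    """
    if u is None:
        u = np.random.randn(W.shape[0])
        u /= np.linalg.norm(u) + eps

    for _ in range(n_power_iterations):
        v = W.T @ u
        v /= np.linalg.norm(v) + eps
        u = W @ v
        u /= np.linalg.norm(u) + eps

    sigma = u @ W @ v  # Rayleigh-quotient estimate of the largest singular value
    return W / sigma, u

# Sanity check: the normalized matrix has spectral norm close to 1.
W = np.random.randn(64, 128)
W_sn, _ = spectral_normalize(W, n_power_iterations=10)
print(np.linalg.svd(W_sn, compute_uv=False)[0])  # ≈ 1.0
```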

Source: Spectral Normalization for Generative Adversarial Networks
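
For PyTorch users, a built-in wrapper applies this technique to a layer's weight with one power-iteration step per forward pass; a minimal usage sketch with the standard `torch.nn.utils.spectral_norm` helper (hyper-parameters here are illustrative):

```python
import torch.nn as nn
from torch.nn.utils import spectral_norm

# Wrap a discriminator layer so its weight is divided by an estimate of
# its spectral norm at every forward pass.
disc_layer = spectral_norm(nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1))
```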
