Layer-Sequential Unit-Variance Initialization (LSUV) is a simple method for weight initialization in deep networks. The initialization strategy involves the following two steps:
1) Pre-initialize the weights of each convolution or inner-product layer with orthonormal matrices.
2) Proceeding from the first layer to the last, normalize the variance of each layer's output to one (see the sketch after this list).
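A minimal sketch of this procedure in PyTorch, assuming a simple sequential model. The helper name `lsuv_init` and its parameters (`tol`, `max_iters`, the probe `batch`) are hypothetical illustrations, not names from the paper's code:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def lsuv_init(model, batch, tol=0.1, max_iters=10):
    """Hypothetical helper: LSUV-initialize conv/linear layers in order."""
    # Step 1: pre-initialize each convolution / inner-product layer
    # with an orthonormal matrix (biases set to zero, a common choice).
    layers = [m for m in model.modules()
              if isinstance(m, (nn.Conv2d, nn.Linear))]
    for m in layers:
        nn.init.orthogonal_(m.weight)
        if m.bias is not None:
            nn.init.zeros_(m.bias)

    # Step 2: from the first layer to the last, rescale the weights
    # until the layer's output variance on the probe batch is ~1.
    # (model.modules() matches forward order for nn.Sequential-style
    # models; adapt the traversal for more complex architectures.)
    for m in layers:
        for _ in range(max_iters):
            captured = []
            hook = m.register_forward_hook(
                lambda _mod, _inp, out: captured.append(out))
            model(batch)          # forward pass to observe this layer's output
            hook.remove()
            std = captured[0].std().item()
            if abs(std - 1.0) < tol:
                break
            m.weight /= std       # drive the output std toward 1
```

Usage on a toy model might look like:

```python
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
lsuv_init(model, torch.randn(64, 128))  # probe batch of random inputs
```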
Source: *All you need is a good init*