23 Oct 2020 • Yurika Sakai, Andrey Kormilitzin, Qiang Liu, Alejo Nevado-Holgado
The most successful methods, such as ReLU transfer functions, batch normalization, Xavier initialization, dropout, learning rate decay, and dynamic optimizers, have become standards in the field, largely because they significantly increase the performance of Neural Networks (NNs) in almost all situations.
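For concreteness, below is a minimal PyTorch sketch (not code from the paper; every layer size and hyperparameter is an illustrative assumption) showing how the techniques listed above are typically combined in a small fully connected network.

```python
import torch
import torch.nn as nn

# A small MLP combining the standard techniques named in the
# abstract: ReLU, batch normalization, Xavier initialization,
# and dropout. All dimensions are arbitrary illustrative choices.
class MLP(nn.Module):
    def __init__(self, d_in=32, d_hidden=64, d_out=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, d_hidden),
            nn.BatchNorm1d(d_hidden),   # batch normalization
            nn.ReLU(),                  # ReLU transfer function
            nn.Dropout(p=0.5),          # dropout
            nn.Linear(d_hidden, d_out),
        )
        # Xavier (Glorot) initialization for the linear layers.
        for m in self.net:
            if isinstance(m, nn.Linear):
                nn.init.xavier_uniform_(m.weight)
                nn.init.zeros_(m.bias)

model = MLP()
# A dynamic (adaptive) optimizer plus learning rate decay.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

# One illustrative training step on random data.
x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
scheduler.step()  # decay the learning rate on a fixed schedule
```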