10 Nov 2019 • Chao Yu, Zhiguo Su
In this paper, we propose a novel neural-network activation function, the Symmetrical Gaussian Error Linear Unit (SGELU), designed to achieve high performance.
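The abstract does not give the functional form, but SGELU is commonly cited as α·x·erf(x/√2), a symmetrized variant of the GELU. A minimal sketch under that assumption (the formula and the `alpha` parameter are assumptions, not taken from this abstract):

```python
import math

def sgelu(x: float, alpha: float = 1.0) -> float:
    """Symmetrical GELU, assuming the form alpha * x * erf(x / sqrt(2)).

    Because x is odd and erf is odd, their product is even, so the
    function is symmetric about the y-axis: sgelu(-x) == sgelu(x).
    """
    return alpha * x * math.erf(x / math.sqrt(2.0))

# Example: the output is symmetric in x
print(sgelu(1.5), sgelu(-1.5))
```

Note the contrast with GELU, which is asymmetric; the symmetry here means negative and positive inputs of equal magnitude map to the same activation value.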