Swish

Introduced by Ramachandran et al. in Searching for Activation Functions

Swish is an activation function, $f(x) = x \cdot \text{sigmoid}(\beta x)$, where $\beta$ is a learnable parameter. Nearly all implementations omit the learnable parameter $\beta$, in which case the activation function reduces to $x\sigma(x)$ ("Swish-1").
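
As a concrete illustration, here is a minimal PyTorch sketch of this definition (the `Swish` module name and the `trainable` flag are illustrative choices, not from the paper):

```python
import torch
import torch.nn as nn


class Swish(nn.Module):
    """Swish activation: f(x) = x * sigmoid(beta * x)."""

    def __init__(self, beta: float = 1.0, trainable: bool = False):
        super().__init__()
        if trainable:
            # beta is a learnable parameter updated by the optimizer,
            # as in the original formulation
            self.beta = nn.Parameter(torch.tensor(beta))
        else:
            # fixed beta; beta = 1.0 gives Swish-1 / SiLU
            self.register_buffer("beta", torch.tensor(beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.sigmoid(self.beta * x)
```

For the fixed $\beta = 1$ case, PyTorch also ships this function directly as `torch.nn.SiLU`.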

The function $x\sigma(x)$ is exactly the SiLU (Sigmoid Linear Unit), which predates Swish: it was originally coined in Gaussian Error Linear Units (GELUs), and the same activation function was later experimented with in Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and in Swish: a Self-Gated Activation Function.

Source: Searching for Activation Functions

Tasks

| Task | Papers | Share |
| --- | --- | --- |
| Image Classification | 74 | 13.50% |
| Object Detection | 34 | 6.20% |
| General Classification | 27 | 4.93% |
| Classification | 25 | 4.56% |
| Semantic Segmentation | 24 | 4.38% |
| Instance Segmentation | 11 | 2.01% |
| Decoder | 9 | 1.64% |
| Multi-Task Learning | 9 | 1.64% |
| Quantization | 7 | 1.28% |
