Scaled Exponential Linear Unit

Introduced by Klambauer et al. in Self-Normalizing Neural Networks

Scaled Exponential Linear Units, or SELUs, are activation functions that induce self-normalizing properties.

The SELU activation function is given by

$$f\left(x\right) = \lambda \begin{cases} x & \text{if } x \geq 0 \\ \alpha\left(\exp\left(x\right) - 1\right) & \text{if } x < 0 \end{cases}$$

with $\alpha \approx 1.6733$ and $\lambda \approx 1.0507$.
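A minimal NumPy sketch of the formula above. The constants are the higher-precision values commonly used in framework implementations of SELU; they match the truncated approximations quoted here.

```python
import numpy as np

# Fixed constants derived in Klambauer et al. (2017); the values below are
# the higher-precision forms of alpha ~ 1.6733 and lambda ~ 1.0507.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x: np.ndarray) -> np.ndarray:
    """SELU: lambda * x for x >= 0, lambda * alpha * (exp(x) - 1) for x < 0."""
    return LAMBDA * np.where(x >= 0, x, ALPHA * (np.exp(x) - 1.0))

# Example: negative inputs saturate toward -lambda * alpha,
# positive inputs are scaled linearly by lambda.
print(selu(np.array([-2.0, 0.0, 2.0])))
```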

Source: Self-Normalizing Neural Networks
