
HardELiSH

Introduced by Basirat et al. in The Quest for the Golden Activation Function

HardELiSH is an activation function for neural networks. It multiplies the HardSigmoid with the ELU on the negative part and multiplies the linear function with the HardSigmoid on the positive part:

$$f\left(x\right) = x\max\left(0, \min\left(1, \frac{x+1}{2}\right)\right) \text{ if } x \geq 0$$ $$f\left(x\right) = \left(e^{x}-1\right)\max\left(0, \min\left(1, \frac{x+1}{2}\right)\right) \text{ if } x < 0$$
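For concreteness, here is a minimal NumPy sketch of the piecewise definition above; the function name `hard_elish` is our own label, not an API from any library:

```python
import numpy as np

def hard_elish(x: np.ndarray) -> np.ndarray:
    """HardELiSH: (e^x - 1) * HardSigmoid(x) for x < 0, x * HardSigmoid(x) for x >= 0."""
    # Shared HardSigmoid factor: max(0, min(1, (x + 1) / 2)).
    hard_sigmoid = np.clip((x + 1.0) / 2.0, 0.0, 1.0)
    # Select the ELU-style branch for negative inputs, the linear branch otherwise.
    return np.where(x >= 0, x * hard_sigmoid, (np.exp(x) - 1.0) * hard_sigmoid)
```

Note that the HardSigmoid factor saturates at 0 for inputs at or below -1, so those inputs map to 0, while inputs at or above 1 have a HardSigmoid factor of 1 and pass through linearly.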


Source: The Quest for the Golden Activation Function


Tasks

| Task | Papers | Share |
|------|--------|-------|
| Image Classification | 1 | 100.00% |


Categories

Activation Functions