ReLU6

ReLU6 is a modification of the rectified linear unit in which the activation is capped at a maximum value of $6$: $\text{ReLU6}(x) = \min\left(\max(0, x), 6\right)$. The motivation for the cap is increased robustness when used with low-precision computation.
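As a quick illustration of the clamping behaviour, here is a minimal sketch in PyTorch comparing a manual clamp against the built-in `torch.nn.functional.relu6`; the input values are arbitrary examples:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-3.0, 0.5, 4.0, 8.0])

# Manual definition: ReLU6(x) = min(max(0, x), 6)
manual = torch.clamp(x, min=0.0, max=6.0)

# PyTorch's built-in version
builtin = F.relu6(x)

print(manual)   # tensor([0.0000, 0.5000, 4.0000, 6.0000])
print(builtin)  # tensor([0.0000, 0.5000, 4.0000, 6.0000])
```

Negative inputs are zeroed as in ordinary ReLU, and anything above $6$ is saturated at $6$.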

(Figure: plot of the ReLU6 activation. Image credit: PyTorch)

Source: MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications

Categories: Activation Functions