ReLU6 is a modification of the rectified linear unit in which the activation is capped at a maximum value of $6$: $\text{ReLU6}(x) = \min(\max(0, x), 6)$. The fixed upper bound keeps the output range known in advance, which improves robustness when the network is used with low-precision computation.
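A minimal sketch of the definition in plain Python (deep learning frameworks ship this built in, e.g. `torch.nn.ReLU6` in PyTorch):

```python
def relu6(x):
    """ReLU6 activation: min(max(0, x), 6)."""
    return min(max(0.0, x), 6.0)

# Negative inputs are zeroed, values above 6 are clipped.
print(relu6(-3.0))  # 0.0
print(relu6(2.0))   # 2.0
print(relu6(8.0))   # 6.0
```

In a framework the same element-wise clamp is applied over whole tensors; the scalar version above is only for illustration.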
Source: MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
| Task | Papers | Share |
|---|---|---|
| Image Classification | 15 | 15.63% |
| Object Detection | 10 | 10.42% |
| Classification | 6 | 6.25% |
| Quantization | 5 | 5.21% |
| Semantic Segmentation | 5 | 5.21% |
| Bayesian Optimization | 3 | 3.13% |
| Computational Efficiency | 3 | 3.13% |
| Neural Network Compression | 2 | 2.08% |
| Network Pruning | 2 | 2.08% |