Hard Swish

Introduced by Howard et al. in Searching for MobileNetV3

Hard Swish is an activation function based on Swish that replaces the computationally expensive sigmoid with a piecewise linear analogue:

$$\text{h-swish}\left(x\right) = x\frac{\text{ReLU6}\left(x+3\right)}{6} $$

Source: Searching for MobileNetV3
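The formula above can be sketched directly in NumPy; the helper names below are illustrative (in practice, frameworks ship this built in, e.g. `torch.nn.Hardswish` in PyTorch):

```python
import numpy as np

def relu6(x):
    # ReLU6 clamps activations to the range [0, 6]
    return np.clip(x, 0.0, 6.0)

def hard_swish(x):
    # h-swish(x) = x * ReLU6(x + 3) / 6
    return x * relu6(x + 3.0) / 6.0

# Piecewise behaviour: zero for x <= -3, identity for x >= 3,
# and a smooth-looking polynomial ramp x * (x + 3) / 6 in between.
x = np.array([-4.0, -3.0, 0.0, 1.0, 3.0, 6.0])
y = hard_swish(x)
```

Because every piece is a clamp, multiply, and add, the function avoids the exponential in the sigmoid, which is the point of the MobileNetV3 design: it is cheap on mobile hardware while closely tracking Swish.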

Latest Papers

Detecting soccer balls with reduced neural networks: a comparison of multiple architectures under constrained hardware scenarios
Douglas De Rizzo Meneghetti, Thiago Pedro Donadon Homem, Jonas Henrique Renolfi de Oliveira, Isaac Jesus da Silva, Danilo Hernani Perico, Reinaldo Augusto da Costa Bianchi
2020-09-28

PareCO: Pareto-aware Channel Optimization for Slimmable Neural Networks
Ting-Wu Chin, Ari S. Morcos, Diana Marculescu
2020-07-23

Fine-Grained Stochastic Architecture Search
Shraman Ray Chaudhuri, Elad Eban, Hanhan Li, Max Moroz, Yair Movshovitz-Attias
2020-06-17

GhostNet: More Features from Cheap Operations
Kai Han, Yunhe Wang, Qi Tian, Jianyuan Guo, Chunjing Xu, Chang Xu
2019-11-27

Structured Multi-Hashing for Model Compression
Elad Eban, Yair Movshovitz-Attias, Hao Wu, Mark Sandler, Andrew Poon, Yerlan Idelbayev, Miguel A. Carreira-Perpinan
2019-11-25

Panoptic-DeepLab: A Simple, Strong, and Fast Baseline for Bottom-Up Panoptic Segmentation
Bowen Cheng, Maxwell D. Collins, Yukun Zhu, Ting Liu, Thomas S. Huang, Hartwig Adam, Liang-Chieh Chen
2019-11-22

Street Crossing Aid Using Light-weight CNNs for the Visually Impaired
Samuel Yu, Heon Lee, Jung Hoon Kim
2019-09-14

MoGA: Searching Beyond MobileNetV3
Xiangxiang Chu, Bo Zhang, Ruijun Xu
2019-08-04

Butterfly Transform: An Efficient FFT Based Neural Architecture Design
Keivan Alizadeh Vahid, Anish Prabhu, Ali Farhadi, Mohammad Rastegari
2019-06-05

Searching for MobileNetV3
Andrew Howard, Mark Sandler, Grace Chu, Liang-Chieh Chen, Bo Chen, Mingxing Tan, Weijun Wang, Yukun Zhu, Ruoming Pang, Vijay Vasudevan, Quoc V. Le, Hartwig Adam
2019-05-06

Components

ReLU6 (Activation Functions)