Efficient Channel Attention (ECA) is an architectural unit based on squeeze-and-excitation (SE) blocks that reduces model complexity by avoiding dimensionality reduction. It was proposed as part of the ECA-Net CNN architecture.
After channel-wise global average pooling without dimensionality reduction, ECA captures local cross-channel interaction by considering each channel together with its $k$ neighbors. This can be implemented efficiently as a fast 1D convolution of size $k$, where the kernel size $k$ determines the coverage of local cross-channel interaction, i.e., how many neighbors participate in the attention prediction for one channel.
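The two steps above — channel-wise global average pooling followed by a shared 1D convolution across channels and a sigmoid gate — can be sketched as follows. This is a minimal NumPy illustration, not the reference implementation; the function name `eca` and the uniform kernel weights are assumptions (in practice the $k$ convolution weights are learned).

```python
import numpy as np

def eca(x, k=3):
    """Sketch of Efficient Channel Attention (hypothetical helper).

    x: feature map of shape (C, H, W).
    k: kernel size, i.e. how many neighboring channels interact.
    """
    C = x.shape[0]
    # 1) Channel-wise global average pooling -> one scalar per channel,
    #    with no dimensionality reduction (the vector stays length C).
    y = x.mean(axis=(1, 2))
    # 2) Fast 1D convolution of size k across the channel dimension.
    #    The same k weights are shared by all channels; here they are
    #    uniform for illustration, but they are learned in ECA-Net.
    w = np.full(k, 1.0 / k)
    pad = k // 2
    y_pad = np.pad(y, pad, mode="edge")
    a = np.array([np.dot(w, y_pad[i:i + k]) for i in range(C)])
    # 3) Sigmoid gate, then rescale every spatial position of each channel.
    a = 1.0 / (1.0 + np.exp(-a))
    return x * a[:, None, None]
```

Each channel's attention weight thus depends only on its own pooled value and those of its $k-1$ nearest neighbors, which is what keeps the parameter count at $k$ regardless of the number of channels.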
Source: ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks
| Task | Papers | Share |
|---|---|---|
| Object Detection | 4 | 14.29% |
| Image Classification | 3 | 10.71% |
| Classification | 2 | 7.14% |
| Instance Segmentation | 2 | 7.14% |
| Semantic Segmentation | 2 | 7.14% |
| Marketing | 1 | 3.57% |
| Saliency Prediction | 1 | 3.57% |
| Medical Image Detection | 1 | 3.57% |
| Medical Object Detection | 1 | 3.57% |