
Efficient Channel Attention

Introduced by Wang et al. in ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks

Efficient Channel Attention (ECA) is an architectural unit, based on the squeeze-and-excitation block, that reduces model complexity by avoiding dimensionality reduction. It was proposed as part of the ECA-Net CNN architecture.

After channel-wise global average pooling without dimensionality reduction, ECA captures local cross-channel interaction by considering each channel together with its $k$ neighbors. This can be implemented efficiently as a fast 1D convolution of kernel size $k$, where $k$ determines the coverage of local cross-channel interaction, i.e., how many neighbors participate in the attention prediction for one channel.
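Below is a minimal sketch of such a block in PyTorch (the framework choice and class name `ECA` are ours for illustration; the paper's reference implementation may differ in details). It follows the paper's adaptive kernel-size rule $k = \left|\log_2(C)/\gamma + b/\gamma\right|_{\mathrm{odd}}$ with $\gamma = 2$ and $b = 1$, where $C$ is the number of channels:

```python
import math
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention: channel-wise global average pooling,
    a 1D convolution of kernel size k across the channel dimension,
    and a sigmoid gate applied back to the input."""

    def __init__(self, channels: int, gamma: int = 2, b: int = 1):
        super().__init__()
        # Adaptive kernel size: k = |log2(C)/gamma + b/gamma|, rounded to odd.
        t = int(abs(math.log2(channels) / gamma + b / gamma))
        k = t if t % 2 == 1 else t + 1
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) -> channel descriptor (N, C, 1, 1)
        y = self.pool(x)
        # Treat channels as a 1D sequence of length C: (N, 1, C)
        y = self.conv(y.squeeze(-1).transpose(-1, -2))
        # Restore (N, C, 1, 1) and gate the input channel-wise
        y = self.sigmoid(y.transpose(-1, -2).unsqueeze(-1))
        return x * y
```

Note that, unlike a squeeze-and-excitation block, there are no fully connected layers and thus no bottleneck: the only learned parameters are the $k$ weights of the 1D convolution, shared across all channels.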

