Attention Modules

Attention modules are neural network components that incorporate an attention mechanism. For example, multi-head attention is a module that runs several attention heads in parallel and combines their outputs. Below is a continuously updated list of attention modules; a minimal multi-head attention sketch follows the table.

METHOD                           YEAR  PAPERS
Multi-Head Attention             2017    2866
SAGAN Self-Attention Module      2018      41
Spatial Attention Module         2018      26
Channel Attention Module         2018      14
DV3 Attention Block              2017      10
Spatial Attention-Guided Mask    2019       5
LAMA                             2019       4
Global Context Block             2019       3
Single-Headed Attention          2019       2
Multi-Head Linear Attention      2020       2
Graph Self-Attention             2019       2
Point-wise Spatial Attention     2018       2
DeLighT Block                    2020       2
Feedback Memory                  2020       2
Attention-augmented Convolution  2019       1
CBAM                             2018       1
Compact Global Descriptor        2019       1
All-Attention Layer              2019       1
Hopfield Layer                   2020       1
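
To make the "multiple attention heads" idea from the introduction concrete, here is a minimal sketch of multi-head attention in the spirit of Vaswani et al. (2017), written in PyTorch. The class name MultiHeadAttention and the parameter names d_model and num_heads are illustrative assumptions, not references to any particular library implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadAttention(nn.Module):
    """Illustrative multi-head attention: project, split into heads,
    apply scaled dot-product attention per head, then recombine."""

    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must divide evenly across heads"
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # One linear projection each for queries, keys, values, and the output.
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, query, key, value):
        # query/key/value: (batch, seq_len, d_model)
        batch, seq_len, _ = query.shape

        def split_heads(x):
            # (batch, seq_len, d_model) -> (batch, num_heads, seq_len, d_head)
            return x.view(batch, -1, self.num_heads, self.d_head).transpose(1, 2)

        q = split_heads(self.w_q(query))
        k = split_heads(self.w_k(key))
        v = split_heads(self.w_v(value))

        # Scaled dot-product attention, computed independently for each head.
        scores = q @ k.transpose(-2, -1) / (self.d_head ** 0.5)
        weights = F.softmax(scores, dim=-1)
        context = weights @ v  # (batch, num_heads, seq_len, d_head)

        # Concatenate the heads and apply the output projection.
        context = context.transpose(1, 2).contiguous().view(batch, seq_len, -1)
        return self.w_o(context)


# Usage: 8 heads over a 512-dimensional model, self-attention on a toy batch.
x = torch.randn(2, 10, 512)
mha = MultiHeadAttention(d_model=512, num_heads=8)
out = mha(x, x, x)
print(out.shape)  # torch.Size([2, 10, 512])

Other entries in the table (e.g., spatial or channel attention modules) follow the same pattern of computing attention weights and reweighting features, but differ in which dimension the weights are computed over.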