Attention Modules

General • Attention • 42 methods

Attention Modules are building blocks that incorporate an attention mechanism. For example, multi-head attention runs several attention operations (heads) in parallel over different learned projections of the input and combines their outputs. Below you can find a continuously updating list of attention modules; a minimal code sketch of such a module follows the listing.

Method | Year | Papers
(listing of 42 attention methods by publication year and number of associated papers)
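As a concrete illustration of the multi-head attention module mentioned above, here is a minimal PyTorch sketch of scaled dot-product attention with multiple heads. The class name, dimensions, and the omission of masking and dropout are simplifying assumptions for readability, not the implementation of any particular listed method.

```python
# Minimal multi-head attention sketch (illustrative; assumes PyTorch).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # Separate projections for queries, keys, values, plus an output projection.
        self.q_proj = nn.Linear(embed_dim, embed_dim)
        self.k_proj = nn.Linear(embed_dim, embed_dim)
        self.v_proj = nn.Linear(embed_dim, embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        batch, seq_len, embed_dim = x.shape

        # Project and reshape to (batch, num_heads, seq_len, head_dim).
        def split(t: torch.Tensor) -> torch.Tensor:
            return t.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split(self.q_proj(x)), split(self.k_proj(x)), split(self.v_proj(x))

        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / (self.head_dim ** 0.5)
        weights = F.softmax(scores, dim=-1)
        context = weights @ v  # (batch, num_heads, seq_len, head_dim)

        # Concatenate heads back to embed_dim and apply the output projection.
        context = context.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out_proj(context)


if __name__ == "__main__":
    attn = MultiHeadAttention(embed_dim=64, num_heads=8)
    out = attn(torch.randn(2, 10, 64))  # two sequences of length 10
    print(out.shape)  # torch.Size([2, 10, 64])
```

The key design point is that the embedding dimension is split evenly across heads, so each head attends over a lower-dimensional projection and the concatenated result has the same width as the input.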