Attention Mechanisms

General • Attention • 82 methods

Attention Mechanisms are components used in neural networks to model long-range interactions, for example across a text in NLP. The key idea is to create direct shortcuts between the output and every part of the input: the model computes a weighted sum of input representations, with weights reflecting how relevant each part is to the current step, so it can attend to different parts of the input as needed. Below you can find a continuously updating list of attention mechanisms.
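To make the weighted-sum idea concrete, here is a minimal sketch of scaled dot-product attention, the variant introduced in "Attention Is All You Need" (Vaswani et al., 2017); the function name and the NumPy implementation are illustrative, not any particular library's API:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each output row is a weighted sum of the rows of V.

    The weight on value i is the softmax-normalized similarity
    between the query and key i, scaled by sqrt(d_k) to keep
    dot products in a stable range.
    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    """
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)          # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over keys
    return weights @ V                                    # weighted sum of values
```

Because the weights form a distribution over the keys, the output always lies in the convex hull of the value vectors; if all keys are identical, every value receives equal weight and the output is simply the mean of V.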

Method Year Papers
[Table of 82 attention methods with publication year and paper count; the method names were lost in extraction. Years span 2014–2022, with the top entry (2017, 16,848 papers) accounting for the large majority of usage.]