Content-based attention is an attention mechanism based on cosine similarity:
$$f_{att}\left(\textbf{h}_{i}, \textbf{s}_{j}\right) = \cos\left[\textbf{h}_{i};\textbf{s}_{j}\right] $$
It was utilised in Neural Turing Machines as part of the Addressing Mechanism.
We produce a normalized attention weighting by taking a softmax over these attention alignment scores.
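The mechanism described above can be sketched in a few lines of NumPy: score each candidate vector $\textbf{h}_i$ against a query $\textbf{s}_j$ with cosine similarity, then softmax the scores into attention weights. The function name and toy vectors are illustrative, not from the original paper.

```python
import numpy as np

def content_based_attention(H, s):
    """Cosine-similarity attention weights over rows of H for query s.
    (Illustrative sketch of the mechanism, not the NTM reference code.)"""
    # alignment scores: cos(h_i, s) for each candidate vector h_i
    sims = H @ s / (np.linalg.norm(H, axis=1) * np.linalg.norm(s) + 1e-8)
    # softmax over the alignment scores -> normalized attention weighting
    e = np.exp(sims - sims.max())
    return e / e.sum()

H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # candidate vectors h_i
s = np.array([1.0, 1.0])      # query vector s_j
w = content_based_attention(H, s)
print(w)  # weights sum to 1; the row most similar to s gets the largest weight
```

The softmax ensures the weights are positive and sum to one, so they can be used directly to form a weighted combination of the candidate vectors.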
Source: Neural Turing Machines
Task | Papers | Share
---|---|---
Question Answering | 5 | 10.42%
Speech Recognition | 3 | 6.25%
Machine Translation | 3 | 6.25%
Translation | 3 | 6.25%
Automatic Speech Recognition (ASR) | 2 | 4.17%
Retrieval | 2 | 4.17%
Image Classification | 2 | 4.17%
Sentence | 2 | 4.17%
BIG-bench Machine Learning | 2 | 4.17%