Bi-attention employs the attention-in-attention (AiA) mechanism to capture second-order statistical information: the outer point-wise channel attention vectors are computed from the output of the inner channel attention.
Source: Bilinear Attention Networks for Person Retrieval
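The two-stage gating described above can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's exact formulation: the global average pooling, sigmoid gates, and the weight matrices `w_inner`/`w_outer` are all assumptions made for the example. The key point it shows is that the outer channel attention is computed from the *output* of the inner channel attention, so the outer gate depends on already-reweighted statistics.

```python
import numpy as np

def channel_attention(x, w):
    """One channel-attention gate (hypothetical form).

    x: (C, H, W) feature map; w: (C, C) learned projection (assumed).
    Returns a per-channel gate in (0, 1).
    """
    pooled = x.mean(axis=(1, 2))            # global average pooling -> (C,)
    logits = w @ pooled                     # channel mixing
    return 1.0 / (1.0 + np.exp(-logits))    # sigmoid gate over channels

def attention_in_attention(x, w_inner, w_outer):
    """Attention-in-attention (AiA) sketch: outer gate computed from
    the inner-attended features, capturing second-order interactions."""
    a_inner = channel_attention(x, w_inner)      # inner attention, (C,)
    x_inner = x * a_inner[:, None, None]         # gate the features
    a_outer = channel_attention(x_inner, w_outer)  # outer attention from inner output
    return x_inner * a_outer[:, None, None]
```

Because each gate lies in (0, 1), the output is a doubly-attenuated copy of the input whose second gate is a function of the first, which is what lets the composition encode second-order channel statistics.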
| Task | Papers | Share |
|---|---|---|
| 3D Reconstruction | 1 | 7.14% |
| Depth Estimation | 1 | 7.14% |
| ERP | 1 | 7.14% |
| Disease Prediction | 1 | 7.14% |
| Decision Making | 1 | 7.14% |
| Knowledge Graphs | 1 | 7.14% |
| Sentence | 1 | 7.14% |
| Few-Shot Learning | 1 | 7.14% |
| Metric Learning | 1 | 7.14% |