Sinkhorn Transformer

Introduced by Tay et al. in Sparse Sinkhorn Attention

The Sinkhorn Transformer is a type of transformer that uses Sparse Sinkhorn Attention as a building block. This component is a plug-in replacement for dense, fully-connected attention (as well as for local attention and other sparse attention alternatives): it learns a differentiable sorting of sequence blocks, which reduces memory complexity while producing sparse attention patterns.

Source: Sparse Sinkhorn Attention
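The core idea can be sketched in NumPy: Sinkhorn iterations (alternating row and column normalization) turn a block-to-block score matrix into a near-doubly-stochastic soft permutation, which is used to sort key/value blocks before each query block attends only within its matched block. This is a minimal illustrative sketch, not the paper's exact algorithm; the function names and the mean-pooled block summaries are assumptions made for this example.

```python
import numpy as np

def sinkhorn(scores, n_iters=20):
    # Alternately normalize rows and columns so exp(scores)
    # approaches a doubly-stochastic (soft permutation) matrix.
    P = np.exp(scores)
    for _ in range(n_iters):
        P = P / P.sum(axis=1, keepdims=True)  # row normalization
        P = P / P.sum(axis=0, keepdims=True)  # column normalization
    return P

def block_sparse_sinkhorn_attention(q, k, v, block_size, n_iters=20):
    # Hypothetical minimal sketch: softly sort key/value blocks with a
    # Sinkhorn-normalized block permutation, then attend locally so each
    # query block only sees one (mixed) key/value block.
    L, d = q.shape
    nb = L // block_size
    qb = q.reshape(nb, block_size, d)
    kb = k.reshape(nb, block_size, d)
    vb = v.reshape(nb, block_size, d)
    # Block summaries (mean pooling here, an assumption) score
    # block-to-block affinity for the sorting network.
    P = sinkhorn(qb.mean(axis=1) @ kb.mean(axis=1).T / np.sqrt(d), n_iters)
    # Softly permute key/value blocks toward their matched query blocks.
    k_sorted = np.einsum('ij,jbd->ibd', P, kb)
    v_sorted = np.einsum('ij,jbd->ibd', P, vb)
    out = np.empty_like(qb)
    for i in range(nb):
        # Standard softmax attention, restricted to the matched block.
        a = qb[i] @ k_sorted[i].T / np.sqrt(d)
        a = np.exp(a - a.max(axis=1, keepdims=True))
        a /= a.sum(axis=1, keepdims=True)
        out[i] = a @ v_sorted[i]
    return out.reshape(L, d)
```

Because each query block attends to a single block of size B rather than the full sequence, the attention matrices are (B x B) per block instead of one dense (L x L) matrix, which is the source of the memory savings.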


Tasks

Task                        Papers  Share
Document Classification    1       25.00%
Image Generation           1       25.00%
Language Modelling         1       25.00%
Natural Language Inference 1       25.00%
