Audio Model Blocks

Beneš Block with Residual Switch Units

Introduced by Draguns et al. in Residual Shuffle-Exchange Networks for Fast Processing of Long Sequences

The Beneš block is a computationally efficient alternative to dense attention, enabling the modelling of long-range dependencies in O(n log n) time. In comparison, dense attention, which is commonly used in Transformers, has O(n^2) complexity.
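
To illustrate where the O(n log n) cost comes from, here is a minimal PyTorch sketch, not the authors' implementation: the names SwitchUnit, BenesBlock, perfect_shuffle and inverse_shuffle, and the simplified residual pair-mixing are assumptions made for this example (the actual Residual Switch Unit also uses normalisation and a gated update). The block stacks O(log n) layers, each doing O(n) work on adjacent pairs, with shuffle permutations rewiring positions between layers.

```python
import torch
import torch.nn as nn


class SwitchUnit(nn.Module):
    """Simplified stand-in for a Residual Switch Unit: each adjacent pair of
    positions is mixed by a small residual feed-forward layer."""

    def __init__(self, dim):
        super().__init__()
        self.mix = nn.Linear(2 * dim, 2 * dim)

    def forward(self, x):                          # x: (batch, n, dim), n even
        b, n, d = x.shape
        pairs = x.reshape(b, n // 2, 2 * d)        # group positions (2i, 2i+1)
        pairs = pairs + torch.relu(self.mix(pairs))  # residual pair mixing
        return pairs.reshape(b, n, d)


def perfect_shuffle(x):
    """Riffle-shuffle the sequence axis: interleave the two halves."""
    b, n, d = x.shape
    return x.reshape(b, 2, n // 2, d).transpose(1, 2).reshape(b, n, d)


def inverse_shuffle(x):
    """Inverse of perfect_shuffle: de-interleave even and odd positions."""
    b, n, d = x.shape
    return x.reshape(b, n // 2, 2, d).transpose(1, 2).reshape(b, n, d)


class BenesBlock(nn.Module):
    """Switch layers interleaved with perfect shuffles (forward half) and
    inverse shuffles (reverse half). For n = 2^k positions this is O(log n)
    layers of O(n) switch units, i.e. O(n log n) work overall."""

    def __init__(self, dim, n):
        super().__init__()
        assert n > 1 and n & (n - 1) == 0, "sequence length must be a power of two"
        k = n.bit_length() - 1
        self.forward_layers = nn.ModuleList([SwitchUnit(dim) for _ in range(k)])
        self.reverse_layers = nn.ModuleList([SwitchUnit(dim) for _ in range(k)])

    def forward(self, x):
        for layer in self.forward_layers:
            x = perfect_shuffle(layer(x))
        for layer in self.reverse_layers:
            x = inverse_shuffle(layer(x))
        return x


block = BenesBlock(dim=64, n=1024)
out = block(torch.randn(2, 1024, 64))              # shape (2, 1024, 64)
```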

In music, dependencies occur on several scales, including a coarse scale that requires processing very long sequences. Beneš blocks have been used in Residual Shuffle-Exchange Networks to achieve state-of-the-art results in music transcription.

Beneš blocks have a ‘receptive field’ that spans the whole sequence, and they have no bottleneck. These properties hold for dense attention but have not been shown for many sparse attention and dilated convolutional architectures.
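
The full-sequence receptive field can be checked with a small boolean-reachability computation; the sketch below is illustrative and not from the paper (the function name full_receptive_field and the exchange/shuffle layout mirror the simplified block above and are assumptions). It propagates "which inputs can influence this position" through exchange and shuffle steps and confirms that every output position can be reached from every input after roughly 2·log2(n) layers, while every layer keeps all n positions (no narrowing bottleneck).

```python
import numpy as np


def full_receptive_field(n: int) -> bool:
    """True if, after ~2*log2(n) exchange+shuffle stages, every output position
    can be influenced by every input position (n must be a power of two)."""
    reach = np.eye(n, dtype=bool)          # reach[i, j]: position i sees input j
    k = n.bit_length() - 1
    perm = np.arange(n).reshape(2, n // 2).T.reshape(-1)   # perfect shuffle
    for _ in range(2 * k):
        # exchange: adjacent pairs (2i, 2i+1) share what they have seen
        merged = reach.reshape(n // 2, 2, n).any(axis=1)
        reach = np.repeat(merged, 2, axis=0)
        # shuffle: permute positions by the perfect shuffle
        reach = reach[perm]
    return bool(reach.all())


print(full_receptive_field(1024))          # True: full receptive field
```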

Source: Residual Shuffle-Exchange Networks for Fast Processing of Long Sequences

Tasks


Task                 Papers   Share
LAMBADA              1        33.33%
Language Modelling   1        33.33%
Music Transcription  1        33.33%
