Activation Functions

Gated Linear Unit

Introduced by Dauphin et al. in Language Modeling with Gated Convolutional Networks

A Gated Linear Unit, or GLU, computes:

$$ \text{GLU}\left(a, b\right) = a\otimes \sigma\left(b\right) $$

It is used in natural language processing architectures, for example the Gated CNN, where $b$ is the gate that controls what information from $a$ is passed up to the following layer. Intuitively, for a language modeling task, the gating mechanism allows selection of the words or features that are important for predicting the next word. The GLU is non-linear, but it retains a linear path for the gradient, which mitigates the vanishing gradient problem.

Source: Language Modeling with Gated Convolutional Networks
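A minimal sketch of the computation in PyTorch (the framework choice is illustrative, not from the source; the original paper produces $a$ and $b$ with convolutional layers, while here they are simply two halves of an input tensor):

```python
import torch
import torch.nn.functional as F

def glu(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # GLU(a, b) = a * sigmoid(b): b gates the information flowing from a.
    return a * torch.sigmoid(b)

# In practice the two operands usually come from a single linear (or
# convolutional) projection whose output is split along a feature dimension.
x = torch.randn(2, 8)          # batch of 2, 8 features
a, b = x.chunk(2, dim=-1)      # two halves of 4 features each
out = glu(a, b)                # shape (2, 4)

# PyTorch also ships this operation as F.glu, which performs the split itself.
assert torch.allclose(out, F.glu(x, dim=-1))
```

Because the sigmoid gate saturates while the $a$ path stays linear, the gradient with respect to $a$ is simply $\sigma(b)$, which is never zero; this is the "linear path for the gradient" mentioned above.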

Tasks

Task | Papers | Share
Language Modelling | 94 | 9.05%
Question Answering | 60 | 5.77%
Decoder | 48 | 4.62%
Sentence | 39 | 3.75%
Text Generation | 38 | 3.66%
Retrieval | 33 | 3.18%
Translation | 27 | 2.60%
Machine Translation | 23 | 2.21%
Natural Language Understanding | 20 | 1.92%

