Convolutional GRU

A Convolutional Gated Recurrent Unit (ConvGRU) is a GRU variant in which the fully connected gate transformations are replaced by convolutions. The update rule for input $x_{t}$ and previous hidden state $h_{t-1}$ is given by the following:

$$ r = \sigma\left(W_{r} \star_{n}\left[h_{t-1};x_{t}\right] + b_{r}\right) $$

$$ u = \sigma\left(W_{u} \star_{n}\left[h_{t-1};x_{t}\right] + b_{u} \right) $$

$$ c = \rho\left(W_{c} \star_{n}\left[x_{t}; r \odot h_{t-1}\right] + b_{c} \right) $$

$$ h_{t} = u \odot h_{t-1} + \left(1-u\right) \odot c $$

In these equations, $\sigma$ and $\rho$ denote the elementwise sigmoid and ReLU functions respectively, and $\star_{n}$ denotes a convolution with a kernel of size $n \times n$. Brackets denote feature concatenation along the channel dimension.

Source: Delving Deeper into Convolutional Networks for Learning Video Representations
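As a concrete illustration, below is a minimal PyTorch sketch of one ConvGRU step implementing the equations above. The class and parameter names (`ConvGRUCell`, `in_channels`, `hidden_channels`, `kernel_size`) are illustrative choices, not from the source paper; the two gate convolutions are fused into a single `nn.Conv2d` for convenience.

```python
import torch
import torch.nn as nn


class ConvGRUCell(nn.Module):
    """Sketch of a single ConvGRU step following the update rule above."""

    def __init__(self, in_channels, hidden_channels, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2  # "same" padding preserves spatial size
        # W_r and W_u, fused: both act on the concatenation [h_{t-1}; x_t]
        self.gates = nn.Conv2d(in_channels + hidden_channels,
                               2 * hidden_channels, kernel_size,
                               padding=padding)
        # W_c acts on the concatenation [x_t; r ⊙ h_{t-1}]
        self.candidate = nn.Conv2d(in_channels + hidden_channels,
                                   hidden_channels, kernel_size,
                                   padding=padding)

    def forward(self, x_t, h_prev):
        # r, u = sigmoid(W_{r,u} ⋆ [h_{t-1}; x_t] + b_{r,u})
        gates = torch.sigmoid(self.gates(torch.cat([h_prev, x_t], dim=1)))
        r, u = gates.chunk(2, dim=1)
        # c = ReLU(W_c ⋆ [x_t; r ⊙ h_{t-1}] + b_c)
        c = torch.relu(self.candidate(torch.cat([x_t, r * h_prev], dim=1)))
        # h_t = u ⊙ h_{t-1} + (1 - u) ⊙ c
        return u * h_prev + (1 - u) * c


# Usage on a batch of feature maps (shapes are illustrative):
cell = ConvGRUCell(in_channels=16, hidden_channels=32)
x = torch.randn(4, 16, 64, 64)   # x_t: one frame's feature maps
h = torch.zeros(4, 32, 64, 64)   # h_0: initial hidden state
h = cell(x, h)                   # one recurrence step
```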
