Regularization

rnnDrop

rnnDrop is a dropout-based regularization technique for recurrent neural networks. It amounts to using the same dropout mask at every timestep, dropping both the non-recurrent and recurrent connections. A simple figure explaining the idea is shown to the right. The figure shows an RNN being trained with rnnDrop for three frames $\left(t-1, t, t+1\right)$ on two different training sequences in the data (denoted as ‘sequence1’ and ‘sequence2’). The black circles denote the hidden nodes randomly omitted during training, and the dotted arrows stand for the model weights connected to those omitted nodes.
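The core idea can be sketched in a few lines: sample one dropout mask per sequence and reuse it at every timestep, so the same hidden nodes stay dropped for the whole sequence. Below is a minimal NumPy sketch of a vanilla RNN forward pass with rnnDrop; the function name, weight shapes, and the `p_drop` value are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_drop_forward(xs, W_xh, W_hh, b, p_drop=0.5):
    """Forward pass of a vanilla RNN over one sequence with rnnDrop.

    A single dropout mask on the hidden units is sampled once per
    sequence and reused at every timestep, so both the recurrent
    (h -> h) and non-recurrent (x -> h) contributions of a dropped
    node are removed for the entire sequence.
    (Illustrative sketch; shapes and p_drop are assumptions.)
    """
    hidden_size = W_hh.shape[0]
    # One mask per training sequence, NOT per timestep -- the key idea.
    mask = (rng.random(hidden_size) >= p_drop).astype(xs.dtype)
    h = np.zeros(hidden_size, dtype=xs.dtype)
    states = []
    for x in xs:
        h = np.tanh(x @ W_xh + h @ W_hh + b)
        h = h * mask  # same nodes are zeroed at every timestep
        states.append(h)
    return np.stack(states)
```

Because the mask is fixed across timesteps, a dropped node contributes nothing through either its input or its recurrent weights anywhere in the sequence, which is what distinguishes rnnDrop from naively applying an independent dropout mask at each step.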

From: RnnDrop: A Novel Dropout for RNNs in ASR by Moon et al.

