Sequential image classification is the task of classifying images that are presented to the model as a sequence, most commonly one pixel (or one row) at a time, as in the Sequential MNIST benchmark.
(Image credit: TensorFlow-101)
Our results indicate that a simple convolutional architecture outperforms canonical recurrent networks such as LSTMs across a diverse range of tasks and datasets, while demonstrating longer effective memory.
Ranked #2 on Music Modeling on Nottingham
Language Modelling, Machine Translation, Music Modeling, Sequential Image Classification
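The convolutional sequence models described above rest on causal (and typically dilated) convolutions, where the output at time t depends only on inputs up to time t. A minimal NumPy sketch of that building block — the function name, kernel, and shapes are illustrative, not taken from the paper:

```python
import numpy as np

def causal_conv1d(x, w, dilation=1):
    """Causal 1-D convolution: output at time t depends only on x[:t+1].

    x: (T,) input sequence; w: (k,) kernel. Left-pads with zeros so the
    output has the same length as the input and no future leakage.
    """
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(w[j] * xp[pad + t - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

x = np.arange(1.0, 6.0)                              # [1, 2, 3, 4, 5]
y = causal_conv1d(x, np.array([1.0, 1.0]), dilation=2)
# each output sums the current input and the input two steps back
# → [1, 2, 4, 6, 8]
```

Stacking such layers with exponentially growing dilations is what gives these architectures their long effective memory: the receptive field doubles with each layer.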
Experimental results show that the proposed IndRNN can process very long sequences (over 5,000 time steps) and can be used to construct very deep networks (21 layers in the experiments) while still being trained robustly.
Ranked #7 on Sequential Image Classification on Sequential MNIST
Language Modelling, Sequential Image Classification, Skeleton-Based Action Recognition
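The defining feature of IndRNN is that the recurrent connection is element-wise rather than a full matrix multiply, so each hidden unit has a single scalar recurrent weight. A hedged NumPy sketch of one step — sizes and initialization are illustrative assumptions:

```python
import numpy as np

def indrnn_step(x_t, h_prev, W, u, b):
    """One IndRNN step: the recurrence u * h_prev is element-wise,
    so each neuron only feeds back into itself; units interact only
    through the input transformation W across layers."""
    return np.maximum(0.0, W @ x_t + u * h_prev + b)  # ReLU activation

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3)) * 0.1
u = rng.uniform(0.9, 1.0, size=4)   # per-unit scalar recurrent weights
b = np.zeros(4)
h = np.zeros(4)
for _ in range(10):
    h = indrnn_step(rng.standard_normal(3), h, W, u, b)
```

Because each unit's gradient through time scales with powers of its own scalar weight, keeping each |u_i| near 1 directly controls vanishing/exploding behavior per neuron, which is what makes very long sequences and very deep stacks trainable.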
On the other hand, we show that truncated recurrent networks are equivalent to trellis networks with special sparsity structure in their weight matrices.
To provide a theory-based quantification of the architecture's advantages, we introduce a memory capacity measure, the mean recurrent length, which is more suitable for RNNs with long skip connections than existing measures.
Ranked #9 on Sequential Image Classification on Sequential MNIST
Recurrent Neural Networks have long been the dominating choice for sequence modeling.
Ranked #1 on Music Modeling on Nottingham
Language Modelling, Music Modeling, Sequential Image Classification
Backpropagation through the ODE solver allows each layer to adapt its internal time-step, enabling the network to learn task-relevant time-scales.
Ranked #4 on Sequential Image Classification on Sequential MNIST
Sequential Image Classification, Time Series, Time Series Prediction
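As a rough illustration of how a time-step can be learned, consider an explicit-Euler update of continuous-time hidden dynamics: because dt multiplies the update, it sits inside the computation graph and receives a gradient like any other parameter. This is a sketch under that assumption, not the paper's actual solver or parameterization:

```python
import numpy as np

def ode_rnn_step(h, x, W, U, b, dt):
    """One explicit-Euler update of continuous-time hidden dynamics.
    Since dt scales the update, backpropagation yields a gradient
    with respect to dt as well as the weights, so the layer's
    internal time-scale can itself be trained."""
    dh = np.tanh(W @ h + U @ x + b)  # hidden-state dynamics f(h, x)
    return h + dt * dh

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4)) * 0.1
U = rng.standard_normal((4, 3)) * 0.1
b = np.zeros(4)
h = np.zeros(4)
for _ in range(5):
    h = ode_rnn_step(h, rng.standard_normal(3), W, U, b, dt=0.1)
```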
Recurrent neural networks (RNNs) are known to be difficult to train due to vanishing and exploding gradients, which makes it hard to learn long-term patterns and to construct deep networks.
Ranked #4 on Sequential Image Classification on Sequential MNIST
Language Modelling, Sequential Image Classification, Skeleton-Based Action Recognition
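The vanishing/exploding problem is easy to reproduce numerically: gradients through T steps of a linear recurrence scale like the T-th power of the recurrent matrix, so a spectral radius slightly below or above 1 makes the product collapse or blow up. A self-contained demonstration (the matrix and step count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))
A /= max(abs(np.linalg.eigvals(A)))     # rescale to spectral radius 1

def grad_norm(scale, T=50):
    """Norm of the product (scale * A)^T, a stand-in for the Jacobian
    of a linear RNN unrolled over T time steps."""
    M = np.eye(8)
    for _ in range(T):
        M = (scale * A) @ M
    return np.linalg.norm(M)

vanish = grad_norm(0.9)   # shrinks roughly like 0.9**50
explode = grad_norm(1.1)  # grows roughly like 1.1**50
```

The ratio explode / vanish is about (1.1 / 0.9)**50 ≈ 2e4, which is why even small deviations of the recurrent weights from unit scale make long-range credit assignment unstable.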
We propose a reparameterization of LSTM that brings the benefits of batch normalization to recurrent neural networks.
Ranked #8 on Sequential Image Classification on Sequential MNIST
Language Modelling, Question Answering, Reading Comprehension, Sequential Image Classification
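The reparameterization normalizes the input-to-hidden and hidden-to-hidden contributions separately before the gate nonlinearities. A minimal batch-level sketch — per-time-step statistics, running averages, and all shapes are simplified assumptions, not the paper's exact formulation:

```python
import numpy as np

def norm(z, gamma=0.1, eps=1e-5):
    """Normalize pre-activations across the batch, then rescale.
    A small gamma keeps sigmoid/tanh inputs near their linear regime."""
    return gamma * (z - z.mean(axis=0)) / np.sqrt(z.var(axis=0) + eps)

def bn_lstm_step(x, h, c, Wx, Wh, b):
    # Normalize the input and recurrent contributions separately,
    # then form the four gate pre-activations.
    z = norm(x @ Wx) + norm(h @ Wh) + b
    i, f, o, g = np.split(z, 4, axis=1)
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    c = sig(f) * c + sig(i) * np.tanh(g)   # cell update
    h = sig(o) * np.tanh(c)                # hidden output
    return h, c

rng = np.random.default_rng(2)
B, d, H = 4, 3, 5                          # batch, input, hidden sizes
Wx = rng.standard_normal((d, 4 * H)) * 0.1
Wh = rng.standard_normal((H, 4 * H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros((B, H)), np.zeros((B, H))
h, c = bn_lstm_step(rng.standard_normal((B, d)), h, c, Wx, Wh, b)
```

Normalizing the two streams separately matters because the input and recurrent pre-activations have different statistics, especially early in a sequence.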
To address this question, we propose full-capacity uRNNs that optimize their recurrence matrix over all unitary matrices, leading to significantly improved performance over uRNNs that use a restricted-capacity recurrence matrix.
Ranked #10 on Sequential Image Classification on Sequential MNIST
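One standard way to keep a recurrence matrix exactly unitary while optimizing over the full unitary group is to project back onto it after each gradient step via the polar decomposition; whether this matches the paper's optimizer is an assumption, but the projection itself is a textbook operation:

```python
import numpy as np

def project_unitary(W):
    """Nearest orthogonal/unitary matrix to W in Frobenius norm:
    take the SVD and discard the singular values."""
    U, _, Vt = np.linalg.svd(W)
    return U @ Vt

rng = np.random.default_rng(3)
W = rng.standard_normal((5, 5))   # arbitrary (non-unitary) update
Q = project_unitary(W)
# Q.T @ Q equals the identity up to floating-point error
```

Restricted-capacity parameterizations, by contrast, compose a fixed set of structured unitary factors and therefore can only reach a subset of unitary matrices.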
When the eigenvalues of the hidden-to-hidden weight matrix deviate from an absolute value of 1, optimization becomes difficult due to the well-studied problem of vanishing and exploding gradients, especially when trying to learn long-term dependencies.
Ranked #11 on Sequential Image Classification on Sequential MNIST
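The eigenvalue condition above can be checked numerically: an orthogonal (real unitary) recurrence matrix has all eigenvalues of absolute value 1, so repeated application preserves vector norms and signals neither vanish nor explode. A self-contained demonstration, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
# QR of a random Gaussian matrix yields a random orthogonal matrix.
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))

v = rng.standard_normal(6)
u = v.copy()
for _ in range(1000):       # 1000 recurrent steps
    u = Q @ u
# ||u|| equals ||v|| up to floating-point error: no vanishing,
# no exploding, even over very long horizons
```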