Sequential Image Classification

37 papers with code • 3 benchmarks • 3 datasets

Sequential image classification is the task of classifying an image that is presented to the model as a sequence, most commonly by feeding its pixels in one step at a time, as in the sequential and permuted MNIST benchmarks.

(Image credit: TensorFlow-101)
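
For concreteness, here is a minimal sketch of the pixel-by-pixel setup described above, assuming a PyTorch LSTM read-out over flattened MNIST images; the class and parameter names are illustrative and not taken from any of the repositories listed below.

```python
import torch
import torch.nn as nn

class PixelRNNClassifier(nn.Module):
    """Minimal sketch: classify an image by reading its pixels as a 1-D sequence."""
    def __init__(self, hidden_size=128, num_classes=10):
        super().__init__()
        self.rnn = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, images):                    # images: (batch, 1, 28, 28)
        pixels = images.flatten(1).unsqueeze(-1)  # (batch, 784, 1): one pixel per step
        _, (h_n, _) = self.rnn(pixels)            # final hidden state summarizes the sequence
        return self.head(h_n[-1])                 # class logits

logits = PixelRNNClassifier()(torch.rand(8, 1, 28, 28))
print(logits.shape)  # torch.Size([8, 10])
```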

Libraries

Use these libraries to find Sequential Image Classification models and implementations

Most implemented papers

An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling

locuslab/TCN 4 Mar 2018

Our results indicate that a simple convolutional architecture outperforms canonical recurrent networks such as LSTMs across a diverse range of tasks and datasets, while demonstrating longer effective memory.
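
A hedged sketch of the dilated causal convolution at the core of such a temporal convolutional network; this simplifies the residual blocks in locuslab/TCN, and the names are chosen for illustration.

```python
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """Sketch of a dilated causal convolution: the output at time t sees only inputs <= t."""
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation            # left-pad so no future leakage
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                                  # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))            # pad on the left only
        return torch.relu(self.conv(x))

# Stacking blocks with exponentially growing dilation gives a long effective memory.
net = nn.Sequential(*[CausalConv1d(16, dilation=2 ** i) for i in range(4)])
print(net(torch.randn(2, 16, 100)).shape)  # torch.Size([2, 16, 100])
```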

Independently Recurrent Neural Network (IndRNN): Building A Longer and Deeper RNN

Sunnydreamrain/IndRNN_Theano_Lasagne CVPR 2018

Experimental results have shown that the proposed IndRNN is able to process very long sequences (over 5000 time steps), can be used to construct very deep networks (21 layers used in the experiment) and still be trained robustly.
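
The core of the cell is an element-wise (rather than matrix) recurrence, as in this minimal sketch; names are illustrative and this is not the reference Theano/Lasagne code.

```python
import torch
import torch.nn as nn

class IndRNNCell(nn.Module):
    """Sketch of an IndRNN step: each hidden unit has its own scalar recurrent weight,
    i.e. h_t = relu(W x_t + u * h_{t-1} + b) with an element-wise recurrence."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.in_proj = nn.Linear(input_size, hidden_size)
        self.u = nn.Parameter(torch.empty(hidden_size).uniform_(-1.0, 1.0))

    def forward(self, x_t, h_prev):
        return torch.relu(self.in_proj(x_t) + self.u * h_prev)

cell = IndRNNCell(32, 64)
h = torch.zeros(8, 64)
for x_t in torch.randn(100, 8, 32):   # iterate over 100 time steps
    h = cell(x_t, h)
print(h.shape)  # torch.Size([8, 64])
```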

Efficiently Modeling Long Sequences with Structured State Spaces

hazyresearch/state-spaces ICLR 2022

A central goal of sequence modeling is designing a single principled model that can address sequence data across a range of modalities and tasks, particularly on long-range dependencies.
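
As a rough illustration of the recurrent view of a state-space layer, here is a toy diagonal linear recurrence; S4 itself uses a structured, HiPPO-initialized state matrix and an equivalent convolutional formulation, neither of which is shown here.

```python
import torch

def diagonal_ssm_scan(u, a, b, c):
    """Toy linear state-space recurrence: x_k = a * x_{k-1} + b * u_k, y_k = sum(c * x_k).
    Only meant to illustrate the recurrent view of an SSM layer."""
    x = torch.zeros_like(a)
    ys = []
    for u_k in u:                        # u: (time,) scalar input sequence
        x = a * x + b * u_k              # element-wise (diagonal) state update
        ys.append((c * x).sum())
    return torch.stack(ys)

state = 16
a = torch.rand(state) * 0.1 + 0.9        # eigenvalues close to 1 -> long memory
y = diagonal_ssm_scan(torch.randn(200), a, torch.randn(state), torch.randn(state))
print(y.shape)  # torch.Size([200])
```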

Resurrecting Recurrent Neural Networks for Long Sequences

bojone/rnn 11 Mar 2023

Recurrent Neural Networks (RNNs) offer fast inference on long sequences but are hard to optimize and slow to train.

A Simple Way to Initialize Recurrent Networks of Rectified Linear Units

facebookresearch/salina 3 Apr 2015

Learning long-term dependencies in recurrent networks is difficult due to vanishing and exploding gradients.
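
The proposed fix, per the title, is a plain RNN of rectified linear units whose recurrent matrix is initialized to the identity (with zero biases); a minimal sketch, with an illustrative helper name:

```python
import torch
import torch.nn as nn

def make_irnn_cell(input_size, hidden_size):
    """Sketch of the IRNN recipe: a ReLU RNN cell whose recurrent matrix starts at the
    identity and whose biases start at zero, which keeps early gradients well-behaved."""
    cell = nn.RNNCell(input_size, hidden_size, nonlinearity="relu")
    with torch.no_grad():
        cell.weight_hh.copy_(torch.eye(hidden_size))
        cell.bias_ih.zero_()
        cell.bias_hh.zero_()
    return cell

cell = make_irnn_cell(32, 128)
h = torch.zeros(4, 128)
for x_t in torch.randn(50, 4, 32):
    h = cell(x_t, h)
print(h.shape)  # torch.Size([4, 128])
```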

Recurrent Batch Normalization

cooijmanstim/recurrent-batch-normalization 30 Mar 2016

We propose a reparameterization of LSTM that brings the benefits of batch normalization to recurrent neural networks.
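
A hedged sketch of one batch-normalized LSTM step, normalizing the input-to-hidden and hidden-to-hidden contributions separately; the paper additionally normalizes the cell state and keeps per-time-step statistics, which are omitted here for brevity.

```python
import torch
import torch.nn as nn

class BNLSTMStep(nn.Module):
    """Sketch of a batch-normalized LSTM step: the two linear terms feeding the gates
    are normalized separately before being summed."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.wx = nn.Linear(input_size, 4 * hidden_size, bias=False)
        self.wh = nn.Linear(hidden_size, 4 * hidden_size, bias=False)
        self.bn_x = nn.BatchNorm1d(4 * hidden_size)
        self.bn_h = nn.BatchNorm1d(4 * hidden_size)
        self.bias = nn.Parameter(torch.zeros(4 * hidden_size))

    def forward(self, x_t, h, c):
        gates = self.bn_x(self.wx(x_t)) + self.bn_h(self.wh(h)) + self.bias
        i, f, g, o = gates.chunk(4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

step = BNLSTMStep(32, 64)
h = c = torch.zeros(8, 64)
for x_t in torch.randn(20, 8, 32):
    h, c = step(x_t, h, c)
print(h.shape)  # torch.Size([8, 64])
```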

Gating Revisited: Deep Multi-layer RNNs That Can Be Trained

0zgur0/STAR_Network 25 Nov 2019

We propose a new STAckable Recurrent cell (STAR) for recurrent neural networks (RNNs), which has fewer parameters than the widely used LSTM and GRU cells while being more robust against vanishing or exploding gradients.

Unitary Evolution Recurrent Neural Networks

v0lta/Complex-gated-recurrent-neural-networks 20 Nov 2015

When the eigenvalues of the hidden-to-hidden weight matrix deviate from absolute value 1, optimization becomes difficult due to the well-studied issue of vanishing and exploding gradients, especially when trying to learn long-term dependencies.
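
That effect is easy to see numerically; a toy demonstration of the sensitivity to eigenvalue magnitude (this is not the uRNN parameterization itself):

```python
import math
import torch

def norm_after(steps, scale):
    """Repeatedly apply a scaled 2-D rotation whose eigenvalues have absolute value `scale`."""
    theta = 0.3
    rotation = torch.tensor([[math.cos(theta), -math.sin(theta)],
                             [math.sin(theta),  math.cos(theta)]])
    v = torch.ones(2)
    for _ in range(steps):
        v = (scale * rotation) @ v
    return v.norm().item()

for scale in (0.9, 1.0, 1.1):
    # ~0 for 0.9 (vanishing), constant for 1.0 (unitary), huge for 1.1 (exploding)
    print(scale, norm_after(100, scale))
```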

Full-Capacity Unitary Recurrent Neural Networks

stwisdom/urnn NeurIPS 2016

We propose full-capacity uRNNs that optimize their recurrence matrix over all unitary matrices, leading to significantly improved performance over uRNNs that use a restricted-capacity recurrence matrix.
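
One way to cover the full unitary group is to exponentiate a skew-Hermitian parameter matrix, as in the hedged sketch below; the paper itself instead takes gradient steps directly on the unitary matrix and re-projects with a Cayley-type retraction, so this is an illustration of "full capacity", not the paper's exact update rule.

```python
import torch

n = 8
raw = torch.randn(n, n, dtype=torch.cdouble)   # in a uRNN this would be the trainable parameter

def unitary_from(raw):
    skew = raw - raw.conj().transpose(-2, -1)  # A = R - R^H is skew-Hermitian
    return torch.linalg.matrix_exp(skew)       # exp of a skew-Hermitian matrix is unitary

u = unitary_from(raw)
identity = torch.eye(n, dtype=torch.cdouble)
print(torch.allclose(u @ u.conj().transpose(-2, -1), identity, atol=1e-8))  # True
```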

Dilated Recurrent Neural Networks

code-terminator/DilatedRNN NeurIPS 2017

To provide a theory-based quantification of the architecture's advantages, we introduce a memory capacity measure, the mean recurrent length, which is more suitable for RNNs with long skip connections than existing measures.
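
A minimal sketch of a single dilated recurrent layer, where the recurrent connection skips `dilation` steps; names are illustrative, and stacked layers typically use exponentially increasing dilations.

```python
import torch
import torch.nn as nn

def dilated_rnn_layer(cell, inputs, dilation):
    """Sketch of a dilated recurrent layer: the state at step t is updated from the
    state at step t - dilation, i.e. the recurrent skip connection spans `dilation` steps."""
    hidden = [torch.zeros(inputs.size(1), cell.hidden_size)] * dilation
    outputs = []
    for t, x_t in enumerate(inputs):        # inputs: (time, batch, features)
        h_t = cell(x_t, hidden[t])          # recurrence uses h_{t - dilation}
        hidden.append(h_t)
        outputs.append(h_t)
    return torch.stack(outputs)

cell = nn.GRUCell(32, 64)
x = torch.randn(100, 8, 32)
print(dilated_rnn_layer(cell, x, dilation=4).shape)  # torch.Size([100, 8, 64])
```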