Sequential Image Classification

37 papers with code • 3 benchmarks • 3 datasets

Sequential image classification is the task of classifying an image that is presented to the model as a sequence, typically one pixel (or one small group of pixels) at a time, as in the Sequential MNIST, permuted Sequential MNIST (psMNIST), and sequential CIFAR-10 benchmarks. Because the model only sees a long one-dimensional stream, the task is a standard stress test for long-range sequence models.
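
A minimal NumPy sketch of how such pixel sequences are typically constructed for the sMNIST and psMNIST benchmarks; the random image below is an illustrative stand-in for an MNIST digit:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 28x28 grayscale image (stand-in for an MNIST digit).
image = rng.random((28, 28)).astype(np.float32)

# Sequential MNIST (sMNIST): flatten the image into a length-784
# sequence of single pixels, read in row-major (scanline) order.
smnist_sequence = image.reshape(-1, 1)           # shape (784, 1)

# Permuted sequential MNIST (psMNIST): apply one fixed random
# permutation to the pixel order, shared across the whole dataset.
# This destroys local 2-D structure and lengthens the dependencies
# the model must track, making the task considerably harder.
permutation = rng.permutation(784)
psmnist_sequence = smnist_sequence[permutation]  # shape (784, 1)
```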

(Image credit: TensorFlow-101)

Most implemented papers

R-Transformer: Recurrent Neural Network Enhanced Transformer

DSE-MSU/R-transformer ICLR 2020

Recurrent neural networks have long been the dominant choice for sequence modeling; the R-Transformer combines multi-head attention with local RNNs so as to capture both local structure and global dependencies.

Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks

abr/neurips2019 NeurIPS 2019

Backpropagation through the ODE solver allows each layer to adapt its internal time-step, enabling the network to learn task-relevant time-scales.
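
The LMU's memory is a fixed linear ODE whose state-space matrices are derived from Legendre polynomials. Below is a minimal NumPy/SciPy sketch following the (A, B) construction used in public LMU implementations; the order, window length theta, step size, and toy input are illustrative, and the full unit adds learned encoders and a nonlinear hidden state:

```python
import numpy as np
from scipy.signal import cont2discrete

def lmu_matrices(order: int, theta: float):
    """Continuous-time (A, B) of the LMU's linear memory: the state
    approximates a sliding window of length theta of the input signal,
    projected onto the first `order` Legendre polynomials."""
    q = np.arange(order, dtype=np.float64)
    r = (2 * q + 1)[:, None] / theta
    i, j = np.meshgrid(q, q, indexing="ij")
    A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * r
    B = ((-1.0) ** q)[:, None] * r
    return A, B

order, theta, dt = 6, 100.0, 1.0
A, B = lmu_matrices(order, theta)

# Zero-order-hold discretization gives the recurrent update
# m_t = Ad @ m_{t-1} + Bd * u_t, where u_t is the scalar input.
Ad, Bd, *_ = cont2discrete((A, B, np.eye(order), np.zeros((order, 1))), dt=dt)

m = np.zeros((order, 1))
for u_t in np.sin(np.linspace(0, 10, 200)):  # toy input signal
    m = Ad @ m + Bd * u_t                    # sliding Legendre memory
```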

Learning Long-Term Dependencies in Irregularly-Sampled Time Series

mlech26l/learning-long-term-irregular-ts NeurIPS 2020

Recurrent networks with continuous-time hidden states are a natural fit for irregularly-sampled time series; these models, however, face difficulties when the input data possess long-term dependencies.

HiPPO: Recurrent Memory with Optimal Polynomial Projections

HazyResearch/hippo-code NeurIPS 2020

A central problem in learning from sequential data is representing cumulative history in an incremental fashion as more data is processed.
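
HiPPO addresses this by maintaining the coefficients of an optimal polynomial approximation of the entire history, updated incrementally with a fixed linear recurrence. A minimal NumPy sketch of the HiPPO-LegS variant from the paper; the state size N and toy input stream are illustrative:

```python
import numpy as np

def hippo_legs_matrices(N: int):
    """HiPPO-LegS (A, B): projects the full history onto the first
    N scaled Legendre polynomials."""
    n = np.arange(N)
    row, col = np.meshgrid(n, n, indexing="ij")
    A = np.zeros((N, N))
    A[row > col] = np.sqrt((2 * row + 1) * (2 * col + 1))[row > col]
    A[np.diag_indices(N)] = n + 1
    B = np.sqrt(2 * n + 1.0)
    return A, B

N = 8
A, B = hippo_legs_matrices(N)

# Incremental update: after step k, c holds the coefficients of an
# order-N polynomial approximation of the *entire* input so far.
c = np.zeros(N)
f = np.sin(np.linspace(0, 6, 300))               # toy input stream
for k, f_k in enumerate(f, start=1):
    c = (np.eye(N) - A / k) @ c + (B / k) * f_k  # discretized LegS update
```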

Parallelizing Legendre Memory Unit Training

hrshtv/pytorch-lmu 22 Feb 2021

Because the LMU's memory component is linear and time-invariant, its training can be parallelized across the sequence; the resulting model sets a new state-of-the-art result on psMNIST, and uses half the parameters while outperforming DistilBERT and LSTM models on IMDB sentiment analysis.

Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers

hazyresearch/state-spaces NeurIPS 2021

Recurrent neural networks (RNNs), temporal convolutions, and neural differential equations (NDEs) are popular families of deep learning models for time-series data, each with unique strengths and tradeoffs in modeling power and computational efficiency.
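
Linear state-space layers exploit the fact that a discretized linear state-space model can be evaluated either as a recurrence (like an RNN) or as a convolution (parallel over time). A minimal NumPy sketch of this equivalence with random illustrative matrices; the paper instead derives the state matrix from HiPPO and learns the discretization time-step:

```python
import numpy as np

rng = np.random.default_rng(0)
d, L = 4, 32                        # state size, sequence length
Ad = 0.9 * np.eye(d) + 0.05 * rng.standard_normal((d, d))  # discretized A
Bd = rng.standard_normal((d, 1))
C = rng.standard_normal((1, d))
u = rng.standard_normal(L)          # toy input sequence

# Recurrent view: O(L) sequential state updates, like an RNN.
x = np.zeros((d, 1))
y_rec = np.zeros(L)
for k in range(L):
    x = Ad @ x + Bd * u[k]
    y_rec[k] = (C @ x).item()

# Convolutional view: the same map is a convolution with the kernel
# K[l] = C @ Ad^l @ Bd, which can be computed in parallel over time.
K = np.array([(C @ np.linalg.matrix_power(Ad, l) @ Bd).item() for l in range(L)])
y_conv = np.array([np.dot(K[: k + 1][::-1], u[: k + 1]) for k in range(L)])

assert np.allclose(y_rec, y_conv)   # both views produce identical outputs
```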

Long short-term memory and learning-to-learn in networks of spiking neurons

IGITUGraz/LSNN-official NeurIPS 2018

Recurrent networks of spiking neurons (RSNNs) underlie the astounding computing and learning capabilities of the brain.

Trellis Networks for Sequence Modeling

locuslab/trellisnet ICLR 2019

Trellis networks are temporal convolutional networks with special structure; the paper shows, conversely, that truncated recurrent networks are equivalent to trellis networks with special sparsity structure in their weight matrices.

Learning to Remember More with Less Memorization

thaihungle/UW-DNC ICLR 2019

Memory-augmented neural networks consisting of a neural controller and an external memory have shown potential in long-term sequential learning.
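
For context, a minimal NumPy sketch of the generic differentiable read/write interface such models use, with content-based addressing over a memory matrix; this is a simplified illustration of the memory-augmented setup, not the paper's specific uniform-writing scheme:

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(0)
slots, width = 8, 16
M = rng.standard_normal((slots, width))   # external memory matrix

# Content-based read: the controller emits a key and attends over slots.
key = rng.standard_normal(width)
w = softmax(M @ key / np.sqrt(width))     # soft read weights over slots
read_vector = w @ M                       # differentiable read

# Write: blend new content into the addressed slots (simplified
# additive write; NTM/DNC-style memories also use an erase step).
write_vec = rng.standard_normal(width)
M = M + np.outer(w, write_vec)            # differentiable write
```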

AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks

KurochkinAlexey/AntisymmetricRNN ICLR 2019

In this paper, we draw connections between recurrent networks and ordinary differential equations.
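
Concretely, the AntisymmetricRNN treats the hidden-state update as a forward-Euler step of an ODE whose recurrent weight matrix is antisymmetric, so the linearized dynamics have purely imaginary eigenvalues and neither explode nor vanish. A minimal NumPy sketch with illustrative dimensions and hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 3, 16
eps, gamma = 0.01, 0.01                  # step size and diffusion (illustrative)

W = rng.standard_normal((d_h, d_h)) / np.sqrt(d_h)
V = rng.standard_normal((d_h, d_in)) / np.sqrt(d_in)
b = np.zeros(d_h)

# (W - W^T) is antisymmetric, so its eigenvalues are purely imaginary;
# the small -gamma*I diffusion term keeps the Euler steps stable.
M = (W - W.T) - gamma * np.eye(d_h)

h = np.zeros(d_h)
for x_t in rng.standard_normal((100, d_in)):    # toy input sequence
    h = h + eps * np.tanh(M @ h + V @ x_t + b)  # Euler step of the ODE
```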