Sequential Image Classification
37 papers with code • 3 benchmarks • 3 datasets
Sequential image classification is the task of classifying images that are presented to the model as a sequence. In the standard benchmarks, such as sequential MNIST and permuted sequential MNIST (psMNIST), each image is fed in one pixel at a time, testing a model's ability to capture long-range dependencies.
(Image credit: TensorFlow-101)
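The pixel-sequence setup used by these benchmarks can be sketched in a few lines. This is a minimal illustration (the function name is ours, not from any benchmark code): a 2-D image is flattened into a 1-D pixel stream, and optionally reordered by a fixed random permutation to obtain the harder permuted variant.

```python
import random

def image_to_sequence(image, permutation=None):
    """Flatten a 2-D image (a list of rows) into a 1-D pixel sequence.

    With permutation=None this mirrors the sequential-MNIST (sMNIST) setup,
    reading pixels row by row. Applying a fixed random permutation gives the
    psMNIST variant, which destroys local spatial structure and makes the
    long-range dependencies harder.
    """
    seq = [pixel for row in image for pixel in row]
    if permutation is not None:
        seq = [seq[i] for i in permutation]
    return seq

# Toy 28x28 "image": all zeros with one bright pixel.
image = [[0.0] * 28 for _ in range(28)]
image[3][5] = 1.0

seq = image_to_sequence(image)           # length-784 row-major pixel stream
perm = list(range(784))
random.Random(0).shuffle(perm)           # fixed permutation, shared across images
pseq = image_to_sequence(image, perm)    # permuted (psMNIST-style) variant
```

A sequence model then consumes the 784 pixels one step at a time and predicts the class label after the final step.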
Libraries
Use these libraries to find Sequential Image Classification models and implementations.

Most implemented papers
R-Transformer: Recurrent Neural Network Enhanced Transformer
Recurrent Neural Networks have long been the dominating choice for sequence modeling.
Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks
Backpropagation through the ODE solver allows each layer to adapt its internal time-step, enabling the network to learn task-relevant time-scales.
Learning Long-Term Dependencies in Irregularly-Sampled Time Series
Recurrent models with continuous-time hidden states, however, face difficulties when the input data possess long-term dependencies.
HiPPO: Recurrent Memory with Optimal Polynomial Projections
A central problem in learning from sequential data is representing cumulative history in an incremental fashion as more data is processed.
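The idea of incrementally maintaining a compressed summary of the history can be illustrated in its simplest form. The sketch below shows only the degree-0 special case, assuming a uniform weighting of the past: projecting the history onto a constant polynomial is just the running mean, and it admits an O(1) online update with no need to store past inputs. (Full polynomial-projection memories maintain higher-degree coefficients with analogous updates; this toy is ours, not the paper's code.)

```python
def running_mean_stream(xs):
    """Incrementally project the history x_1..x_t onto a degree-0 (constant)
    polynomial under a uniform measure -- i.e. maintain the running mean.

    This is a toy special case of keeping an optimal polynomial projection
    of cumulative history: the summary c is updated in O(1) per step and the
    full history never needs to be stored.
    """
    c = 0.0
    for t, x in enumerate(xs, start=1):
        c += (x - c) / t  # incremental mean update
        yield c

means = list(running_mean_stream([1.0, 2.0, 3.0, 4.0]))  # [1.0, 1.5, 2.0, 2.5]
```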
Parallelizing Legendre Memory Unit Training
Our LMU sets a new state-of-the-art result on psMNIST, and uses half the parameters while outperforming DistilBERT and LSTM models on IMDB sentiment analysis.
Combining Recurrent, Convolutional, and Continuous-time Models with Linear State-Space Layers
Recurrent neural networks (RNNs), temporal convolutions, and neural differential equations (NDEs) are popular families of deep learning models for time-series data, each with unique strengths and tradeoffs in modeling power and computational efficiency.
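The common object underlying these families is a discretized linear state-space model, which can be unrolled step by step exactly like an RNN. A minimal sketch (plain nested lists; the function and matrices are illustrative, not the layer's actual parameterization):

```python
def ssm_recurrence(A, B, C, us):
    """Run a discretized linear state-space model as an RNN-style recurrence:

        x_k = A x_{k-1} + B u_k,    y_k = C x_k

    where A is n x n, B is length n, and C is length n (all nested lists /
    lists of floats). Because the map is linear, the same input-output
    relation can equivalently be computed as a convolution -- the flexibility
    that linear state-space layers exploit.
    """
    n = len(A)
    x = [0.0] * n
    ys = []
    for u in us:
        x = [sum(A[i][j] * x[j] for j in range(n)) + B[i] * u
             for i in range(n)]
        ys.append(sum(C[j] * x[j] for j in range(n)))
    return ys

# Impulse response of a stable 2-state system.
ys = ssm_recurrence(A=[[0.5, 0.0], [0.0, 0.1]],
                    B=[1.0, 1.0],
                    C=[1.0, 1.0],
                    us=[1.0, 0.0, 0.0])
```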
Long short-term memory and learning-to-learn in networks of spiking neurons
Recurrent networks of spiking neurons (RSNNs) underlie the astounding computing and learning capabilities of the brain.
Trellis Networks for Sequence Modeling
We show that truncated recurrent networks are equivalent to trellis networks with a special sparsity structure in their weight matrices.
Learning to Remember More with Less Memorization
Memory-augmented neural networks, consisting of a neural controller and an external memory, have shown potential in long-term sequential learning.
AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks
In this paper, we draw connections between recurrent networks and ordinary differential equations.
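The ODE view can be made concrete with one hidden-state update written as a forward-Euler step. The sketch below follows the antisymmetric parameterization described in the paper, h ← h + ε·tanh((M − Mᵀ − γI)h + Vx), where M − Mᵀ keeps the Jacobian's eigenvalues near the imaginary axis and stabilizes long-term gradient propagation; the function name and matrix shapes are our illustrative choices.

```python
import math

def antisymmetric_rnn_step(h, x, M, V, eps=0.01, gamma=0.01):
    """One hidden-state update of an antisymmetric RNN, viewed as a
    forward-Euler step of an ODE:

        h <- h + eps * tanh((M - M^T - gamma*I) h + V x)

    M is n x n, V is n x m, h has length n, x has length m (nested lists).
    The effective recurrent matrix M - M^T is antisymmetric; the small
    diagonal damping gamma keeps the dynamics stable.
    """
    n = len(h)
    # Effective recurrent weights: antisymmetric part minus diffusion term.
    W = [[M[i][j] - M[j][i] - (gamma if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    pre = [sum(W[i][j] * h[j] for j in range(n)) +
           sum(V[i][k] * x[k] for k in range(len(x)))
           for i in range(n)]
    return [h[i] + eps * math.tanh(pre[i]) for i in range(n)]

# A rotation-like 2-state example with no input and no damping.
h_next = antisymmetric_rnn_step(h=[1.0, 0.0], x=[0.0],
                                M=[[0.0, 1.0], [0.0, 0.0]],
                                V=[[0.0], [0.0]], gamma=0.0)
```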