To our knowledge, this is the first neural network architecture to outperform JPEG at image compression across most bitrates on the rate-distortion curve on the Kodak dataset, with and without the aid of entropy coding.
This paper introduces Adaptive Computation Time (ACT), an algorithm that allows recurrent neural networks to learn how many computational steps to take between receiving an input and emitting an output.
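To make the halting mechanism concrete, here is a minimal NumPy sketch of an ACT-style pondering loop: keep stepping the network and accumulating halting probabilities until they reach 1 - eps. `rnn_step` and `halt_linear` are hypothetical stand-ins for the trained transition and halting units, and `eps`/`max_steps` are illustrative defaults, not values from the paper.

```python
import numpy as np

def act_ponder(x, state, rnn_step, halt_linear, eps=0.01, max_steps=10):
    """One ACT pondering phase for a single input x (sketch, not the paper's code).

    rnn_step(x, state) -> new state; halt_linear(state) -> scalar logit.
    """
    halting_sum = 0.0
    weighted_state = np.zeros_like(state)
    for n in range(max_steps):
        state = rnn_step(x, state)
        p = 1.0 / (1.0 + np.exp(-halt_linear(state)))   # halting probability
        if halting_sum + p >= 1.0 - eps or n == max_steps - 1:
            remainder = 1.0 - halting_sum               # leftover probability mass
            weighted_state += remainder * state         # final state is a weighted mean
            ponder_cost = n + 1 + remainder             # penalized in the training loss
            return weighted_state, ponder_cost
        halting_sum += p
        weighted_state += p * state
```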
We apply recurrent neural networks (RNNs) to a new domain, namely recommender systems; a sketch of this setup follows below.
Ranked #11 on Session-Based Recommendations on yoochoose1/64
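A minimal PyTorch sketch of the session-based setup this entry describes (in the spirit of what is often called GRU4Rec): embed the clicked items, run a GRU over the session, and score every catalog item as the next click. The class name and all sizes are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class SessionGRU(nn.Module):
    """Hypothetical GRU-based session recommender (sketch)."""

    def __init__(self, n_items, emb_dim=64, hidden_dim=100):
        super().__init__()
        self.embed = nn.Embedding(n_items, emb_dim)
        self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_items)   # one score per catalog item

    def forward(self, item_ids):                    # item_ids: (batch, session_len)
        h, _ = self.gru(self.embed(item_ids))
        return self.out(h[:, -1])                   # scores for the next click

# Usage: rank items for a toy session of clicked item ids.
model = SessionGRU(n_items=1000)
scores = model(torch.tensor([[3, 17, 42]]))
top5 = scores.topk(5).indices
```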
This paper shows how Long Short-Term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time (sketched below).
Ranked #41 on Language Modelling on enwik8
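The "one data point at a time" idea is an autoregressive sampling loop: sample the next token from the model's output distribution, then feed it back in as the next input. Below is a minimal PyTorch sketch under assumed sizes; the character-level setup and dimensions are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

vocab_size, hidden = 50, 128                 # illustrative sizes
embed = nn.Embedding(vocab_size, 32)
lstm = nn.LSTM(32, hidden)
head = nn.Linear(hidden, vocab_size)

def sample(first_token, length=100):
    token = torch.tensor([first_token])
    state, out = None, [first_token]
    for _ in range(length - 1):
        h, state = lstm(embed(token).unsqueeze(0), state)   # one step at a time
        probs = torch.softmax(head(h[0]), dim=-1)
        token = torch.multinomial(probs, 1).squeeze(1)      # sample next input
        out.append(token.item())
    return out
```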
Recurrent Neural Networks (RNNs) have the ability to retain memory across time steps and learn from data sequences.
Recurrent neural networks (RNNs) are a class of neural networks used for sequential tasks.
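The "memory" both entries above refer to is the recurrent hidden state. A minimal NumPy sketch of the textbook (Elman) recurrence, with illustrative sizes: the state h is updated from each input in turn, so information from earlier inputs persists into later steps.

```python
import numpy as np

d_in, d_h = 8, 16
Wx = np.random.randn(d_h, d_in) * 0.1
Wh = np.random.randn(d_h, d_h) * 0.1
b = np.zeros(d_h)

def run_rnn(xs):                              # xs: (seq_len, d_in)
    h = np.zeros(d_h)                         # memory starts empty
    for x_t in xs:
        h = np.tanh(Wx @ x_t + Wh @ h + b)    # new state mixes input and memory
    return h                                  # summary of the whole sequence

h_final = run_rnn(np.random.randn(5, d_in))
```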
Spatiotemporal forecasting has various applications in the neuroscience, climate, and transportation domains.
Ranked #13 on Traffic Prediction on PEMS-BAY
The nonlinear autoregressive exogenous (NARX) model, which predicts the current value of a time series from its previous values as well as the current and past values of multiple driving (exogenous) series, has been studied for decades.
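Written out, that definition takes the following form; the notation here is assumed (d for the lag order, f for a learned nonlinear map such as an MLP or RNN), not quoted from the paper.

```latex
% NARX: next target value from d lags of the target y and of the exogenous series x.
y_t = f\left(y_{t-1}, \dots, y_{t-d},\; x_t, x_{t-1}, \dots, x_{t-d}\right) + \varepsilon_t
```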
Recurrent neural networks (RNNs) are powerful architectures for modeling sequential data, owing to their ability to learn short- and long-term dependencies between the elements of a sequence.
Recurrent neural networks are a powerful tool for modeling sequential data, but the dependence of each timestep's computation on the previous timestep's output limits parallelism and makes RNNs unwieldy for very long sequences, as the sketch below illustrates.
Ranked #15 on Machine Translation on IWSLT2015 German-English
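The sequential bottleneck described above is visible in code: each step needs the previous hidden state, so the time loop cannot be vectorized, whereas a timestep-independent map can process every position at once. A minimal NumPy illustration with assumed sizes; this contrasts a plain RNN with a generic parallel map, and is not the specific architecture this paper proposes.

```python
import numpy as np

seq_len, d = 1000, 64
xs = np.random.randn(seq_len, d)
Wx = np.random.randn(d, d) * 0.05
Wh = np.random.randn(d, d) * 0.05

# RNN: step t needs h from step t-1, so the time loop is inherently serial.
h = np.zeros(d)
for x_t in xs:
    h = np.tanh(Wx @ x_t + Wh @ h)

# Contrast: a timestep-independent map (as in convolutional or attention
# models) handles all positions at once with a single matrix product.
hs_parallel = np.tanh(xs @ Wx.T)
```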