
Full Resolution Image Compression with Recurrent Neural Networks

7 code implementations · CVPR 2017

As far as we know, this is the first neural network architecture able to outperform JPEG at image compression across most bitrates on the rate-distortion curve for the Kodak dataset, with and without entropy coding.

Image Compression

Adaptive Computation Time for Recurrent Neural Networks

5 code implementations · 29 Mar 2016

This paper introduces Adaptive Computation Time (ACT), an algorithm that allows recurrent neural networks to learn how many computational steps to take between receiving an input and emitting an output.

Language Modelling
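The core of ACT is a sigmoidal halting unit: the network keeps "pondering" the same input, accumulating halting probabilities until they exceed 1 − ε, and the emitted state is the probability-weighted mean of the intermediate states. A minimal numpy sketch of that loop, with hypothetical toy weights (`W`, `U`, `w_h` are illustrative, not the paper's parameterization):

```python
import numpy as np

def act_step(state, x, W, U, b, w_h, b_h, eps=0.01, max_steps=10):
    """ACT pondering loop for one input: update the RNN state repeatedly
    until the accumulated halting probability exceeds 1 - eps, then emit
    the probability-weighted mean of the intermediate states."""
    total_p = 0.0
    weighted_state = np.zeros_like(state)
    for n in range(max_steps):
        state = np.tanh(W @ x + U @ state + b)              # intermediate state
        p = float(1.0 / (1.0 + np.exp(-(w_h @ state + b_h))))  # halting unit
        if total_p + p >= 1.0 - eps or n == max_steps - 1:
            remainder = 1.0 - total_p                       # leftover probability mass
            weighted_state += remainder * state
            return weighted_state, n + 1                    # state and #ponder steps
        weighted_state += p * state
        total_p += p

# hypothetical toy weights, just to exercise the loop
rng = np.random.default_rng(0)
W, U, b = rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4)
w_h, b_h = rng.normal(size=4), 0.0
state, steps = act_step(np.zeros(4), rng.normal(size=3), W, U, b, w_h, b_h)
```

In the paper, a small penalty on the number of steps is added to the loss so the network learns to halt early when it can.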

Session-based Recommendations with Recurrent Neural Networks

25 code implementations · 21 Nov 2015

We apply recurrent neural networks (RNNs) to a new domain, namely recommender systems.

Session-Based Recommendations

Generating Sequences With Recurrent Neural Networks

57 code implementations · 4 Aug 2013

This paper shows how Long Short-Term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time.

Language Modelling · Text Generation
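The one-data-point-at-a-time scheme is plain autoregressive sampling: the network outputs a distribution over the next symbol, a symbol is sampled, and it is fed back as the next input. A sketch of that loop with a toy stand-in for the LSTM step (the random weights here are illustrative, not trained):

```python
import numpy as np

def sample_sequence(step_fn, h0, x0, length, rng):
    """Autoregressive generation: predict a distribution over the next
    symbol, sample it, and feed the sample back in as the next input."""
    h, x, out = h0, x0, []
    for _ in range(length):
        h, probs = step_fn(h, x)                   # one RNN step -> next-symbol distribution
        x = int(rng.choice(len(probs), p=probs))   # sample the next data point
        out.append(x)
    return out

# toy stand-in for a trained LSTM step: hypothetical fixed random weights
rng = np.random.default_rng(0)
V = rng.normal(size=(5, 8))   # hidden -> 5 symbol logits
W = rng.normal(size=(8, 8))   # hidden -> hidden
E = rng.normal(size=(8, 5))   # symbol embeddings

def toy_step(h, x):
    h = np.tanh(W @ h + E[:, x])
    logits = V @ h
    p = np.exp(logits - logits.max())
    return h, p / p.sum()                          # softmax over next symbols

seq = sample_sequence(toy_step, np.zeros(8), 0, 20, rng)
```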

Recurrent Neural Networks Hardware Implementation on FPGA

1 code implementation · 17 Nov 2015

Recurrent Neural Networks (RNNs) have the ability to retain memory and learn data sequences.

Language Modelling

Spectral Pruning for Recurrent Neural Networks

1 code implementation · 23 May 2021

Recurrent neural networks (RNNs) are a class of neural networks used for sequential tasks.

Edge-computing

A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction

14 code implementations · 7 Apr 2017

The nonlinear autoregressive exogenous (NARX) model, which predicts the current value of a time series from its previous values as well as the current and past values of multiple driving (exogenous) series, has been studied for decades.

Time Series · Time Series Prediction
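For concreteness, a NARX one-step prediction with a linear choice of f (the paper instead learns a nonlinear f with a dual-stage attention-based RNN; the coefficient names here are illustrative):

```python
import numpy as np

def narx_step(a, y_lags, b, x_lags):
    """One-step NARX prediction with linear f:
        y_t = sum_i a_i * y_{t-i} + sum_j b_j * x_{t-j}
    y_lags holds past target values; x_lags holds current and past
    values of the exogenous driving series."""
    return float(np.dot(a, y_lags) + np.dot(b, x_lags))

# toy check: 0.5*2.0 + 0.25*4.0 + 1.0*3.0 = 5.0
y_next = narx_step([0.5, 0.25], [2.0, 4.0], [1.0], [3.0])
```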

Quaternion Recurrent Neural Networks

3 code implementations · ICLR 2019

Recurrent neural networks (RNNs) are powerful architectures for modeling sequential data, due to their capability to learn short- and long-term dependencies between the basic elements of a sequence.

Automatic Speech Recognition (ASR) +1
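Quaternion recurrent networks replace real-valued matrix products with the Hamilton product, which ties the four components of each quaternion weight together (and is the source of the parameter savings). The product itself, as a sketch:

```python
def hamilton_product(p, q):
    """Hamilton product of quaternions p = (r, x, y, z) and q = (r, x, y, z):
    the non-commutative product that quaternion networks use in place of
    the usual real-valued matrix multiplication."""
    r1, x1, y1, z1 = p
    r2, x2, y2, z2 = q
    return (r1*r2 - x1*x2 - y1*y2 - z1*z2,
            r1*x2 + x1*r2 + y1*z2 - z1*y2,
            r1*y2 - x1*z2 + y1*r2 + z1*x2,
            r1*z2 + x1*y2 - y1*x2 + z1*r2)

ij = hamilton_product((0, 1, 0, 0), (0, 0, 1, 0))  # i * j = k
```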

Quasi-Recurrent Neural Networks

8 code implementations · 5 Nov 2016

Recurrent neural networks are a powerful tool for modeling sequential data, but the dependence of each timestep's computation on the previous timestep's output limits parallelism and makes RNNs unwieldy for very long sequences.

Language Modelling · Machine Translation +4
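QRNNs recover parallelism by producing candidate and gate sequences with convolutions over the input (computable in parallel across timesteps) and keeping only a cheap elementwise recurrence. A minimal sketch of the fo-pooling step, assuming Z, F, O have already been produced by those convolutions:

```python
import numpy as np

def fo_pool(Z, F, O):
    """QRNN-style fo-pooling: Z, F, O are (T, d) candidate, forget-gate,
    and output-gate sequences. Only this elementwise recurrence is
    sequential:
        c_t = f_t * c_{t-1} + (1 - f_t) * z_t
        h_t = o_t * c_t
    """
    c = np.zeros_like(Z[0])
    H = []
    for z, f, o in zip(Z, F, O):
        c = f * c + (1.0 - f) * z   # elementwise gated accumulation
        H.append(o * c)
    return np.stack(H)

# toy gate values (in a real QRNN these come from masked convolutions)
T, d = 4, 3
H = fo_pool(np.ones((T, d)), np.zeros((T, d)), np.ones((T, d)))
```

With the forget gates at zero, the cell simply copies each candidate through, which makes the recurrence easy to check by hand.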