Search Results for author: David Minnen

Found 18 papers, 8 papers with code

Finite Scalar Quantization: VQ-VAE Made Simple

3 code implementations 27 Sep 2023 Fabian Mentzer, David Minnen, Eirikur Agustsson, Michael Tschannen

Each dimension is quantized to a small set of fixed values, leading to an (implicit) codebook given by the product of these sets.

Colorization Depth Estimation +4
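
The quantizer described in the snippet is simple enough to sketch directly. Below is a minimal illustration of finite scalar quantization, not the authors' released code: each latent dimension is bounded and rounded to a small set of fixed values, and the implicit codebook is the Cartesian product of those per-dimension sets. The level configuration [7, 5, 5, 5] is only an example.

```python
import numpy as np

def fsq_quantize(z, levels):
    """Finite scalar quantization sketch (illustrative only).

    z:      array of shape (..., d) with d == len(levels).
    levels: number of fixed values per dimension, e.g. [7, 5, 5, 5].
    The implicit codebook is the product of the per-dimension level sets,
    so its size is prod(levels).
    """
    levels = np.asarray(levels)
    # Bound each dimension so rounding yields exactly L integer values per
    # dimension (for odd L; even level counts also need a half-step offset,
    # omitted here). Training additionally requires a straight-through
    # gradient estimator, which this NumPy sketch does not model.
    half = (levels - 1) / 2.0
    return np.round(np.tanh(z) * half)

# Example: a 4-dimensional latent with an implicit codebook of 7*5*5*5 = 875 codes.
print(fsq_quantize(np.random.randn(2, 4), [7, 5, 5, 5]))
```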

Advancing The Rate-Distortion-Computation Frontier For Neural Image Compression

no code implementations 26 Sep 2023 David Minnen, Nick Johnston

The rate-distortion performance of neural image compression models has exceeded the state-of-the-art for non-learned codecs, but neural codecs are still far from widespread deployment and adoption.

Benchmarking Image Compression

Multi-Realism Image Compression with a Conditional Generator

1 code implementation CVPR 2023 Eirikur Agustsson, David Minnen, George Toderici, Fabian Mentzer

By optimizing the rate-distortion-realism trade-off, generative compression approaches produce detailed, realistic images, even at low bit rates, instead of the blurry reconstructions produced by rate-distortion optimized models.

Image Compression Navigate
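
As a rough illustration of the trade-off being optimized, the sketch below adds a realism (adversarial) term to the usual rate and distortion terms. The weights and the realism penalty are placeholders, not the paper's actual loss or values.

```python
def rate_distortion_realism_loss(rate_bpp, distortion, realism_penalty,
                                 lam=0.01, beta=0.001):
    """Toy combined objective: rate + lam * distortion + beta * realism term.

    rate_bpp:        estimated bits per pixel from the entropy model.
    distortion:      e.g. mean squared error of the reconstruction.
    realism_penalty: e.g. a generator loss from an adversarially trained
                     discriminator (a stand-in, not the paper's formulation).
    lam, beta:       hypothetical weights; varying them moves along the
                     rate-distortion-realism trade-off.
    """
    return rate_bpp + lam * distortion + beta * realism_penalty
```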

VCT: A Video Compression Transformer

1 code implementation 15 Jun 2022 Fabian Mentzer, George Toderici, David Minnen, Sung-Jin Hwang, Sergi Caelles, Mario Lucic, Eirikur Agustsson

The resulting video compression transformer outperforms previous methods on standard video compression data sets.

motion prediction Video Compression

Neural Video Compression using GANs for Detail Synthesis and Propagation

no code implementations 26 Jul 2021 Fabian Mentzer, Eirikur Agustsson, Johannes Ballé, David Minnen, Nick Johnston, George Toderici

Our approach significantly outperforms previous neural and non-neural video compression methods in a user study, setting a new state-of-the-art in visual quality for neural methods.

Video Compression

Channel-wise Autoregressive Entropy Models for Learned Image Compression

2 code implementations 17 Jul 2020 David Minnen, Saurabh Singh

In learning-based approaches to image compression, codecs are developed by optimizing a computational model to minimize a rate-distortion objective.

Image Compression
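
The rate-distortion objective mentioned in the snippet is, in essence, a weighted sum of estimated code length and reconstruction error. A minimal sketch with a hypothetical trade-off weight and simple MSE distortion follows; real codecs also need a differentiable surrogate for quantization, omitted here.

```python
import numpy as np

def rate_distortion_loss(x, x_hat, likelihoods, num_pixels, lam=0.01):
    """Sketch of the rate-distortion objective (R + lam * D).

    likelihoods: probabilities the entropy model assigns to each quantized
                 latent symbol; their negative log gives the code length.
    lam:         hypothetical trade-off weight.
    """
    bpp = -np.sum(np.log2(likelihoods)) / num_pixels  # estimated rate (bits/pixel)
    mse = np.mean((x - x_hat) ** 2)                   # distortion
    return bpp + lam * mse
```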

Joint Autoregressive and Hierarchical Priors for Learned Image Compression

3 code implementations NeurIPS 2018 David Minnen, Johannes Ballé, George Toderici

While it is well known that autoregressive models come with a significant computational penalty, we find that in terms of compression performance, autoregressive and hierarchical priors are complementary and, together, exploit the probabilistic structure in the latents better than all previous learned models.

Image Compression MS-SSIM +1
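
The complementarity comes from combining two sources of information when predicting each latent's distribution: side information decoded from a hyperprior, and an autoregressive context computed from latents that have already been decoded. The sketch below is a schematic of that fusion with placeholder shapes and a stand-in network, not the paper's architecture.

```python
import numpy as np

def predict_entropy_parameters(hyper_features, causal_context, fuse):
    """Toy fusion of a hierarchical (hyperprior) branch and an autoregressive
    context branch into per-latent entropy parameters.

    hyper_features: features decoded from the hyperprior (available before
                    any latents are decoded).
    causal_context: features from latents already decoded at this position
                    (e.g. via a masked convolution in a real model).
    fuse:           a small learned network; here just a stand-in callable.
    Returns (mean, scale) of e.g. a Gaussian conditional entropy model.
    """
    joint = np.concatenate([hyper_features, causal_context], axis=-1)
    mean, log_scale = np.split(fuse(joint), 2, axis=-1)
    return mean, np.exp(log_scale)

# Hypothetical usage with a random linear map standing in for the network.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 2))  # 4 hyper + 4 context features -> (mean, log_scale)
mean, scale = predict_entropy_parameters(rng.normal(size=(16, 4)),
                                         rng.normal(size=(16, 4)),
                                         lambda v: v @ W)
```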

Towards a Semantic Perceptual Image Metric

no code implementations 1 Aug 2018 Troy Chinen, Johannes Ballé, Chunhui Gu, Sung Jin Hwang, Sergey Ioffe, Nick Johnston, Thomas Leung, David Minnen, Sean O'Malley, Charles Rosenberg, George Toderici

We present a full reference, perceptual image metric based on VGG-16, an artificial neural network trained on object classification.

Image Quality Assessment
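
The general recipe for a full-reference metric built on a pretrained classifier, not necessarily this paper's exact formulation, is to compare intermediate feature maps of the reference and distorted images. A minimal sketch assuming a recent torchvision and an arbitrary choice of layers:

```python
import torch
from torchvision.models import vgg16, VGG16_Weights

def vgg_feature_distance(img_a, img_b, layer_ids=(3, 8, 15, 22)):
    """Toy full-reference perceptual distance from VGG-16 feature maps.

    img_a, img_b: tensors of shape (N, 3, H, W), normalized as the
                  pretrained network expects.
    layer_ids:    indices into vgg16().features at which activations are
                  compared (an arbitrary choice for illustration).
    """
    features = vgg16(weights=VGG16_Weights.IMAGENET1K_V1).features.eval()
    dist, x, y = 0.0, img_a, img_b
    with torch.no_grad():
        for i, layer in enumerate(features):
            x, y = layer(x), layer(y)
            if i in layer_ids:
                # Accumulate the mean squared feature difference at this layer.
                dist = dist + torch.mean((x - y) ** 2)
            if i >= max(layer_ids):
                break
    return dist
```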

Image-Dependent Local Entropy Models for Learned Image Compression

no code implementations 31 May 2018 David Minnen, George Toderici, Saurabh Singh, Sung Jin Hwang, Michele Covell

The leading approach for image compression with artificial neural networks (ANNs) is to learn a nonlinear transform and a fixed entropy model that are optimized for rate-distortion performance.

Image Compression
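
The "nonlinear transform plus entropy model" approach referenced here follows a transform-coding pattern: analysis transform, quantization, entropy coding under a learned prior, synthesis transform. A schematic sketch with placeholder callables:

```python
import numpy as np

def compress_decompress(x, analysis, synthesis, entropy_model):
    """Schematic transform-coding pipeline behind most learned image codecs.

    analysis, synthesis: callables standing in for the learned nonlinear
                         transforms (encoder / decoder networks).
    entropy_model:       assigns probabilities to quantized latents; a real
                         codec feeds these to an arithmetic coder.
    """
    y = analysis(x)                                # nonlinear analysis transform
    y_hat = np.round(y)                            # quantization
    bits = -np.sum(np.log2(entropy_model(y_hat)))  # estimated code length
    x_hat = synthesis(y_hat)                       # nonlinear synthesis transform
    return x_hat, bits
```

In a learned codec, `analysis`, `synthesis`, and `entropy_model` are neural networks trained end to end against a rate-distortion objective like the one sketched earlier.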

Spatially adaptive image compression using a tiled deep network

no code implementations 7 Feb 2018 David Minnen, George Toderici, Michele Covell, Troy Chinen, Nick Johnston, Joel Shor, Sung Jin Hwang, Damien Vincent, Saurabh Singh

Deep neural networks represent a powerful class of function approximators that can learn to compress and reconstruct images.

Image Compression

Target-Quality Image Compression with Recurrent, Convolutional Neural Networks

no code implementations 18 May 2017 Michele Covell, Nick Johnston, David Minnen, Sung Jin Hwang, Joel Shor, Saurabh Singh, Damien Vincent, George Toderici

Our methods introduce multi-pass training to combine the goals of high-quality reconstruction both in areas around stop-code masking and in highly detailed areas.

Image Compression

Improved Lossy Image Compression with Priming and Spatially Adaptive Bit Rates for Recurrent Networks

no code implementations CVPR 2018 Nick Johnston, Damien Vincent, David Minnen, Michele Covell, Saurabh Singh, Troy Chinen, Sung Jin Hwang, Joel Shor, George Toderici

We propose a method for lossy image compression based on recurrent, convolutional neural networks that outperforms BPG (4:2:0), WebP, JPEG2000, and JPEG as measured by MS-SSIM.

Image Compression MS-SSIM +1

Full Resolution Image Compression with Recurrent Neural Networks

7 code implementations CVPR 2017 George Toderici, Damien Vincent, Nick Johnston, Sung Jin Hwang, David Minnen, Joel Shor, Michele Covell

As far as we know, this is the first neural network architecture able to outperform JPEG at image compression across most bitrates on the rate-distortion curve for the Kodak dataset, with and without the aid of entropy coding.

Image Compression
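
For context on how these recurrent codecs produce a range of bitrates, the common pattern, sketched below and not the paper's exact architecture, is to run several encode/decode iterations: each iteration transmits a small code and additively refines the reconstruction, so stopping earlier spends fewer bits.

```python
import numpy as np

def progressive_compress(x, encoder_step, decoder_step, num_iterations=8):
    """Sketch of iterative (progressive) compression with a recurrent codec.

    encoder_step: maps the current residual to a small code for this pass.
    decoder_step: maps that code to an additive update of the reconstruction.
    Each extra iteration spends more bits and shrinks the residual, so the
    bitrate is controlled by how many iterations are run.
    """
    x_hat = np.zeros_like(x)
    codes = []
    for _ in range(num_iterations):
        residual = x - x_hat             # what the decoder still gets wrong
        code = encoder_step(residual)    # bits transmitted this iteration
        x_hat = x_hat + decoder_step(code)
        codes.append(code)
    return x_hat, codes
```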

Variable Rate Image Compression with Recurrent Neural Networks

1 code implementation 19 Nov 2015 George Toderici, Sean M. O'Malley, Sung Jin Hwang, Damien Vincent, David Minnen, Shumeet Baluja, Michele Covell, Rahul Sukthankar

A large fraction of Internet traffic is now driven by requests from mobile devices with relatively small screens and often stringent bandwidth requirements.

Image Compression Image Reconstruction
