no code implementations • 1 Apr 2024 • Yassine Hamdi, Aaron B. Wagner, Deniz Gündüz
The per-symbol near-perfect realism constraint requires that the total variation distance (TVD) between the distribution of each output symbol $Y_t$ and the source distribution be arbitrarily small, uniformly in the index $t$. We characterize the corresponding asymptotic rate-distortion trade-off and show that encoder private randomness is not useful when the compression rate is below the entropy of the source, however limited the common randomness and decoder private randomness may be.
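As a quick illustration of the quantity being constrained, the following sketch computes the TVD between two probability mass functions on a common finite alphabet; the specific pmfs and the tolerance are hypothetical, chosen only to show what "within $\epsilon$ of the source distribution" means for one output symbol $Y_t$.

```python
import numpy as np

def tvd(p, q):
    """Total variation distance between two pmfs on the same alphabet:
    TVD(p, q) = (1/2) * sum_x |p(x) - q(x)|."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return 0.5 * np.abs(p - q).sum()

# Illustrative check of per-symbol near-perfect realism: the distribution
# of a given output symbol Y_t must be within eps of the source pmf.
source_pmf = np.array([0.5, 0.3, 0.2])
output_pmf = np.array([0.49, 0.31, 0.2])  # hypothetical decoder output law
eps = 0.05
assert tvd(source_pmf, output_pmf) <= eps
```

The constraint in the paper demands this uniformly over all indices $t$, with $\epsilon$ taken arbitrarily small in the asymptotic limit.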
no code implementations • 5 Oct 2023 • Yang Qiu, Aaron B. Wagner, Johannes Ballé, Lucas Theis
We introduce a distortion measure for images, Wasserstein distortion, that simultaneously generalizes pixel-level fidelity on the one hand and realism or perceptual quality on the other.
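For orientation, here is the classical empirical Wasserstein-1 distance in one dimension, which for equal-size sorted samples reduces to a mean of coordinate-wise gaps. This is only the textbook building block, not the paper's Wasserstein distortion, which combines such optimal-transport comparisons of local feature statistics with pixel-level fidelity.

```python
import numpy as np

def wasserstein1_empirical(x, y):
    """Empirical 1-D Wasserstein-1 distance between two equal-size samples.
    After sorting, the optimal coupling matches order statistics, so
    W1 = (1/n) * sum_i |x_(i) - y_(i)|."""
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    assert x.shape == y.shape, "equal sample sizes assumed in this sketch"
    return np.abs(x - y).mean()
```

Comparing distributions of local features (rather than raw pixels) is what lets a measure of this kind reward realistic texture even when it is not pixel-aligned with the reference.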
no code implementations • 17 May 2022 • Sourbh Bhadane, Aaron B. Wagner, Johannes Ballé
Artificial Neural-Network-based (ANN-based) lossy compressors have recently obtained striking results on several sources.
no code implementations • 10 Feb 2022 • Sourbh Bhadane, Aaron B. Wagner
We consider the one-bit quantizer that minimizes the mean squared error for a source living in a real Hilbert space.
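In the scalar special case of this problem, the structure is easy to see: for a symmetric source such as the standard Gaussian, the MSE-optimal one-bit quantizer thresholds at zero and reconstructs each half-line to its conditional mean. The sketch below verifies this numerically for a Gaussian sample; the Hilbert-space setting of the paper generalizes well beyond this scalar example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # zero-mean unit-variance Gaussian source

# One-bit quantizer: threshold at 0, reconstruct each side to its
# conditional mean. For N(0,1), E[X | X >= 0] = sqrt(2/pi).
a = x[x >= 0].mean()
xhat = np.where(x >= 0, a, -a)

# Resulting distortion approaches 1 - 2/pi (about 0.363) for N(0,1).
mse = np.mean((x - xhat) ** 2)
```

The distortion $1 - 2/\pi$ is the classical Lloyd-Max result for one-bit quantization of a standard Gaussian.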
no code implementations • 8 Feb 2022 • Aaron B. Wagner
A rate-distortion-perception (RDP) tradeoff has recently been proposed by Blau and Michaeli, and separately by Matsumoto.
1 code implementation • 5 Jun 2021 • Sourbh Bhadane, Aaron B. Wagner, Jayadev Acharya
As one application, we consider a strictly Schur-concave constraint that estimates the number of bits needed to represent the latent variables under fixed-rate encoding, a setup that we call \emph{Principal Bit Analysis (PBA)}.
no code implementations • ICLR Workshop Neural_Compression 2021 • Lucas Theis, Aaron B. Wagner
The rate-distortion-perception function (RDPF; Blau and Michaeli, 2019) has emerged as a useful tool for thinking about realism and distortion of reconstructions in lossy compression.