1 code implementation • 16 May 2023 • Daniel Severo, James Townsend, Ashish Khisti, Alireza Makhzani
We present a one-shot method for compressing large labeled graphs called Random Edge Coding.
1 code implementation • 2 Nov 2022 • James Townsend, Jan-Willem van de Meent
Agda supports formal verification of program properties, and the compiler for our reversible language (which is implemented as an Agda macro) produces not just an encoder/decoder pair of functions but also a proof that they are inverse to one another.
2 code implementations • 13 Jan 2022 • Mingtian Zhang, James Townsend, Ning Kang, David Barber
The recently proposed Neural Local Lossless Compression (NeLLoC), which is based on a local autoregressive model, has achieved state-of-the-art (SOTA) out-of-distribution (OOD) generalization performance in the image compression task.
1 code implementation • 30 Nov 2021 • Julius Kunze, James Townsend, David Barber
We propose a new, more general approach to the design of stochastic gradient-based optimization methods for machine learning.
1 code implementation • 15 Jul 2021 • Daniel Severo, James Townsend, Ashish Khisti, Alireza Makhzani, Karen Ullrich
Current methods which compress multisets at an optimal rate have computational complexity that scales linearly with alphabet size, making them too slow to be practical in many real-world settings.
1 code implementation • 21 Apr 2021 • James Townsend
We develop a simple and elegant method for lossless compression using latent variable models, which we call 'bits back with asymmetric numeral systems' (BB-ANS).
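The bits-back recipe can be sketched in a few lines of Python. This is a toy illustration, not the paper's implementation: the model probabilities, the total frequency mass `M = 16`, and the initial state are all invented for the demo. It uses an exact big-integer rANS push/pop pair, and shows the three BB-ANS steps: pop a latent z from the existing compressed data using q(z|x) (the "bits back"), push x under p(x|z), then push z under p(z). Note the scheme needs some pre-existing bits on the stack for the initial pop.

```python
# Toy BB-ANS demo for a model with latent z and observation x, both in {0, 1}.
# All frequency tables are hypothetical and sum to M.

M = 16  # total frequency mass shared by all toy distributions

def push(state, s, freq, cdf):
    # rANS encode step: map (state, symbol) to a single larger state.
    return (state // freq[s]) * M + cdf[s] + state % freq[s]

def pop(state, freq, cdf):
    # rANS decode step: exact inverse of push; recovers symbol and prior state.
    slot = state % M
    s = next(k for k in range(len(freq)) if cdf[k] <= slot < cdf[k] + freq[k])
    return s, freq[s] * (state // M) + slot - cdf[s]

prior = ([8, 8], [0, 8])            # p(z): (freq, cdf)
lik = {0: ([12, 4], [0, 12]),       # p(x | z=0)
       1: ([4, 12], [0, 4])}        # p(x | z=1)
post = {0: ([10, 6], [0, 10]),      # q(z | x=0)
        1: ([6, 10], [0, 6])}       # q(z | x=1)

def bb_ans_encode(state, x):
    z, state = pop(state, *post[x])   # get bits back: 'sample' z from the stack
    state = push(state, x, *lik[z])   # encode x with the likelihood
    return push(state, z, *prior)     # encode z with the prior

def bb_ans_decode(state):
    z, state = pop(state, *prior)
    x, state = pop(state, *lik[z])
    return x, push(state, z, *post[x])  # return the borrowed bits

state = 123456789                     # stands in for pre-existing compressed data
enc = bb_ans_encode(state, 1)
x, rec = bb_ans_decode(enc)
assert (x, rec) == (1, 123456789)     # x recovered, borrowed bits restored
```

Because push and pop are inverse bijections on the integer state, decoding recovers both the symbol and every borrowed bit exactly.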
1 code implementation • ICLR Workshop Neural_Compression 2021 • James Townsend, Iain Murray
We generalize the 'bits back with ANS' method to time-series models with a latent Markov structure.
1 code implementation • ICLR Workshop Neural_Compression 2021 • Yangjun Ruan, Karen Ullrich, Daniel Severo, James Townsend, Ashish Khisti, Arnaud Doucet, Alireza Makhzani, Chris J. Maddison
Naively applied, our schemes would require more initial bits than the standard bits-back coder, but we show how to drastically reduce this additional cost with couplings in the latent space.
1 code implementation • 24 Jan 2020 • James Townsend
This paper is intended to be an accessible introduction to the range variant of Asymmetric Numeral Systems (rANS).
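The core rANS update can be stated very compactly. The sketch below works over unbounded Python integers and omits the stream renormalization used in practical coders; the three-symbol frequency table is a toy example, not from the tutorial. Encoding folds each symbol into a single integer state; decoding reads the state modulo the total mass `M` to recover the symbol and invert the update.

```python
# freq[s]: symbol counts; cdf[s] = freq[0] + ... + freq[s-1]; M = sum(freq).
def rans_encode(symbols, freq, cdf, M):
    state = 1  # arbitrary small initial state
    for s in reversed(symbols):  # ANS is a stack: encode in reverse order
        state = (state // freq[s]) * M + cdf[s] + state % freq[s]
    return state

def rans_decode(state, n, freq, cdf, M):
    out = []
    for _ in range(n):
        slot = state % M  # locate the symbol's sub-interval of [0, M)
        s = next(k for k in range(len(freq)) if cdf[k] <= slot < cdf[k] + freq[k])
        state = freq[s] * (state // M) + slot - cdf[s]  # invert the encode step
        out.append(s)
    return out

freq = [3, 3, 2]                 # toy model over alphabet {0, 1, 2}, M = 8
cdf = [0, 3, 6]
msg = [0, 2, 1, 1, 0, 2]
code = rans_encode(msg, freq, cdf, 8)
assert rans_decode(code, len(msg), freq, cdf, 8) == msg
```

The encode and decode steps are mutually inverse bijections between states and (state, symbol) pairs, which is what makes the scheme lossless.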
1 code implementation • ICLR 2020 • James Townsend, Thomas Bird, Julius Kunze, David Barber
We make the following striking observation: fully convolutional VAE models trained on 32x32 ImageNet can generalize well, not just to 64x64 but also to far larger photographs, with no changes to the model.
2 code implementations • ICLR 2019 • James Townsend, Tom Bird, David Barber
Deep latent variable models have seen recent success in many data domains.
1 code implementation • 10 Mar 2016 • James Townsend, Niklas Koep, Sebastian Weichwald
Optimization on manifolds is a class of methods for optimization of an objective function, subject to constraints which are smooth, in the sense that the set of points which satisfy the constraints admits the structure of a differentiable manifold.
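The idea can be illustrated with the simplest manifold, the unit sphere. The sketch below is a generic Riemannian gradient descent in plain NumPy (it does not use the Pymanopt API): project the Euclidean gradient onto the tangent space, take a step, and retract back onto the manifold by normalizing. The example problem, minimizing the Rayleigh quotient x^T A x on the sphere, is an illustrative choice whose minimizer is the eigenvector of the smallest eigenvalue.

```python
import numpy as np

def sphere_gradient_descent(euclid_grad, x0, step=0.1, iters=500):
    """Gradient descent constrained to the unit sphere.

    Each iteration projects the Euclidean gradient onto the tangent
    space at x, steps, then retracts onto the sphere by normalizing.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = euclid_grad(x)
        rg = g - (g @ x) * x          # tangent-space projection
        x = x - step * rg
        x = x / np.linalg.norm(x)     # retraction back onto the manifold
    return x

A = np.diag([3.0, 2.0, 1.0])
# Minimizing x^T A x on the sphere: gradient is 2 A x.
x = sphere_gradient_descent(lambda v: 2 * A @ v, np.array([1.0, 1.0, 1.0]))
# x converges to the eigenvector of the smallest eigenvalue, here e_3.
```

Libraries such as Pymanopt package this pattern (projections, retractions, and more sophisticated solvers) for many manifolds beyond the sphere.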
no code implementations • 25 Feb 2015 • Mark Herbster, Paul Rubenstein, James Townsend
Given a set $X$ and a function $h:X\longrightarrow\{0, 1\}$ which labels each element of $X$ with either $0$ or $1$, we may define a function $h^{(s)}$ to measure the similarity of pairs of points in $X$ according to $h$.
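One natural instantiation of such a pairwise function, offered here purely as an illustration since the abstract snippet truncates the actual definition, is the agreement indicator: two points are similar exactly when the labelling assigns them the same value.

```python
def similarity(h, x, y):
    """Hypothetical pairwise similarity induced by a binary labelling h:
    returns 1 when h assigns x and y the same label, 0 otherwise.
    (An illustrative choice; not necessarily the paper's definition.)"""
    return 1 if h(x) == h(y) else 0

h = lambda n: n % 2  # toy labelling of the integers by parity
assert similarity(h, 2, 4) == 1  # both labelled 0
assert similarity(h, 2, 3) == 0  # labels differ
```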