Glow: Generative Flow with Invertible 1x1 Convolutions

NeurIPS 2018 · Diederik P. Kingma, Prafulla Dhariwal

Flow-based generative models (Dinh et al., 2014) are conceptually attractive due to tractability of the exact log-likelihood, tractability of exact latent-variable inference, and parallelizability of both training and synthesis. In this paper we propose Glow, a simple type of generative flow using an invertible 1x1 convolution. Using our method we demonstrate a significant improvement in log-likelihood on standard benchmarks. Perhaps most strikingly, we demonstrate that a generative model optimized towards the plain log-likelihood objective is capable of efficient realistic-looking synthesis and manipulation of large images. The code for our model is available at https://github.com/openai/glow
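The abstract's core idea, the invertible 1x1 convolution, can be sketched in a few lines: a 1x1 convolution over c channels multiplies every pixel's channel vector by the same c x c matrix W, and its change-of-variables contribution to the log-likelihood is h * w * log|det W|. This NumPy sketch is illustrative only, not the paper's implementation (which initializes W as a random rotation and, in one variant, parameterizes it via an LU decomposition for cheaper determinants); the function names are made up here.

```python
import numpy as np

def invertible_1x1_conv(x, W):
    """Forward pass: x has shape (h, w, c); W is an invertible (c, c) matrix."""
    h, w, c = x.shape
    z = x @ W.T                               # apply W at every spatial position
    logdet = h * w * np.linalg.slogdet(W)[1]  # change-of-variables term
    return z, logdet

def invertible_1x1_conv_inverse(z, W):
    """Inverse pass recovers x exactly: x = W^{-1} z at every position."""
    return z @ np.linalg.inv(W).T

# Round-trip check on random data
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8, 4))
W = np.linalg.qr(rng.standard_normal((4, 4)))[0]  # random rotation initialization
z, logdet = invertible_1x1_conv(x, W)
print(np.allclose(x, invertible_1x1_conv_inverse(z, W)))  # True
```

Because W here is initialized as a rotation (orthogonal matrix), log|det W| starts at exactly zero; during training W is free to drift away from orthogonality.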

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Image Generation | CelebA 256x256 | Glow (Kingma and Dhariwal, 2018) | bits/dim | 1.03 | #10 |
| Image Generation | CIFAR-10 | Glow (Kingma and Dhariwal, 2018) | bits/dim | 3.35 | #52 |
| Density Estimation | ImageNet 32x32 | Glow | NLL (bits/dim) | 4.09 | #4 |
| Image Generation | ImageNet 32x32 | Glow (Kingma and Dhariwal, 2018) | bits/dim | 4.09 | #21 |
| Image Generation | ImageNet 64x64 | Glow (Kingma and Dhariwal, 2018) | bits/dim | 3.81 | #25 |
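The bits/dim (bpd) metric used above is the model's negative log-likelihood in nats divided by the number of data dimensions and by ln 2. A minimal sketch; the NLL value below is made up for illustration, chosen so the result lands near Glow's reported 4.09 bpd on ImageNet 32x32:

```python
import math

def bits_per_dim(nll_nats, num_dims):
    """Convert a per-image negative log-likelihood in nats to bits per dimension."""
    return nll_nats / (num_dims * math.log(2))

# A 32x32 RGB image has 32 * 32 * 3 = 3072 dimensions
print(round(bits_per_dim(8710.0, 32 * 32 * 3), 2))  # 4.09
```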

Results from Other Papers


| Task | Dataset | Model | Metric | Value | Rank |
|---|---|---|---|---|---|
| Image Generation | CelebA-HQ 256x256 | Glow | FID | 68.93 | #14 |
