VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models

ICLR 2021 · Zhisheng Xiao, Karsten Kreis, Jan Kautz, Arash Vahdat

Energy-based models (EBMs) have recently been successful in representing complex distributions of small images. However, sampling from them requires expensive Markov chain Monte Carlo (MCMC) iterations that mix slowly in high-dimensional pixel space. Unlike EBMs, variational autoencoders (VAEs) generate samples quickly and are equipped with a latent space that enables fast traversal of the data manifold. However, VAEs tend to assign high probability density to regions in data space outside the actual data distribution and often fail at generating sharp images. In this paper, we propose VAEBM, a symbiotic composition of a VAE and an EBM that offers the best of both worlds. VAEBM captures the overall mode structure of the data distribution using a state-of-the-art VAE, and it relies on its EBM component to explicitly exclude non-data-like regions from the model and refine the image samples. Moreover, the VAE component in VAEBM allows us to speed up MCMC updates by reparameterizing them in the VAE's latent space. Our experimental results show that VAEBM outperforms state-of-the-art VAEs and EBMs in generative quality on several benchmark image datasets by a large margin. It can generate high-quality images as large as 256$\times$256 pixels with short MCMC chains. We also demonstrate that VAEBM provides complete mode coverage and performs well in out-of-distribution detection. The source code is available at https://github.com/NVlabs/VAEBM
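The abstract's key trick, running the MCMC updates in the VAE's latent space instead of pixel space, can be illustrated with a toy sketch. This is not the paper's implementation: the `decoder` and `energy` functions below are illustrative stand-ins (the real VAEBM uses a NVAE decoder and a learned energy network, and samples a joint distribution over all reparameterized latent variables), and the finite-difference gradient replaces autograd for self-containment.

```python
import numpy as np

rng = np.random.default_rng(0)

def decoder(z):
    # Toy stand-in for the VAE decoder mapping latent z to data space x.
    W = np.array([[1.0, 0.5], [-0.5, 1.0], [0.3, -0.7]])
    return W @ z

def energy(x):
    # Toy stand-in for the EBM energy E(x); low energy near x = 0.
    return 0.5 * np.sum(x ** 2)

def grad_energy_wrt_z(z, eps=1e-4):
    # Finite-difference gradient of E(decoder(z)) with respect to z.
    # The latent space is low-dimensional, so chains mix faster than in pixels.
    g = np.zeros_like(z)
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        g[i] = (energy(decoder(z + dz)) - energy(decoder(z - dz))) / (2 * eps)
    return g

def langevin_in_latent(z0, steps=50, step_size=0.1):
    # Short-chain Langevin dynamics in latent space:
    #   z <- z - (s/2) * grad_z E(decoder(z)) + sqrt(s) * noise
    z = z0.copy()
    for _ in range(steps):
        noise = rng.normal(size=z.shape)
        z = z - 0.5 * step_size * grad_energy_wrt_z(z) + np.sqrt(step_size) * noise
    # Decode the final latent to obtain the refined data-space sample.
    return decoder(z)

x = langevin_in_latent(rng.normal(size=2))
print(x.shape)
```

The refinement intuition: the VAE proposes an initial latent, and a handful of Langevin steps guided by the energy push the decoded sample away from non-data-like regions.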


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Image Generation | CelebA-HQ 256x256 | VAEBM | FID | 20.38 | # 13 |
| Image Generation | CelebA-HQ 64x64 | VAEBM | FID | 5.31 | # 2 |
| Image Generation | CIFAR-10 | VAEBM w/ persistent chain | Inception score | 8.43 | # 46 |
| Image Generation | CIFAR-10 | VAEBM w/ persistent chain | FID | 12.19 | # 100 |
| Image Generation | CIFAR-10 | VAEBM w/o persistent chain | Inception score | 8.21 | # 54 |
| Image Generation | CIFAR-10 | VAEBM w/o persistent chain | FID | 12.26 | # 101 |
| Image Generation | Stacked MNIST | VAEBM | FID | 12.96 | # 1 |
| Image Generation | Stacked MNIST | VAEBM | Inception score | 8.15 | # 1 |

Methods


EBM • VAE