Learning From Unpaired Data: A Variational Bayes Approach

29 Sep 2021  ·  Dihan Zheng, Xiaowen Zhang, Kaisheng Ma, Chenglong Bao

Collecting paired training data is difficult in practice, whereas unpaired samples are widely available. Thus, current approaches aim to generate synthetic training data from unpaired samples by exploiting the relationship between corrupted and clean data. In this work, we propose LUD-VAE, a method that learns the joint probability density function from data sampled from the marginal distributions. Our method is based on the variational inference framework and maximizes the evidence lower bound (ELBO), a lower bound on the joint probability density. Furthermore, we show that the ELBO is computable without paired samples under the inference invariant assumption. This property provides the mathematical rationale for our approach in the unpaired setting. Finally, we apply our method to real-world image denoising and super-resolution tasks and train the models using the synthetic data generated by LUD-VAE. Experimental results on four datasets validate the advantages of our method over other learnable approaches.
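As a rough sketch of the objective described above (the exact decomposition used in the paper may differ), the standard variational lower bound on the joint density, for a latent variable $z$ with prior $p(z)$ and approximate posterior $q(z \mid x, y)$, reads

$$\log p(x, y) \;\ge\; \mathbb{E}_{q(z \mid x, y)}\big[\log p(x, y \mid z)\big] \;-\; \mathrm{KL}\big(q(z \mid x, y)\,\|\,p(z)\big) \;=\; \mathrm{ELBO}(x, y).$$

In this generic form the bound still involves paired samples $(x, y)$; as stated in the abstract, it is the inference invariant assumption that makes the ELBO computable from unpaired data alone.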
