Search Results for author: Mingxuan Yi

Found 5 papers, 2 papers with code

Bridging the Gap Between Variational Inference and Wasserstein Gradient Flows

1 code implementation • 31 Oct 2023 • Mingxuan Yi, Song Liu

Variational inference is a technique that approximates a target distribution by optimizing within the parameter space of variational families.

Variational Inference
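
The snippet above describes plain parameter-space variational inference. For reference only, the sketch below shows that setting in its most generic form: fitting a Gaussian variational family to an unnormalized 1-D target by stochastic optimization of a Monte Carlo ELBO with the reparameterization trick. It is not the paper's Wasserstein-gradient-flow construction; the target density, sample size and optimizer settings are arbitrary placeholders.

```python
import math
import torch

# Minimal sketch of parameter-space variational inference (generic setup,
# not the paper's method): fit q(x) = N(mu, sigma^2) to an unnormalized
# bimodal target by maximizing a Monte Carlo ELBO.

def log_target(x):
    # unnormalized log density of an illustrative two-mode target
    return torch.logsumexp(
        torch.stack([-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2]), dim=0
    )

mu = torch.zeros(1, requires_grad=True)
log_sigma = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([mu, log_sigma], lr=1e-2)

for step in range(2000):
    eps = torch.randn(256)
    x = mu + torch.exp(log_sigma) * eps                # reparameterized samples from q
    log_q = -0.5 * eps ** 2 - log_sigma - 0.5 * math.log(2 * math.pi)
    elbo = (log_target(x) - log_q).mean()              # maximizing ELBO = minimizing KL(q || target)
    opt.zero_grad()
    (-elbo).backward()
    opt.step()

print(float(mu), float(torch.exp(log_sigma)))          # fitted variational parameters
```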

Minimizing $f$-Divergences by Interpolating Velocity Fields

1 code implementation • 24 May 2023 • Song Liu, Jiahao Yu, Jack Simons, Mingxuan Yi, Mark Beaumont

To perform such movements, we need to calculate the corresponding velocity fields, which involve a density ratio function between the two distributions.

Domain Adaptation • Imputation
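
For orientation, here is a rough sketch of the density-ratio-driven particle update the snippet above alludes to. It uses the standard trick of estimating the log density ratio with a binary classifier and taking the gradient of its logit as the plain KL-flow velocity field v(x) = ∇ log(p(x)/q(x)); it is a generic baseline, not the velocity-field interpolation proposed in the paper, and the data, network and step sizes are placeholders.

```python
import torch

# Generic sketch: move particles toward a target distribution by following a
# velocity field built from a classifier-based log density-ratio estimate.

target = torch.randn(512, 2) + torch.tensor([3.0, 0.0])   # samples from the target p
particles = torch.randn(512, 2)                           # particles following q

net = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
bce = torch.nn.BCEWithLogitsLoss()
labels = torch.cat([torch.ones(512, 1), torch.zeros(512, 1)])

for outer in range(50):
    # re-fit the classifier; with equal sample sizes its logit approximates log p(x) - log q(x)
    for _ in range(100):
        loss = bce(torch.cat([net(target), net(particles)]), labels)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # move each particle a small step along the estimated velocity field
    particles.requires_grad_(True)
    (grad,) = torch.autograd.grad(net(particles).sum(), particles)
    particles = (particles + 0.1 * grad).detach()

print(particles.mean(dim=0))   # drifts toward the target mean, roughly [3, 0]
```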

MonoFlow: Rethinking Divergence GANs via the Perspective of Wasserstein Gradient Flows

no code implementations • 2 Feb 2023 • Mingxuan Yi, Zhanxing Zhu, Song Liu

The conventional understanding of adversarial training in generative adversarial networks (GANs) is that the discriminator is trained to estimate a divergence, and the generator learns to minimize this divergence.
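
The sketch below spells out that conventional divergence-GAN reading in minimal form: the discriminator's binary cross-entropy objective, at its optimum, estimates a divergence (related to Jensen-Shannon) between the data and generator distributions, and the generator is updated to reduce it. This is the textbook non-saturating loop, not the MonoFlow reformulation; the 1-D data, network sizes and learning rates are placeholders.

```python
import torch

# Textbook non-saturating GAN loop on a toy 1-D data distribution.

D = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1))
G = torch.nn.Sequential(torch.nn.Linear(4, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1))
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
bce = torch.nn.BCEWithLogitsLoss()
ones, zeros = torch.ones(64, 1), torch.zeros(64, 1)

for step in range(2000):
    real = 0.5 * torch.randn(64, 1) + 2.0      # samples from the data distribution
    fake = G(torch.randn(64, 4))               # samples from the generator

    # discriminator step: its objective value estimates a divergence
    d_loss = bce(D(real), ones) + bce(D(fake.detach()), zeros)
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # generator step: non-saturating loss, pushes generated samples toward the data
    g_loss = bce(D(fake), ones)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

print(fake.mean().item())                      # should approach the data mean (~2.0)
```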

Sliced Wasserstein Variational Inference

no code implementations • AABI (Approximate Inference) Symposium 2022 • Mingxuan Yi, Song Liu

For example, it is not a proper metric, i.e., it is not symmetric and does not satisfy the triangle inequality.

Variational Inference
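
The "not a proper metric" remark refers to the KL divergence minimized in standard variational inference. A tiny numeric check of the asymmetry, with two arbitrary discrete distributions, is given below.

```python
import numpy as np

# Swapping the arguments of the KL divergence changes its value,
# so it is not symmetric and hence not a metric.

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.3, 0.3, 0.4])

kl_pq = np.sum(p * np.log(p / q))   # KL(p || q)
kl_qp = np.sum(q * np.log(q / p))   # KL(q || p)

print(kl_pq, kl_qp)                 # roughly 0.37 vs 0.42: not equal
```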

Posterior Ratio Estimation of Latent Variables

no code implementations • 15 Feb 2020 • Song Liu, Yulong Zhang, Mingxuan Yi, Mladen Kolar

Density Ratio Estimation has attracted attention from the machine learning community due to its ability to compare the underlying distributions of two datasets.

Density Ratio Estimation
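
As a concrete illustration of comparing two datasets through a density ratio, the sketch below uses the common probabilistic-classification estimator: a classifier's odds between the two datasets estimate the ratio of their underlying densities. This is a generic baseline, not the posterior ratio estimator proposed in the paper, and the Gaussian data is made up so the true ratio is known.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Density ratio estimation via probabilistic classification on two toy datasets.

rng = np.random.default_rng(0)
x_num = rng.normal(0.0, 1.0, size=(1000, 1))   # samples from the numerator density
x_den = rng.normal(1.0, 1.0, size=(1000, 1))   # samples from the denominator density

X = np.vstack([x_num, x_den])
y = np.concatenate([np.ones(1000), np.zeros(1000)])

clf = LogisticRegression().fit(X, y)
p = clf.predict_proba(np.array([[0.0]]))[:, 1]
ratio_at_0 = p / (1 - p)                       # estimates p_num(0) / p_den(0)
print(ratio_at_0)                              # true value is exp(0.5), about 1.65
```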
