no code implementations • 13 Jan 2023 • Noor Fathima Ghouse, Jens Petersen, Auke Wiggers, Tianlin Xu, Guillaume Sautière
Diffusion probabilistic models have recently achieved remarkable success in generating high-quality image and video data.
1 code implementation • 30 Sep 2021 • Konstantin Klemmer, Tianlin Xu, Beatrice Acciaio, Daniel B. Neill
In this study, we propose a novel loss objective for COT-GAN, based on an autoregressive embedding, to reinforce the learning of spatio-temporal dynamics.
1 code implementation • 10 Jun 2021 • Tianlin Xu, Beatrice Acciaio
The resulting kernel conditional COT-GAN algorithm is illustrated with an application for video prediction.
no code implementations • 26 Apr 2021 • Konstantin Klemmer, Sudipan Saha, Matthias Kahl, Tianlin Xu, Xiao Xiang Zhu
Deep generative models are increasingly used to gain insights in the geospatial data domain, e.g., for climate data.
1 code implementation • NeurIPS 2020 • Tianlin Xu, Li K. Wenliang, Michael Munn, Beatrice Acciaio
We introduce COT-GAN, an adversarial algorithm to train implicit generative models optimized for producing sequential data.
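COT-GAN builds on entropic optimal transport between batches of sequences. As orientation only, here is a minimal NumPy sketch of the Sinkhorn iteration that computes a regularized transport cost between two sample batches. This is not the COT-GAN objective itself (which adds a causality constraint on the transport plan and an adversarially learned cost); the function name and parameters are our own.

```python
import numpy as np

def sinkhorn_cost(x, y, eps=1.0, n_iters=100):
    """Entropic OT cost between the empirical distributions of two
    batches of (flattened) samples, via Sinkhorn fixed-point iterations."""
    # Pairwise squared-Euclidean cost matrix between samples
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    K = np.exp(-C / eps)            # Gibbs kernel
    n, m = C.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)  # uniform marginals
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iters):
        u = a / (K @ v)             # match row marginals
        v = b / (K.T @ u)           # match column marginals
    P = u[:, None] * K * v[None, :]  # entropic transport plan
    return np.sum(P * C)
```

Shifting one batch away from the other increases the transport cost, which is the signal an OT-based generator training loop would descend.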
1 code implementation • 3 Jun 2020 • Chengchun Shi, Tianlin Xu, Wicher Bergsma, Lexin Li
In this article, we study the problem of high-dimensional conditional independence testing, a key building block in statistics and machine learning.
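As a point of contrast with the high-dimensional setting studied in this paper, the classical low-dimensional approach tests conditional independence via partial correlation under a Gaussian assumption. A minimal sketch (the function name is ours; this is not the paper's method):

```python
import numpy as np

def partial_corr_ci_test(x, y, z):
    """Test X independent of Y given Z via partial correlation
    (Gaussian assumption). Returns the Fisher z statistic; under the
    null it is approximately standard normal, so a large absolute
    value suggests conditional dependence."""
    n = len(x)
    Z = np.column_stack([np.ones(n), z])  # add intercept
    # Residualize x and y on Z via least squares
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    r = np.corrcoef(rx, ry)[0, 1]         # partial correlation
    k = z.shape[1] if z.ndim > 1 else 1   # size of conditioning set
    # Fisher z-transform with degrees-of-freedom correction
    return np.sqrt(n - k - 3) * 0.5 * np.log((1 + r) / (1 - r))
```

This residual-correlation construction breaks down when the dimension of the conditioning set grows with the sample size, which is the regime the article addresses.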
no code implementations • 27 Jul 2019 • Mingtian Zhang, Thomas Bird, Raza Habib, Tianlin Xu, David Barber
Probabilistic models are often trained by maximum likelihood, which corresponds to minimizing a specific f-divergence between the model and data distribution.
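The correspondence stated here can be made explicit: maximizing the expected log-likelihood is, up to a constant that does not depend on the model, minimizing the KL divergence, which is the f-divergence with $f(t) = t \log t$.

```latex
\max_\theta \; \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log p_\theta(x)\right]
= \max_\theta \left( -\,\mathrm{KL}\!\left(p_{\mathrm{data}} \,\|\, p_\theta\right)
  - H\!\left(p_{\mathrm{data}}\right) \right)
```

Since the data entropy $H(p_{\mathrm{data}})$ is independent of $\theta$, maximum likelihood minimizes $\mathrm{KL}(p_{\mathrm{data}} \| p_\theta) = \mathbb{E}_{p_\theta}\!\left[f\!\left(p_{\mathrm{data}}/p_\theta\right)\right]$ with $f(t) = t \log t$, i.e. one specific member of the f-divergence family.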
no code implementations • 27 Sep 2018 • Mingtian Zhang, Thomas Bird, Raza Habib, Tianlin Xu, David Barber
Probabilistic models are often trained by maximum likelihood, which corresponds to minimizing a specific form of f-divergence between the model and data distribution.