no code implementations • 7 Feb 2024 • Wei Qiao, Tushar Dogra, Otilia Stretcu, Yu-Han Lyu, Tiantian Fang, Dongjin Kwon, Chun-Ta Lu, Enming Luo, YuAn Wang, Chih-Chun Chia, Ariel Fuxman, Fangzhou Wang, Ranjay Krishna, Mehmet Tek
This study proposes a method for scaling up LLM reviews for content moderation in Google Ads.
1 code implementation • 27 Nov 2022 • Tiantian Fang, Ruoyu Sun, Alex Schwing
In contrast, we propose a Discriminator gradIent Gap regularized GAN (DigGAN) formulation, which can be added to any existing GAN.
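To make the idea concrete, here is a minimal numpy sketch of a gradient-gap style penalty: the squared difference between the average discriminator-gradient norms on real versus fake samples, added on top of the usual GAN loss. The linear-sigmoid discriminator and the names `diggan_penalty` and `lam` are illustrative assumptions, not the paper's actual code or exact objective.

```python
import numpy as np

def discriminator(x, w):
    # Toy discriminator: D(x) = sigmoid(w . x).
    return 1.0 / (1.0 + np.exp(-x @ w))

def grad_norm(x, w):
    # ||dD/dx|| in closed form for the toy discriminator:
    # dD/dx = D(1-D) * w, so the norm is D(1-D) * ||w||.
    d = discriminator(x, w)
    return d * (1.0 - d) * np.linalg.norm(w)

def diggan_penalty(real, fake, w, lam=1.0):
    # Hypothetical sketch of a gradient-gap penalty: squared gap between
    # the mean gradient norms of D on real and on fake data, scaled by lam.
    gap = grad_norm(real, w).mean() - grad_norm(fake, w).mean()
    return lam * gap ** 2

rng = np.random.default_rng(0)
real = rng.normal(1.0, 0.5, size=(64, 3))
fake = rng.normal(0.0, 0.5, size=(64, 3))
w = np.array([0.5, -0.2, 0.1])
print(diggan_penalty(real, fake, w))
```

In an actual training loop this scalar would be added to the discriminator loss, so that closing the gradient gap becomes part of the optimization.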
no code implementations • 1 Jan 2021 • Tiantian Fang, Alex Schwing, Ruoyu Sun
We use this PC-layer in two ways: 1) fixed preconditioning (FPC) adds a fixed PC-layer to all layers, and 2) adaptive preconditioning (APC) adaptively controls the strength of preconditioning.
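The two usage modes above can be sketched in a few lines of numpy, assuming a simple interpolation between the identity map and a fixed preconditioner; the function name `pc_layer` and the strength parameter `alpha` are illustrative, not the paper's interface.

```python
import numpy as np

def pc_layer(x, P, alpha=1.0):
    # Toy preconditioning layer: blend the input with its preconditioned
    # version x @ P. alpha controls the strength of preconditioning:
    # alpha=1 applies P at full strength, alpha=0 is a no-op.
    return (1.0 - alpha) * x + alpha * (x @ P)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
P = np.diag([0.5, 1.0, 2.0])        # a fixed, toy preconditioner

fpc_out = pc_layer(x, P, alpha=1.0)  # FPC: fixed, full-strength everywhere
apc_out = pc_layer(x, P, alpha=0.3)  # APC-style: strength tuned per layer
```

In the FPC regime the same `alpha` would be shared across all layers, while an APC-style scheme would adapt `alpha` during training.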
1 code implementation • NeurIPS 2020 • Ruoyu Sun, Tiantian Fang, Alex Schwing
We also perform experiments to support our theory that RpGAN has a better landscape than separable-GAN.
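The contrast between the two loss families can be illustrated with a small numpy sketch: a separable discriminator loss scores real and fake samples through independent terms, while an RpGAN-style loss depends only on pairwise score differences. The softplus formulation below is a common non-saturating choice; it is an assumption for illustration, not the paper's exact setup.

```python
import numpy as np

def softplus(t):
    # Numerically stable log(1 + exp(t)).
    return np.maximum(t, 0.0) + np.log1p(np.exp(-np.abs(t)))

def separable_d_loss(d_real, d_fake):
    # Separable GAN discriminator loss: real and fake scores enter
    # through two independent expectation terms.
    return softplus(-d_real).mean() + softplus(d_fake).mean()

def rpgan_d_loss(d_real, d_fake):
    # RpGAN-style discriminator loss: only the pairwise differences
    # D(x) - D(y) between real and fake scores matter.
    diffs = d_real[:, None] - d_fake[None, :]
    return softplus(-diffs).mean()

rng = np.random.default_rng(0)
d_real = rng.normal(1.0, 1.0, 8)
d_fake = rng.normal(-1.0, 1.0, 8)
print(separable_d_loss(d_real, d_fake), rpgan_d_loss(d_real, d_fake))
```

One immediate consequence of the pairing: shifting all discriminator scores by a constant leaves the RpGAN loss unchanged, whereas the separable loss is sensitive to such shifts.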
1 code implementation • NeurIPS 2019 • Tiantian Fang, Alexander G. Schwing
Inferring the most likely configuration for a subset of variables of a joint distribution given the remaining ones - which we refer to as co-generation - is an important challenge that is computationally demanding for all but the simplest settings.
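As a minimal sketch of the co-generation setup, the toy example below recovers the missing coordinates of a sample by optimizing a latent code so that a generator's output matches the observed coordinates. The linear generator and the helper names (`cogenerate`, `obs_idx`) are hypothetical simplifications; the paper addresses this problem in far less trivial settings.

```python
import numpy as np

def generator(z, W):
    # Toy linear "generator" mapping a latent code z to a data vector.
    return W @ z

def cogenerate(x_obs, obs_idx, W, steps=2000, lr=0.05):
    # Co-generation sketch: gradient descent on the latent code z to
    # minimize 0.5 * ||G(z)[obs] - x_obs||^2, then read off the
    # generator's values at the unobserved coordinates.
    z = np.zeros(W.shape[1])
    for _ in range(steps):
        residual = generator(z, W)[obs_idx] - x_obs
        z -= lr * (W[obs_idx].T @ residual)   # gradient of the squared loss
    return generator(z, W)

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))
z_true = np.array([0.7, -1.2])
x_full = generator(z_true, W)
obs_idx = np.array([0, 1, 2])                 # observe the first three coords
x_hat = cogenerate(x_full[obs_idx], obs_idx, W)
```

The unobserved entries of `x_hat` are the co-generated completion; in this consistent linear toy case the latent optimization reproduces the observed coordinates essentially exactly.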
no code implementations • 25 Sep 2019 • Ruoyu Sun, Tiantian Fang, Alex Schwing
In this work, we perform a global analysis of GANs from two perspectives: the global landscape of the outer-optimization problem and the global behavior of the gradient descent dynamics.