Search Results for author: Tianshi Cao

Found 11 papers, 4 papers with code

LATTE3D: Large-scale Amortized Text-To-Enhanced3D Synthesis

no code implementations 22 Mar 2024 Kevin Xie, Jonathan Lorraine, Tianshi Cao, Jun Gao, James Lucas, Antonio Torralba, Sanja Fidler, Xiaohui Zeng

Recent text-to-3D generation approaches produce impressive 3D results but require time-consuming optimization that can take up to an hour per prompt.

3D Generation · Text to 3D

TexFusion: Synthesizing 3D Textures with Text-Guided Image Diffusion Models

no code implementations ICCV 2023 Tianshi Cao, Karsten Kreis, Sanja Fidler, Nicholas Sharp, Kangxue Yin

We present TexFusion (Texture Diffusion), a new method to synthesize textures for given 3D geometries, using large-scale text-guided image diffusion models.

Denoising · Texture Synthesis

Differentially Private Diffusion Models

1 code implementation 18 Oct 2022 Tim Dockhorn, Tianshi Cao, Arash Vahdat, Karsten Kreis

While modern machine learning models rely on increasingly large training datasets, data is often limited in privacy-sensitive domains.

Image Generation

Scalable Neural Data Server: A Data Recommender for Transfer Learning

no code implementations NeurIPS 2021 Tianshi Cao, Sasha Doubov, David Acuna, Sanja Fidler

NDS uses a mixture of experts trained on data sources to estimate similarity between each source and the downstream task.

Transfer Learning

Don’t Generate Me: Training Differentially Private Generative Models with Sinkhorn Divergence

no code implementations NeurIPS 2021 Tianshi Cao, Alex Bie, Arash Vahdat, Sanja Fidler, Karsten Kreis

Generative models trained with privacy constraints on private data can sidestep this challenge, providing indirect access to private data instead.

Don't Generate Me: Training Differentially Private Generative Models with Sinkhorn Divergence

1 code implementation 1 Nov 2021 Tianshi Cao, Alex Bie, Arash Vahdat, Sanja Fidler, Karsten Kreis

Generative models trained with privacy constraints on private data can sidestep this challenge, providing indirect access to private data instead.

Differentially Private Generative Models Through Optimal Transport

no code implementations 1 Jan 2021 Tianshi Cao, Alex Bie, Karsten Kreis, Sanja Fidler

Generative models trained with privacy constraints on private data can sidestep this challenge and provide indirect access to the private data instead.

Zero-Shot Compositional Policy Learning via Language Grounding

1 code implementation 15 Apr 2020 Tianshi Cao, Jingkang Wang, Yining Zhang, Sivabalan Manivasagam

To facilitate research on language-guided agents with domain adaptation, we propose a novel zero-shot compositional policy learning task, where the environments are characterized as a composition of different attributes.

Descriptive · Domain Adaptation · +5

A Theoretical Analysis of the Number of Shots in Few-Shot Learning

no code implementations ICLR 2020 Tianshi Cao, Marc Law, Sanja Fidler

We introduce a theoretical analysis of the impact of the shot number on Prototypical Networks, a state-of-the-art few-shot classification method.

Classification · Few-Shot Learning · +1
