no code implementations • 30 Nov 2023 • Zhangsihao Yang, Mingyuan Zhou, Mengyi Shan, Bingbing Wen, Ziwei Xuan, Mitch Hill, Junjie Bai, Guo-Jun Qi, Yalin Wang
Our paper aims to generate diverse and realistic animal motion sequences from textual descriptions, without a large-scale animal text-motion dataset.
1 code implementation • 29 Oct 2022 • Mitch Hill, Erik Nijkamp, Jonathan Mitchell, Bo Pang, Song-Chun Zhu
This work proposes a method for using any generator network as the foundation of an Energy-Based Model (EBM).
1 code implementation • 24 May 2022 • Mitch Hill, Jonathan Mitchell, Chu Chen, Yuan Du, Mubarak Shah, Song-Chun Zhu
This work presents strategies to learn an Energy-Based Model (EBM) according to the desired length of its MCMC sampling trajectories.
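The trajectories in question are Langevin updates of fixed, non-convergent length. As a minimal NumPy sketch, assuming a toy quadratic energy in place of the ConvNet potential the paper actually learns (the function names, step size, and step count here are illustrative, not the paper's settings):

```python
import numpy as np

def energy_grad(x):
    # Gradient of the toy energy E(x) = 0.5 * ||x||^2.
    # The paper uses a learned ConvNet potential instead.
    return x

def short_run_langevin(x0, n_steps=100, step_size=0.1, seed=0):
    """Run a fixed-length (deliberately non-convergent) Langevin trajectory.

    Each step follows x <- x - (eps^2 / 2) * grad E(x) + eps * z,
    with z standard Gaussian noise.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - 0.5 * step_size**2 * energy_grad(x) + step_size * noise
    return x

# Initialize from noise and run a short trajectory.
x0 = np.random.default_rng(1).standard_normal(8)
sample = short_run_langevin(x0, n_steps=100, step_size=0.1)
```

The trajectory length `n_steps` is the quantity the paper treats as a design choice when learning the model.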
no code implementations • ICLR 2022 • Navid Kardan, Mubarak Shah, Mitch Hill
A supervised learning problem is often formulated using an i.i.d.
assumption on the training data.
1 code implementation • ICLR 2021 • Mitch Hill, Jonathan Mitchell, Song-Chun Zhu
Our contributions are 1) an improved method for training EBMs with realistic long-run MCMC samples, 2) an Expectation-Over-Transformation (EOT) defense that resolves theoretical ambiguities for stochastic defenses and from which the EOT attack naturally follows, and 3) state-of-the-art adversarial defense for naturally-trained classifiers and competitive defense compared to adversarially-trained classifiers on CIFAR-10, SVHN, and CIFAR-100.
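The EOT idea is to evaluate a stochastic defense in expectation over its internal randomness rather than on a single draw. A minimal NumPy sketch, with a hypothetical noise-injection defense and fixed linear classifier standing in for the paper's EBM-based purification:

```python
import numpy as np

# Hypothetical fixed linear classifier (2 features -> 2 classes).
W = np.array([[1.0, -1.0],
              [0.5,  2.0]])

def stochastic_logits(x, rng):
    # Hypothetical stochastic defense: perturb the input with random
    # noise before classifying. The paper instead purifies inputs
    # with stochastic MCMC sampling from an EBM.
    return (x + 0.1 * rng.standard_normal(x.shape)) @ W

def eot_logits(x, n_draws=64, seed=0):
    """Average logits over many draws of the defense's randomness.

    This expectation defines the classifier being defended (and,
    symmetrically, the quantity an EOT attacker differentiates).
    """
    rng = np.random.default_rng(seed)
    return np.mean([stochastic_logits(x, rng) for _ in range(n_draws)], axis=0)

x = np.array([1.0, 0.0])
pred = int(np.argmax(eot_logits(x)))
```

Averaging over draws makes the prediction well-defined for a stochastic defense, which is the ambiguity the EOT formulation resolves.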
no code implementations • NeurIPS 2019 • Erik Nijkamp, Mitch Hill, Song-Chun Zhu, Ying Nian Wu
We treat this non-convergent short-run MCMC as a learned generator model or a flow model.
2 code implementations • 29 Mar 2019 • Erik Nijkamp, Mitch Hill, Tian Han, Song-Chun Zhu, Ying Nian Wu
On the other hand, ConvNet potentials learned with non-convergent MCMC do not have a valid steady-state and cannot be considered approximate unnormalized densities of the training data because long-run MCMC samples differ greatly from observed images.
1 code implementation • 28 Dec 2018 • Tian Han, Erik Nijkamp, Xiaolin Fang, Mitch Hill, Song-Chun Zhu, Ying Nian Wu
This paper proposes the divergence triangle as a framework for jointly training a generator model, an energy-based model, and an inference model.
no code implementations • 2 Mar 2018 • Mitch Hill, Erik Nijkamp, Song-Chun Zhu
However, characterizing a learned probability density to uncover the Hopfield memories of the model, encoded by the structure of the local modes, remains an open challenge.