Search Results for author: Mitch Hill

Found 9 papers, 5 papers with code

OmniMotionGPT: Animal Motion Generation with Limited Data

no code implementations • 30 Nov 2023 • Zhangsihao Yang, Mingyuan Zhou, Mengyi Shan, Bingbing Wen, Ziwei Xuan, Mitch Hill, Junjie Bai, Guo-Jun Qi, Yalin Wang

Our paper aims to generate diverse and realistic animal motion sequences from textual descriptions, without a large-scale animal text-motion dataset.

Motion Synthesis

Learning Probabilistic Models from Generator Latent Spaces with Hat EBM

1 code implementation • 29 Oct 2022 • Mitch Hill, Erik Nijkamp, Jonathan Mitchell, Bo Pang, Song-Chun Zhu

This work proposes a method for using any generator network as the foundation of an Energy-Based Model (EBM).
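Below is a minimal, hedged sketch of the general idea of building an energy-based model on top of a generator: a scalar energy head is scored on generator outputs, and sampling is done with Langevin dynamics in latent space. This is not the paper's exact Hat EBM construction; the PyTorch networks, step sizes, and names (`generator`, `energy_head`, `langevin_sample`) are illustrative placeholders.

```python
# Illustrative sketch (NOT the exact Hat EBM): an energy defined through a
# fixed generator, sampled with Langevin dynamics in latent space.
import torch
import torch.nn as nn

latent_dim = 64

generator = nn.Sequential(            # stand-in for any pretrained generator
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, 3 * 32 * 32), nn.Tanh(),
)

energy_head = nn.Sequential(          # scalar energy on generated images
    nn.Linear(3 * 32 * 32, 256), nn.SiLU(),
    nn.Linear(256, 1),
)

def energy(z):
    """Energy of a latent code through the generator: E(z) = f(G(z))."""
    return energy_head(generator(z)).sum()

def langevin_sample(z, n_steps=50, step_size=1e-2):
    """Unadjusted Langevin dynamics on the latent-space energy."""
    z = z.clone().requires_grad_(True)
    for _ in range(n_steps):
        grad = torch.autograd.grad(energy(z), z)[0]
        z = (z - 0.5 * step_size * grad
             + torch.randn_like(z) * step_size ** 0.5).detach().requires_grad_(True)
    return z.detach()

samples = langevin_sample(torch.randn(8, latent_dim))
print(samples.shape)  # torch.Size([8, 64])
```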

EBM Life Cycle: MCMC Strategies for Synthesis, Defense, and Density Modeling

1 code implementation • 24 May 2022 • Mitch Hill, Jonathan Mitchell, Chu Chen, Yuan Du, Mubarak Shah, Song-Chun Zhu

This work presents strategies to learn an Energy-Based Model (EBM) according to the desired length of its MCMC sampling trajectories.
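As a rough illustration of why trajectory length is the central knob, here is a generic sketch of one MCMC-based EBM learning step in which the number of Langevin steps is an explicit parameter. It is not the paper's recipe; the toy 2-D data, network, and hyperparameters are placeholder assumptions.

```python
# Generic sketch of one EBM learning step; `mcmc_steps` controls the length of
# the sampling trajectory used for negative samples.
import torch
import torch.nn as nn

energy_net = nn.Sequential(nn.Linear(2, 128), nn.SiLU(), nn.Linear(128, 1))
optimizer = torch.optim.Adam(energy_net.parameters(), lr=1e-4)

def langevin(x, n_steps, step_size=1e-2):
    """Run n_steps of unadjusted Langevin dynamics on the energy."""
    x = x.clone().requires_grad_(True)
    for _ in range(n_steps):
        grad = torch.autograd.grad(energy_net(x).sum(), x)[0]
        x = (x - 0.5 * step_size * grad
             + torch.randn_like(x) * step_size ** 0.5).detach().requires_grad_(True)
    return x.detach()

def train_step(data_batch, mcmc_steps):
    """Contrastive update: lower energy on data, raise it on MCMC samples."""
    neg = langevin(torch.randn_like(data_batch), n_steps=mcmc_steps)
    loss = energy_net(data_batch).mean() - energy_net(neg).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

data = torch.randn(64, 2) * 0.5 + 1.0    # toy 2-D "dataset"
print(train_step(data, mcmc_steps=100))  # trajectory length sets the regime
```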

Adversarial Defense • Image Generation +1

Stochastic Security: Adversarial Defense Using Long-Run Dynamics of Energy-Based Models

1 code implementation • ICLR 2021 • Mitch Hill, Jonathan Mitchell, Song-Chun Zhu

Our contributions are: 1) an improved method for training EBMs with realistic long-run MCMC samples; 2) an Expectation-Over-Transformation (EOT) defense that resolves theoretical ambiguities for stochastic defenses and from which the EOT attack naturally follows; and 3) state-of-the-art adversarial defense for naturally trained classifiers and competitive defense compared to adversarially trained classifiers on CIFAR-10, SVHN, and CIFAR-100.
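The EOT idea can be sketched generically: average the classifier's output distribution over independent random draws of the stochastic defense, then classify. The snippet below assumes a placeholder `purify` step and classifier; in the paper's setting the stochastic transformation is long-run Langevin sampling from a learned EBM, which is not reproduced here.

```python
# Generic EOT-style prediction: average softmax outputs over several random
# draws of a stochastic preprocessing step before taking the argmax.
import torch
import torch.nn as nn

classifier = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))

def purify(x, noise_scale=0.1):
    """Placeholder stochastic transformation of the input."""
    return (x + noise_scale * torch.randn_like(x)).clamp(0.0, 1.0)

@torch.no_grad()
def eot_predict(x, n_draws=16):
    """Average class probabilities over independent draws of the defense."""
    probs = torch.stack([
        torch.softmax(classifier(purify(x)), dim=-1) for _ in range(n_draws)
    ]).mean(dim=0)
    return probs.argmax(dim=-1)

images = torch.rand(4, 3, 32, 32)
print(eot_predict(images))  # labels under the averaged (EOT) output
```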

Adversarial Defense • Robust classification

On the Anatomy of MCMC-Based Maximum Likelihood Learning of Energy-Based Models

2 code implementations • 29 Mar 2019 • Erik Nijkamp, Mitch Hill, Tian Han, Song-Chun Zhu, Ying Nian Wu

On the other hand, ConvNet potentials learned with non-convergent MCMC do not have a valid steady-state and cannot be considered approximate unnormalized densities of the training data because long-run MCMC samples differ greatly from observed images.
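For reference, the MCMC-based maximum-likelihood update that this analysis dissects is the standard one, written here in the common convention of an energy $U_\theta$ with $p_\theta(x) \propto \exp\{-U_\theta(x)\}$:

```latex
\nabla_\theta \, \mathbb{E}_{p_{\mathrm{data}}}\!\left[-\log p_\theta(X)\right]
  = \mathbb{E}_{p_{\mathrm{data}}}\!\left[\nabla_\theta U_\theta(X)\right]
  - \mathbb{E}_{p_\theta}\!\left[\nabla_\theta U_\theta(X)\right]
```

The second expectation is approximated with MCMC samples; when the chains are far from convergence, those samples are not draws from the model's steady state, which is the non-convergent regime the excerpt above describes.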

Anatomy

Divergence Triangle for Joint Training of Generator Model, Energy-based Model, and Inference Model

1 code implementation • 28 Dec 2018 • Tian Han, Erik Nijkamp, Xiaolin Fang, Mitch Hill, Song-Chun Zhu, Ying Nian Wu

This paper proposes the divergence triangle as a framework for joint training of the generator model, energy-based model, and inference model.

Building a Telescope to Look Into High-Dimensional Image Spaces

no code implementations • 2 Mar 2018 • Mitch Hill, Erik Nijkamp, Song-Chun Zhu

However, characterizing a learned probability density to uncover the Hopfield memories of the model, encoded by the structure of the local modes, remains an open challenge.
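The basic operation behind mapping such local modes can be sketched as noiseless gradient descent on a learned energy (hill climbing on the log-density) until the trajectory settles in a mode, whose basin of attraction then plays the role of a Hopfield memory. The energy network below is an untrained placeholder, not the paper's model.

```python
# Sketch of mode finding: deterministic gradient descent on a learned energy,
# starting from an initial image, until it settles near a local mode.
import torch
import torch.nn as nn

energy_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256),
                           nn.SiLU(), nn.Linear(256, 1))

def descend_to_mode(x, n_steps=200, step_size=1e-2):
    """Noiseless gradient descent on the energy, starting from image x."""
    x = x.clone().requires_grad_(True)
    for _ in range(n_steps):
        grad = torch.autograd.grad(energy_net(x).sum(), x)[0]
        x = (x - step_size * grad).detach().requires_grad_(True)
    return x.detach()

start = torch.rand(1, 3, 32, 32)
mode = descend_to_mode(start)
print(float(energy_net(mode)))  # typically lower energy than the start point
```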

Vocal Bursts Intensity Prediction
