Search Results for author: Priyank Jaini

Found 15 papers, 8 papers with code

Intriguing properties of generative classifiers

1 code implementation · 28 Sep 2023 · Priyank Jaini, Kevin Clark, Robert Geirhos

What is the best paradigm to recognize objects -- discriminative inference (fast but potentially prone to shortcut learning) or using a generative model (slow but potentially more robust)?

Object Recognition

Text-to-Image Diffusion Models are Zero-Shot Classifiers

no code implementations · 27 Mar 2023 · Kevin Clark, Priyank Jaini

The key idea is using a diffusion model's ability to denoise a noised image given a text description of a label as a proxy for that label's likelihood.

Attribute · Contrastive Learning +2
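
A minimal sketch of the zero-shot classification recipe summarized above, assuming a hypothetical model(noisy_image, prompt, t) callable that predicts the noise added at level t; this is an illustration of the general idea, not the authors' implementation:

    # Illustrative sketch (not the paper's code): score each candidate label by how
    # well the diffusion model denoises the image when conditioned on that label's
    # text prompt, then pick the label with the lowest reconstruction error.
    import numpy as np

    def denoising_error(image, prompt, model, num_noise_levels=8, seed=0):
        rng = np.random.default_rng(seed)
        errors = []
        for t in np.linspace(0.1, 0.9, num_noise_levels):
            noise = rng.standard_normal(image.shape)
            noisy = np.sqrt(1.0 - t) * image + np.sqrt(t) * noise
            predicted = model(noisy, prompt, t)          # hypothetical interface
            errors.append(np.mean((predicted - noise) ** 2))
        return float(np.mean(errors))

    def classify(image, class_names, model, template="a photo of a {}"):
        # Lower denoising error under a prompt is used as a proxy for higher likelihood.
        scores = {c: denoising_error(image, template.format(c), model) for c in class_names}
        return min(scores, key=scores.get)

    # Wiring check with a dummy model that ignores the prompt:
    dummy_model = lambda noisy, prompt, t: np.zeros_like(noisy)
    print(classify(np.zeros((8, 8)), ["cat", "dog"], dummy_model))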

Stochastic Optimal Control for Collective Variable Free Sampling of Molecular Transition Paths

1 code implementation · NeurIPS 2023 · Lars Holdijk, Yuanqi Du, Ferry Hooft, Priyank Jaini, Bernd Ensing, Max Welling

We consider the problem of sampling transition paths between two given metastable states of a molecular system, e.g. a folded and unfolded protein, or the products and reactants of a chemical reaction.

Dimensionality Reduction

Particle Dynamics for Learning EBMs

1 code implementation · 26 Nov 2021 · Kirill Neklyudov, Priyank Jaini, Max Welling

We accomplish this by viewing the evolution of the modeling distribution as (i) the evolution of the energy function, and (ii) the evolution of the samples from this distribution along some vector field.

Learning Equivariant Energy Based Models with Equivariant Stein Variational Gradient Descent

no code implementations · NeurIPS 2021 · Priyank Jaini, Lars Holdijk, Max Welling

We focus on the problem of efficient sampling and learning of probability densities by incorporating symmetries in probabilistic models.

Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC

1 code implementation · 4 Feb 2021 · Priyank Jaini, Didrik Nielsen, Max Welling

Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
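
For background, a bare-bones Hamiltonian (Hybrid) Monte Carlo step on a differentiable log-density looks roughly as follows; this is the generic algorithm, not the paper's SurVAE-flow-augmented sampler for combinatorial spaces:

    # Generic Hybrid/Hamiltonian Monte Carlo: leapfrog integration of Hamiltonian
    # dynamics followed by a Metropolis accept/reject correction.
    import numpy as np

    def hmc_step(x, log_prob, grad_log_prob, step_size=0.1, n_leapfrog=20, rng=None):
        rng = rng or np.random.default_rng()
        p = rng.standard_normal(x.shape)                      # resample momentum
        x_new, p_new = x.copy(), p.copy()
        p_new += 0.5 * step_size * grad_log_prob(x_new)       # half step for momentum
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new                        # full step for position
            p_new += step_size * grad_log_prob(x_new)         # full step for momentum
        x_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_prob(x_new)       # closing half step
        h_current = -log_prob(x) + 0.5 * np.sum(p ** 2)       # total energy before
        h_proposed = -log_prob(x_new) + 0.5 * np.sum(p_new ** 2)
        return x_new if np.log(rng.uniform()) < h_current - h_proposed else x

    # Example: draw samples from a standard 2-D Gaussian.
    log_prob = lambda x: -0.5 * np.sum(x ** 2)
    grad_log_prob = lambda x: -x
    x, samples = np.zeros(2), []
    for _ in range(1000):
        x = hmc_step(x, log_prob, grad_log_prob)
        samples.append(x)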

Argmax Flows: Learning Categorical Distributions with Normalizing Flows

no code implementations · AABI (Advances in Approximate Bayesian Inference) Symposium 2021 · Emiel Hoogeboom, Didrik Nielsen, Priyank Jaini, Patrick Forré, Max Welling

This paper introduces a new method to define and train continuous distributions such as normalizing flows directly on categorical data, for example text and image segmentation.

Image Segmentation · Semantic Segmentation
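
The "argmax" idea can be illustrated in isolation: a category k is represented by a continuous vector whose largest coordinate sits at index k, so a density model over the continuous vector induces a distribution over categories. A toy numpy sketch of such a lifting (an illustration of the general trick, not the paper's variational training procedure):

    # Toy argmax lifting: embed category k in a continuous vector v with argmax(v) == k,
    # and recover the category deterministically with argmax.
    import numpy as np

    rng = np.random.default_rng(0)

    def lift(k, num_classes):
        """Sample a continuous vector whose argmax equals category k."""
        v = rng.standard_normal(num_classes)
        v[k] = np.max(v) + 1.0          # force coordinate k to be the strict maximum
        return v

    def project(v):
        """Deterministic right-inverse of the lifting: recover the category."""
        return int(np.argmax(v))

    k = 3
    v = lift(k, num_classes=10)
    assert project(v) == k              # the continuous vector decodes back to the category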

Self Normalizing Flows

1 code implementation · 14 Nov 2020 · T. Anderson Keller, Jorn W. T. Peters, Priyank Jaini, Emiel Hoogeboom, Patrick Forré, Max Welling

Efficient gradient computation of the Jacobian determinant term is a core problem in many machine learning settings, and especially so in the normalizing flow framework.
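
The term in question is the log-determinant of the flow's Jacobian in the change-of-variables formula. A small numpy example for a single dense affine layer shows where it enters and why it is expensive in general; this is generic background, not the self-normalizing construction proposed in the paper:

    # Change of variables for one affine flow layer z = W x + b:
    #   log p_X(x) = log p_Z(W x + b) + log |det W|.
    # For a dense D x D matrix, the exact log-determinant (and its gradient) costs
    # O(D^3), which is the bottleneck that structured or approximate flows avoid.
    import numpy as np

    rng = np.random.default_rng(0)
    D = 5
    W, b, x = rng.standard_normal((D, D)), rng.standard_normal(D), rng.standard_normal(D)

    z = W @ x + b
    log_base = -0.5 * np.sum(z ** 2) - 0.5 * D * np.log(2 * np.pi)   # standard normal base density
    sign, logabsdet = np.linalg.slogdet(W)                           # O(D^3) in general
    log_likelihood = log_base + logabsdet
    print(log_likelihood)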

SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

3 code implementations · NeurIPS 2020 · Didrik Nielsen, Priyank Jaini, Emiel Hoogeboom, Ole Winther, Max Welling

Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions.

A Positivstellensatz for Conditional SAGE Signomials

no code implementations · 8 Mar 2020 · Allen Houze Wang, Priyank Jaini, Yao-Liang Yu, Pascal Poupart

Recently, the conditional SAGE certificate has been proposed as a sufficient condition for signomial positivity over a convex set.
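
As background on the terminology (standard definitions, not specific to this paper): a signomial on R^n is a finite sum of exponentials of linear functions,

    f(x) = \sum_{i=1}^{m} c_i \exp\big(\alpha_i^{\top} x\big),
    \qquad c_i \in \mathbb{R}, \quad \alpha_i \in \mathbb{R}^{n},

and a SAGE-type certificate is a sufficient condition, checkable by convex optimization, for f(x) \ge 0; the conditional variant certifies nonnegativity only over a prescribed convex set X \subseteq \mathbb{R}^{n} rather than over all of \mathbb{R}^{n}.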

Tails of Lipschitz Triangular Flows

no code implementations · ICML 2020 · Priyank Jaini, Ivan Kobyzev, Yao-Liang Yu, Marcus Brubaker

We investigate the ability of popular flow based methods to capture tail-properties of a target density by studying the increasing triangular maps used in these flow methods acting on a tractable source density.

Sum-of-Squares Polynomial Flow

2 code implementations · 7 May 2019 · Priyank Jaini, Kira A. Selby, Yao-Liang Yu

The triangular map is a recent construct in probability theory that allows one to transform any source probability density function into any target density function.

Density Estimation
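
The construct referred to here is the increasing triangular (Knothe–Rosenblatt) map. The standard change-of-variables identity below (background, not the paper's specific sum-of-squares polynomial parameterization) shows why the triangular structure keeps density evaluation tractable:

    T(z) = \big( T_1(z_1),\; T_2(z_1, z_2),\; \dots,\; T_d(z_1, \dots, z_d) \big),
    \qquad \frac{\partial T_k}{\partial z_k} > 0,

    p_X(x) = p_Z(z) \prod_{k=1}^{d} \left( \frac{\partial T_k}{\partial z_k}(z) \right)^{-1},
    \qquad z = T^{-1}(x).

Because the Jacobian of T is triangular, its determinant is just the product of the diagonal partial derivatives, and each component can be inverted by a one-dimensional monotone root search.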

Deep Homogeneous Mixture Models: Representation, Separation, and Approximation

no code implementations · NeurIPS 2018 · Priyank Jaini, Pascal Poupart, Yao-Liang Yu

At their core, many unsupervised learning models provide a compact representation of homogeneous density mixtures, but their similarities and differences are not always clearly understood.

Density Estimation

Online and Distributed learning of Gaussian mixture models by Bayesian Moment Matching

no code implementations · 19 Sep 2016 · Priyank Jaini, Pascal Poupart

The Gaussian mixture model is a classic technique for clustering and data modeling that is used in numerous applications.

Clustering
