Search Results for author: Conor Durkan

Found 10 papers, 5 papers with code

Reduce, Reuse, Recycle: Compositional Generation with Energy-Based Diffusion Models and MCMC

2 code implementations · 22 Feb 2023 · Yilun Du, Conor Durkan, Robin Strudel, Joshua B. Tenenbaum, Sander Dieleman, Rob Fergus, Jascha Sohl-Dickstein, Arnaud Doucet, Will Grathwohl

In this work, we build upon these ideas using the score-based interpretation of diffusion models, and explore alternative ways to condition, modify, and reuse diffusion models for tasks involving compositional generation and guidance.

Text-to-Image Generation
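The score-based composition idea can be illustrated with a toy product of experts: because the log-density of a product is a sum of log-densities, the scores of the component models simply add, and MCMC (here unadjusted Langevin dynamics) draws samples from the composed distribution. The following is a hypothetical 1-D sketch, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two unit-variance Gaussian "experts" (hypothetical toy score models).
def score_a(x):
    return -(x + 1.0)   # d/dx log N(x; -1, 1)

def score_b(x):
    return -(x - 1.0)   # d/dx log N(x; +1, 1)

def composed_score(x):
    # Product of experts: log p_a + log p_b, so the scores add.
    return score_a(x) + score_b(x)

# Unadjusted Langevin dynamics targeting the product distribution.
step = 1e-2
x = rng.standard_normal(5000)
for _ in range(2000):
    x = x + step * composed_score(x) \
        + np.sqrt(2 * step) * rng.standard_normal(x.shape)

# The product of N(-1, 1) and N(1, 1) is N(0, 0.5),
# so the chain's samples should have mean ~0 and variance ~0.5.
print(x.mean(), x.var())
```

The same additivity is what lets pretrained diffusion models be recombined for compositional generation: each conditioning constraint contributes its score, and an MCMC sampler corrects the composition.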

Continuous diffusion for categorical data

no code implementations · 28 Nov 2022 · Sander Dieleman, Laurent Sartran, Arman Roshannai, Nikolay Savinov, Yaroslav Ganin, Pierre H. Richemond, Arnaud Doucet, Robin Strudel, Chris Dyer, Conor Durkan, Curtis Hawthorne, Rémi Leblond, Will Grathwohl, Jonas Adler

Diffusion models have quickly become the go-to paradigm for generative modelling of perceptual signals (such as images and sound) through iterative refinement.

Language Modelling

Maximum Likelihood Training of Score-Based Diffusion Models

3 code implementations · NeurIPS 2021 · Yang Song, Conor Durkan, Iain Murray, Stefano Ermon

Score-based diffusion models synthesize samples by reversing a stochastic process that diffuses data to noise, and are trained by minimizing a weighted combination of score matching losses.

Ranked #6 on Image Generation on ImageNet 32x32 (bpd metric)

Data Augmentation · Image Generation
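The score-matching objective the abstract refers to can be sketched at a single noise level: denoising score matching regresses a score model onto the conditional score of the Gaussian perturbation kernel, and the true marginal score is the minimizer. A toy 1-D illustration with hypothetical Gaussian data (not the paper's setup or weighting):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5                            # noise scale at one toy diffusion time
x0 = rng.standard_normal(200_000)      # data ~ N(0, 1)
eps = rng.standard_normal(200_000)
xt = x0 + sigma * eps                  # perturbed samples

def dsm_loss(score_fn):
    # Denoising score matching: regress onto the conditional score
    # of the perturbation kernel, -(xt - x0) / sigma^2.
    target = -(xt - x0) / sigma**2
    return np.mean((score_fn(xt) - target) ** 2)

# The perturbed marginal is N(0, 1 + sigma^2), so its score minimizes the loss.
true_score = lambda x: -x / (1 + sigma**2)
wrong_score = lambda x: -x             # mis-specified: score of N(0, 1)

print(dsm_loss(true_score), dsm_loss(wrong_score))
```

The paper's contribution is the choice of time-dependent weighting for losses like this one, under which the weighted combination upper-bounds the negative log-likelihood.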

SBI -- A toolkit for simulation-based inference

no code implementations · 17 Jul 2020 · Alvaro Tejero-Cantero, Jan Boelts, Michael Deistler, Jan-Matthis Lueckmann, Conor Durkan, Pedro J. Gonçalves, David S. Greenberg, Jakob H. Macke

sbi facilitates inference on black-box simulators for practising scientists and engineers by providing a unified interface to state-of-the-art algorithms together with documentation and tutorials.

Bayesian Inference

On Contrastive Learning for Likelihood-free Inference

5 code implementations · ICML 2020 · Conor Durkan, Iain Murray, George Papamakarios

Likelihood-free methods perform parameter inference in stochastic simulator models where evaluating the likelihood is intractable but sampling synthetic data is possible.

Contrastive Learning
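The contrastive idea can be sketched with the density-ratio trick: a classifier trained to distinguish dependent (θ, x) pairs from shuffled ones has a logit that estimates log p(x|θ) − log p(x), which is exactly the quantity needed to turn the prior into the posterior. A toy NumPy sketch with a hypothetical Gaussian simulator (the paper uses neural networks, not these hand-picked features):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulator: theta ~ N(0, 1), x | theta ~ N(theta, 1).
n = 20_000
theta = rng.standard_normal(n)
x = theta + rng.standard_normal(n)

# Positives: dependent (theta, x) pairs from the joint.
# Negatives: the same thetas paired with shuffled simulations.
x_neg = rng.permutation(x)

def features(t, xx):
    # Quadratic features: for this Gaussian toy model the true log ratio
    # log p(x|theta) - log p(x) is quadratic in (theta, x), so logistic
    # regression on these features can represent it.
    return np.stack([np.ones_like(t), t, xx, t * xx, t**2, xx**2], axis=1)

F = np.concatenate([features(theta, x), features(theta, x_neg)])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Logistic regression by gradient descent; sigmoid(F @ w) estimates the
# probability a pair came from the joint, and its logit is the log ratio.
w = np.zeros(F.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w -= 0.1 * F.T @ (p - y) / len(y)

p = 1.0 / (1.0 + np.exp(-F @ w))
final_loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
p_pos, p_neg = p[:n], p[n:]
print(final_loss, p_pos.mean(), p_neg.mean())
```

After training, the classifier assigns higher joint-probability to dependent pairs than to shuffled ones, and its cross-entropy drops below the chance level of log 2 ≈ 0.693.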

Neural Spline Flows

8 code implementations · NeurIPS 2019 · Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios

A normalizing flow models a complex probability density as an invertible transformation of a simple base density.

Density Estimation · Variational Inference
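The change-of-variables rule behind normalizing flows can be sketched with the simplest invertible transform, a 1-D affine map (neural spline flows replace this with more expressive monotonic rational-quadratic splines):

```python
import numpy as np

# A one-dimensional normalizing flow: an invertible affine transform
# y = a*z + b applied to a standard-normal base density (toy example).
a, b = 2.0, -1.0

def base_logpdf(z):
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

def flow_logpdf(y):
    z = (y - b) / a                          # invert the transform
    return base_logpdf(z) - np.log(abs(a))   # change-of-variables correction

# The flow density should match the analytic density N(b, a^2).
y = np.linspace(-5.0, 5.0, 11)
analytic = -0.5 * ((y - b) / a)**2 - np.log(a) - 0.5 * np.log(2 * np.pi)
print(np.max(np.abs(flow_logpdf(y) - analytic)))
```

Richer flows stack many such invertible layers; the log-density is always the base log-density of the inverted point plus the sum of log-Jacobian corrections.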

Cubic-Spline Flows

no code implementations · 5 Jun 2019 · Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios

A normalizing flow models a complex probability density as an invertible transformation of a simple density.

Density Estimation

Autoregressive Energy Machines

1 code implementation · 11 Apr 2019 · Charlie Nash, Conor Durkan

We propose the Autoregressive Energy Machine, an energy-based model which simultaneously learns an unnormalized density and computes an importance-sampling estimate of the normalizing constant for each conditional in an autoregressive decomposition.

Density Estimation
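The importance-sampling estimate of the normalizing constant can be sketched for a single 1-D energy with a fixed Gaussian proposal (in the AEM the proposal is learned per conditional; this toy is a hypothetical stand-in):

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Toy unnormalized model exp(-E(x)) with E(x) = x^2 / 2,
    # so the true normalizer is Z = sqrt(2*pi).
    return 0.5 * x**2

# Importance sampling: Z = E_q[exp(-E(x)) / q(x)] for samples x ~ q.
# The proposal is deliberately wider than the target to keep weights bounded.
scale = 1.5
xs = scale * rng.standard_normal(100_000)
log_q = -0.5 * (xs / scale)**2 - np.log(scale) - 0.5 * np.log(2 * np.pi)
z_hat = np.mean(np.exp(-energy(xs) - log_q))

print(z_hat, np.sqrt(2 * np.pi))
```

The estimate converges to Z as the number of proposal samples grows, which is what lets an energy-based conditional be trained and evaluated by approximate maximum likelihood.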

Sequential Neural Methods for Likelihood-free Inference

no code implementations · 21 Nov 2018 · Conor Durkan, George Papamakarios, Iain Murray

Likelihood-free inference refers to inference when a likelihood function cannot be explicitly evaluated, which is often the case for models based on simulators.

The Context-Aware Learner

no code implementations · ICLR 2018 · Conor Durkan, Amos Storkey, Harrison Edwards

Such reasoning requires learning disentangled representations of data which are interpretable in isolation, but can also be combined in a new, unseen scenario.

Meta-Learning
