2 code implementations • 22 Feb 2023 • Yilun Du, Conor Durkan, Robin Strudel, Joshua B. Tenenbaum, Sander Dieleman, Rob Fergus, Jascha Sohl-Dickstein, Arnaud Doucet, Will Grathwohl
In this work, we build upon these ideas using the score-based interpretation of diffusion models, and explore alternative ways to condition, modify, and reuse diffusion models for tasks involving compositional generation and guidance.
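Under the score-based view, a product composition of two models is especially simple: log-densities add, so their scores add, and the summed score can drive a sampler directly. A minimal NumPy sketch with two hypothetical one-dimensional Gaussian "experts" (illustrative only, not the paper's code):

```python
import numpy as np

def score_a(x):
    # Score of a Gaussian expert centred at -1: d/dx log N(x; -1, 1)
    return -(x + 1.0)

def score_b(x):
    # Score of a Gaussian expert centred at +1: d/dx log N(x; +1, 1)
    return -(x - 1.0)

def composed_score(x):
    # Product composition: log p_a + log p_b differentiates to a sum of scores
    return score_a(x) + score_b(x)

def langevin_sample(score, x0, step=0.01, n_steps=2000, seed=0):
    # Unadjusted Langevin dynamics driven by the (composed) score
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x + step * score(x) + np.sqrt(2 * step) * rng.standard_normal(np.shape(x))
    return x
```

For these two experts the product is N(0, 1/2), so Langevin samples concentrate around zero with roughly half the variance of either expert.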
no code implementations • 28 Nov 2022 • Sander Dieleman, Laurent Sartran, Arman Roshannai, Nikolay Savinov, Yaroslav Ganin, Pierre H. Richemond, Arnaud Doucet, Robin Strudel, Chris Dyer, Conor Durkan, Curtis Hawthorne, Rémi Leblond, Will Grathwohl, Jonas Adler
Diffusion models have quickly become the go-to paradigm for generative modelling of perceptual signals (such as images and sound) through iterative refinement.
3 code implementations • NeurIPS 2021 • Yang Song, Conor Durkan, Iain Murray, Stefano Ermon
Score-based diffusion models synthesize samples by reversing a stochastic process that diffuses data to noise, and are trained by minimizing a weighted combination of score matching losses.
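The training objective can be sketched concretely: at each noise level, denoising score matching regresses the model's score onto the score of the Gaussian perturbation kernel, and the per-level losses are combined with a weighting function. A toy NumPy sketch (the weighting λ(σ) here is illustrative, not the paper's specific choice):

```python
import numpy as np

def dsm_loss(score_fn, x0, sigma, rng):
    # Denoising score matching at one noise level sigma: perturb the data,
    # then regress the model score onto the score of the perturbation
    # kernel N(x_t; x0, sigma^2), which is -(x_t - x0) / sigma^2.
    eps = rng.standard_normal(x0.shape)
    xt = x0 + sigma * eps
    target = -(xt - x0) / sigma**2
    return np.mean((score_fn(xt, sigma) - target) ** 2)

def weighted_dsm(score_fn, x0, sigmas, rng, weight=lambda s: s**2):
    # Weighted combination over noise levels; different choices of
    # weight(sigma) recover different members of this family of losses.
    return sum(weight(s) * dsm_loss(score_fn, x0, s, rng) for s in sigmas)
```

An oracle check: if the data is a point mass at zero, the true score of the perturbed distribution is exactly the regression target, so the loss vanishes.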
Ranked #6 on Image Generation on ImageNet 32x32 (bpd metric)
no code implementations • 17 Jul 2020 • Alvaro Tejero-Cantero, Jan Boelts, Michael Deistler, Jan-Matthis Lueckmann, Conor Durkan, Pedro J. Gonçalves, David S. Greenberg, Jakob H. Macke
sbi facilitates inference on black-box simulators for practising scientists and engineers by providing a unified interface to state-of-the-art algorithms together with documentation and tutorials.
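The black-box setting can be illustrated with the simplest possible baseline, rejection ABC: draw parameters from the prior, simulate, and keep draws whose output lands near the observation. This toy sketch is only an illustration of the problem setting; sbi itself provides neural, state-of-the-art algorithms behind one interface, and all names below are hypothetical:

```python
import numpy as np

def simulator(theta, rng):
    # Black-box simulator: we can sample data given parameters,
    # but have no tractable likelihood (toy Gaussian example).
    return theta + rng.standard_normal(theta.shape)

def rejection_abc(observed, prior_sample, n_draws=20000, tol=0.1, seed=0):
    # Keep prior draws whose simulated data falls within tol of the
    # observation; the accepted draws approximate the posterior.
    rng = np.random.default_rng(seed)
    thetas = prior_sample(n_draws, rng)
    sims = simulator(thetas, rng)
    return thetas[np.abs(sims - observed) < tol]
```

With a uniform prior and an observation of 1.0, the accepted parameters cluster around 1.0, as the exact posterior predicts.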
5 code implementations • ICML 2020 • Conor Durkan, Iain Murray, George Papamakarios
Likelihood-free methods perform parameter inference in stochastic simulator models where evaluating the likelihood is intractable but sampling synthetic data is possible.
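One way to exploit "sampling is possible, evaluating is not" is contrastive ratio estimation: train a classifier to distinguish dependent (parameter, data) pairs from shuffled pairs, so its logit approximates the likelihood-to-evidence log-ratio. A hedged scikit-learn sketch on a toy Gaussian simulator (feature map and all names are illustrative, not the paper's method in detail):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy simulator: x | theta ~ N(theta, 1), with theta drawn from a N(0, 2^2) prior.
n = 5000
theta = rng.normal(0.0, 2.0, n)
x = theta + rng.standard_normal(n)

def phi(t, xv):
    # Quadratic features so a linear classifier can represent the
    # Gaussian log-ratio, which contains a theta * x interaction term.
    return np.stack([t, xv, t * xv, t**2, xv**2], axis=-1)

# Positive class: dependent (theta, x) pairs from the joint.
# Negative class: shuffled pairs, distributed as the product of marginals.
theta_shuf = rng.permutation(theta)
feats = np.concatenate([phi(theta, x), phi(theta_shuf, x)])
labels = np.concatenate([np.ones(n), np.zeros(n)])

# The fitted logit approximates log p(theta, x) / (p(theta) p(x)),
# i.e. log p(x | theta) - log p(x), which is enough for posterior inference.
clf = LogisticRegression(max_iter=1000).fit(feats, labels)
```

Sanity check: for theta = 2, data near 2 should score a higher estimated ratio than data near -2.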
8 code implementations • NeurIPS 2019 • Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios
A normalizing flow models a complex probability density as an invertible transformation of a simple base density.
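The change-of-variables formula behind this is short enough to write out. A minimal single-layer affine flow in NumPy, with a standard normal base density (illustrative; spline-based layers replace the affine map in practice):

```python
import numpy as np

def forward(z, scale, shift):
    # One invertible affine layer: x = scale * z + shift
    return scale * z + shift

def inverse(x, scale, shift):
    return (x - shift) / scale

def log_prob(x, scale, shift):
    # Change of variables:
    # log p(x) = log p_base(f^{-1}(x)) + log |det d f^{-1} / dx|
    z = inverse(x, scale, shift)
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))  # standard normal base
    return log_base - np.log(np.abs(scale))
```

With scale 2 and shift 1 the flow pushes N(0, 1) to N(1, 4), and `log_prob` matches that density exactly.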
no code implementations • 5 Jun 2019 • Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios
A normalizing flow models a complex probability density as an invertible transformation of a simple density.
1 code implementation • 11 Apr 2019 • Charlie Nash, Conor Durkan
We propose the Autoregressive Energy Machine, an energy-based model which simultaneously learns an unnormalized density and computes an importance-sampling estimate of the normalizing constant for each conditional in an autoregressive decomposition.
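The normalizing-constant estimate for each conditional can be sketched with plain importance sampling: draw from a tractable proposal q and average the ratio exp(-E(x)) / q(x). A toy NumPy sketch for a single one-dimensional energy (the energy and proposal here are hypothetical stand-ins, not the model's learned ones):

```python
import numpy as np

def energy(x):
    # Toy unnormalized model: E(x) = x^2 / 2, so exp(-E) is an
    # unnormalized standard Gaussian with true Z = sqrt(2 * pi).
    return 0.5 * x**2

def estimate_log_z(energy, n=100000, seed=0):
    # Importance sampling with a wider N(0, 2^2) proposal q:
    # Z = E_q[ exp(-E(x)) / q(x) ]
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 2.0, n)
    log_q = -0.5 * (x / 2.0) ** 2 - np.log(2.0) - 0.5 * np.log(2 * np.pi)
    log_w = -energy(x) - log_q
    m = log_w.max()  # log-mean-exp for numerical stability
    return m + np.log(np.mean(np.exp(log_w - m)))
```

Because the proposal is heavier-tailed than the target, the weights are bounded and the estimate is low-variance: it recovers log Z = 0.5 log(2π) to a few decimal places.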
no code implementations • 21 Nov 2018 • Conor Durkan, George Papamakarios, Iain Murray
Likelihood-free inference refers to inference when a likelihood function cannot be explicitly evaluated, which is often the case for models based on simulators.
no code implementations • ICLR 2018 • Conor Durkan, Amos Storkey, Harrison Edwards
Such reasoning requires learning disentangled representations of data which are interpretable in isolation, but can also be combined in a new, unseen scenario.