no code implementations • 19 Apr 2024 • Grigory Bartosh, Dmitry Vetrov, Christian A. Naesseth
To address these limitations, we introduce Neural Flow Diffusion Models (NFDM), a novel framework that enhances diffusion models by supporting a broader range of forward processes beyond the fixed linear Gaussian.
no code implementations • 14 Mar 2024 • Heiko Zimmermann, Christian A. Naesseth, Jan-Willem van de Meent
We present variational inference with sequential sample-average approximation (VISA), a method for approximate inference in computationally intensive models, such as those based on numerical simulations.
no code implementations • 12 Oct 2023 • Grigory Bartosh, Dmitry Vetrov, Christian A. Naesseth
In this paper, we present Neural Diffusion Models (NDMs), a generalization of conventional diffusion models that enables defining and learning time-dependent non-linear transformations of data.
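The conventional forward process that NDMs generalize is a fixed linear Gaussian corruption of the data. A minimal sketch of that baseline (with an assumed cosine schedule, not the paper's construction): NDMs replace the fixed linear map `alpha_t * x0` with a learnable, time-dependent non-linear transformation of the data.

```python
import numpy as np

rng = np.random.default_rng(0)

def marginal_sample(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) under a variance-preserving schedule.

    Conventional diffusion uses this fixed linear Gaussian form
    x_t = alpha_t * x_0 + sigma_t * eps; the schedule below is an
    illustrative assumption, not the one from the paper.
    """
    alpha_t = np.cos(0.5 * np.pi * t)   # assumed cosine schedule
    sigma_t = np.sin(0.5 * np.pi * t)   # alpha_t^2 + sigma_t^2 = 1
    eps = rng.standard_normal(x0.shape)
    return alpha_t * x0 + sigma_t * eps

x0 = rng.standard_normal(10_000)
x1 = marginal_sample(x0, 1.0, rng)  # at t = 1 the marginal is ~ N(0, 1)
print(x1.mean(), x1.std())
```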
no code implementations • 24 Oct 2022 • Teodora Pandeva, Tim Bakker, Christian A. Naesseth, Patrick Forré
Compared to $p$-value-based tests, tests based on E-values have finite-sample guarantees for the type I error.
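The finite-sample guarantee follows from Markov's inequality: an E-value is a nonnegative statistic with expectation at most 1 under the null, so rejecting when E >= 1/alpha controls the type I error at level alpha for any sample size. A small sketch with an assumed likelihood-ratio E-value (not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.05
n_trials, n_samples = 20_000, 10

# Data generated under the null H0: X ~ N(0, 1).
x = rng.normal(0.0, 1.0, size=(n_trials, n_samples))

# The likelihood ratio N(1, 1) / N(0, 1) of the sample is an E-value:
# its expectation under H0 is exactly 1. Per observation,
# log N(x; 1, 1) - log N(x; 0, 1) = x - 1/2.
log_e = np.sum(x - 0.5, axis=1)
e_values = np.exp(log_e)

# Reject when E >= 1/alpha; by Markov's inequality the rejection rate
# under H0 is at most alpha, at this (or any) sample size.
type_i = np.mean(e_values >= 1.0 / alpha)
print(type_i)  # empirically below alpha = 0.05
```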
no code implementations • 14 Oct 2022 • Heiko Zimmermann, Fredrik Lindsten, Jan-Willem van de Meent, Christian A. Naesseth
Generative flow networks (GFNs) are a class of models for sequential sampling of composite objects, which approximate a target distribution that is defined in terms of an energy function or a reward.
1 code implementation • 3 Feb 2022 • Liyi Zhang, David M. Blei, Christian A. Naesseth
Variational inference often minimizes the "reverse" Kullback-Leibler (KL) divergence KL(q||p) from the approximate distribution q to the posterior p. Recent work studies the "forward" KL divergence KL(p||q), which, unlike the reverse KL, does not lead to variational approximations that underestimate uncertainty.
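The contrast between the two directions can be seen numerically. An illustrative sketch (not from the paper): fit a Gaussian q = N(mu, s^2) to a bimodal target by grid search, minimizing each KL direction by quadrature. The reverse KL locks onto one mode with a small standard deviation (underestimating the target's spread), while the forward KL moment-matches and covers both modes.

```python
import numpy as np

def norm_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Bimodal target: p = 0.5 N(-4, 1) + 0.5 N(4, 1).
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
p = 0.5 * norm_pdf(x, -4.0, 1.0) + 0.5 * norm_pdf(x, 4.0, 1.0)

def kl(a, b):
    """KL(a || b) by numerical quadrature on the grid x."""
    m = a > 1e-300
    with np.errstate(divide="ignore"):
        return float(np.sum(a[m] * np.log(a[m] / b[m])) * dx)

mus = np.linspace(-6.0, 6.0, 61)
sigmas = np.linspace(0.5, 6.0, 56)
best_rev, best_fwd = (np.inf, None), (np.inf, None)
for mu in mus:
    for s in sigmas:
        q = norm_pdf(x, mu, s)
        r, f = kl(q, p), kl(p, q)
        if r < best_rev[0]:
            best_rev = (r, (mu, s))
        if f < best_fwd[0]:
            best_fwd = (f, (mu, s))

(mu_rev, s_rev), (mu_fwd, s_fwd) = best_rev[1], best_fwd[1]
# Reverse KL: mu near one mode (+-4), s near 1 (mode-seeking).
# Forward KL: mu near 0, s near sqrt(17) ~ 4.12 (moment-matching).
print(mu_rev, s_rev, mu_fwd, s_fwd)
```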
1 code implementation • 31 May 2021 • Antonio Khalil Moretti, Liyi Zhang, Christian A. Naesseth, Hadiah Venner, David Blei, Itsik Pe'er
Bayesian phylogenetic inference is often conducted via local or sequential search over topologies and branch lengths using algorithms such as random-walk Markov chain Monte Carlo (MCMC) or Combinatorial Sequential Monte Carlo (CSMC).
no code implementations • NeurIPS 2020 • Christian A. Naesseth, Fredrik Lindsten, David Blei
Modern variational inference (VI) uses stochastic gradients to avoid intractable expectations, enabling large-scale probabilistic inference in complex models.
no code implementations • 12 Mar 2019 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations.
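A minimal instance of this problem, with an assumed example rather than anything from the paper: estimating an expectation E_p[f(x)] when we can only draw samples from a simpler proposal q. Self-normalized importance sampling reweights proposal draws by w proportional to p(x)/q(x):

```python
import numpy as np

rng = np.random.default_rng(0)

# Target p = N(3, 1), proposal q = N(0, 3^2), f(x) = x.
# Normalizing constants cancel in self-normalized weights, so we can
# work with unnormalized log densities.
xs = rng.normal(0.0, 3.0, 200_000)
log_p = -0.5 * (xs - 3.0) ** 2
log_q = -0.5 * (xs / 3.0) ** 2
w = np.exp(log_p - log_q)
w /= w.sum()

est = np.sum(w * xs)
print(est)  # close to E_p[x] = 3
```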
1 code implementation • 31 May 2017 • Christian A. Naesseth, Scott W. Linderman, Rajesh Ranganath, David M. Blei
The success of variational approaches depends on (i) formulating a flexible parametric family of distributions, and (ii) optimizing the parameters to find the member of this family that most closely approximates the exact posterior.
no code implementations • 29 Dec 2016 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
Sequential Monte Carlo (SMC) methods comprise one of the most successful approaches to approximate Bayesian filtering.
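The simplest member of this family is the bootstrap particle filter: propagate particles from the transition model, weight them by the observation likelihood, and resample. A sketch on a scalar linear-Gaussian state space model (an illustration of basic SMC, not the paper's algorithm), checked against the exact Kalman filter:

```python
import numpy as np

# Model: x_t = 0.9 x_{t-1} + v_t, v_t ~ N(0, q)
#        y_t = x_t + e_t,        e_t ~ N(0, r)
rng = np.random.default_rng(1)
a, q, r, T, N = 0.9, 0.25, 0.25, 30, 5000

# Simulate data from the model (x_0 = 0).
x_true, y = np.zeros(T), np.zeros(T)
x_prev = 0.0
for t in range(T):
    x_prev = a * x_prev + rng.normal(0, np.sqrt(q))
    x_true[t] = x_prev
    y[t] = x_prev + rng.normal(0, np.sqrt(r))

# Bootstrap particle filter.
particles = np.zeros(N)  # all particles start at x_0 = 0
pf_mean = np.zeros(T)
for t in range(T):
    particles = a * particles + rng.normal(0, np.sqrt(q), N)  # propagate
    logw = -0.5 * (y[t] - particles) ** 2 / r                 # likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    pf_mean[t] = np.sum(w * particles)                        # filter mean
    particles = rng.choice(particles, size=N, p=w)            # resample

# Exact Kalman filter for comparison (same initial condition x_0 = 0).
kf_mean = np.zeros(T)
m, P = 0.0, 0.0
for t in range(T):
    m_pred, P_pred = a * m, a * a * P + q
    k = P_pred / (P_pred + r)
    m = m_pred + k * (y[t] - m_pred)
    P = (1 - k) * P_pred
    kf_mean[t] = m

print(np.max(np.abs(pf_mean - kf_mean)))  # small Monte Carlo error
```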
2 code implementations • 18 Oct 2016 • Christian A. Naesseth, Francisco J. R. Ruiz, Scott W. Linderman, David M. Blei
Variational inference using the reparameterization trick has enabled large-scale approximate Bayesian inference in complex probabilistic models, leveraging stochastic optimization to sidestep intractable expectations.
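The basic trick (this paper extends it to distributions simulated by acceptance-rejection sampling; the sketch below only shows the standard Gaussian case, with an assumed test function): write z = mu + sigma * eps with eps ~ N(0, 1), so gradients of E_q[f(z)] pass through the deterministic map and can be estimated from samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# q(z; mu, sigma) = N(mu, sigma^2), f(z) = z^2, so E_q[f] = mu^2 + sigma^2.
# Reparameterize z = mu + sigma * eps and differentiate through the map:
#   d/dmu    E[z^2] = E[2 z]         (true value 2 * mu)
#   d/dsigma E[z^2] = E[2 z * eps]   (true value 2 * sigma)
mu, sigma = 1.5, 0.7
eps = rng.standard_normal(1_000_000)
z = mu + sigma * eps

grad_mu = np.mean(2 * z)           # ~ 2 * mu = 3.0
grad_sigma = np.mean(2 * z * eps)  # ~ 2 * sigma = 1.4
print(grad_mu, grad_sigma)
```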
1 code implementation • 16 Feb 2016 • Tom Rainforth, Christian A. Naesseth, Fredrik Lindsten, Brooks Paige, Jan-Willem van de Meent, Arnaud Doucet, Frank Wood
We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers.
no code implementations • 20 Mar 2015 • Thomas B. Schön, Fredrik Lindsten, Johan Dahlin, Johan Wågberg, Christian A. Naesseth, Andreas Svensson, Liang Dai
One of the key challenges in identifying nonlinear and possibly non-Gaussian state space models (SSMs) is the intractability of estimating the system state.
1 code implementation • 9 Feb 2015 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
Nested Sequential Monte Carlo (NSMC) generalises the SMC framework by requiring only approximate, properly weighted, samples from the SMC proposal distribution, while still resulting in a correct SMC algorithm.
3 code implementations • 19 Jun 2014 • Fredrik Lindsten, Adam M. Johansen, Christian A. Naesseth, Bonnie Kirkpatrick, Thomas B. Schön, John Aston, Alexandre Bouchard-Côté
We propose a novel class of Sequential Monte Carlo (SMC) algorithms, appropriate for inference in probabilistic graphical models.
no code implementations • NeurIPS 2014 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön
We propose a new framework for how to use sequential Monte Carlo (SMC) algorithms for inference in probabilistic graphical models (PGMs).