no code implementations • NeurIPS 2023 • Alberto Caron, Xitong Liang, Samuel Livingstone, Jim Griffin
In this paper, we introduce a novel MCMC sampler, PARNI-DAG, for a fully Bayesian approach to the problem of structure learning from observational data.
no code implementations • 4 Jan 2022 • Jure Vogrinc, Samuel Livingstone, Giacomo Zanella
We derive the optimal choice of noise distribution for the Barker proposal, the optimal balancing function under a Gaussian noise distribution, and the optimal first-order locally-balanced algorithm among the entire class; the last of these turns out to depend on the specific target distribution.
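To make the object of study concrete, here is a minimal sketch of one Barker proposal step, the first-order locally-balanced scheme analysed in the paper. Function and parameter names are illustrative, not the paper's notation, and the Metropolis-Hastings accept/reject correction is omitted for brevity.

```python
import numpy as np

def barker_step(x, grad_log_pi, sigma=1.0, rng=None):
    """One (uncorrected) Barker proposal: draw a symmetric noise
    increment, then flip each coordinate's sign with a probability
    that skews the move toward higher target density."""
    rng = np.random.default_rng() if rng is None else rng
    g = grad_log_pi(x)
    z = sigma * rng.standard_normal(x.shape)   # symmetric noise increment
    # P(keep sign of z_i) = 1 / (1 + exp(-z_i * g_i)): the Barker
    # balancing function t / (1 + t) applied to the local gradient.
    p = 1.0 / (1.0 + np.exp(-z * g))
    b = np.where(rng.random(x.shape) < p, 1.0, -1.0)
    return x + b * z
```

The paper's question is, in effect, which noise distribution for `z` and which balancing function in place of `t / (1 + t)` are optimal within this class.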
no code implementations • 17 Dec 2020 • Max Hird, Samuel Livingstone, Giacomo Zanella
We provide a full derivation of the method from first principles, placing it within a wider class of continuous-time Markov jump processes.
no code implementations • 29 Jan 2016 • Samuel Livingstone, Michael Betancourt, Simon Byrne, Mark Girolami
We establish general conditions under which Markov chains produced by the Hamiltonian Monte Carlo method will and will not be geometrically ergodic.
2 code implementations • NeurIPS 2015 • Heiko Strathmann, Dino Sejdinovic, Samuel Livingstone, Zoltan Szabo, Arthur Gretton
We propose Kernel Hamiltonian Monte Carlo (KMC), a gradient-free adaptive MCMC algorithm based on Hamiltonian Monte Carlo (HMC).
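The core idea of KMC is to replace the unavailable gradient of the log target with a surrogate gradient learned from the chain's past samples, and then run HMC with that surrogate. As a toy illustration of this idea, the sketch below uses the analytic gradient of a Gaussian kernel density estimate as the surrogate; this is an assumption made for brevity and stands in for the paper's kernel exponential family estimator.

```python
import numpy as np

def kde_grad_log_pi(x, samples, h=0.5):
    """Surrogate score at x: gradient of the log of a Gaussian KDE
    built from past chain samples (a simplified stand-in for KMC's
    kernel exponential family fit)."""
    diffs = samples - x                               # (n, d)
    w = np.exp(-0.5 * np.sum(diffs**2, axis=1) / h**2)
    w /= w.sum()                                      # softmax weights
    # grad log q(x) = sum_i w_i * (s_i - x) / h^2 for a Gaussian KDE
    return (w[:, None] * diffs).sum(axis=0) / h**2
```

In KMC this surrogate gradient is plugged into the leapfrog integrator in place of the true gradient, with the exact (gradient-free) target density still used in the Metropolis accept/reject step, so the sampler remains exact.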