1 code implementation • 9 May 2024 • Gerardo Duran-Martin, Matias Altamirano, Alexander Y. Shestopaloff, Leandro Sánchez-Betancourt, Jeremias Knoblauch, Matt Jones, François-Xavier Briol, Kevin Murphy
We derive a novel, provably robust, and closed-form Bayesian update rule for online filtering in state-space models in the presence of outliers and misspecified measurement models.
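For contrast, the classical closed-form update that this line of work robustifies is the Kalman measurement step, where a single outlying observation can pull the posterior mean arbitrarily far. A minimal sketch of that standard (non-robust) rule, not the paper's method:

```python
import numpy as np

def kalman_update(mu, Sigma, y, H, R):
    """Standard Kalman measurement update for a linear-Gaussian
    state-space model. Not robust: the mean shifts linearly in the
    innovation (y - H mu), so one outlier can dominate."""
    S = H @ Sigma @ H.T + R                  # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)       # Kalman gain
    mu_new = mu + K @ (y - H @ mu)           # posterior mean
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma  # posterior covariance
    return mu_new, Sigma_new
```

With scalar prior N(0, 1), unit observation noise, and observation y = 2, the gain is 0.5, so the posterior is N(1, 0.5).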
1 code implementation • 1 Nov 2023 • Matias Altamirano, François-Xavier Briol, Jeremias Knoblauch
To enable closed form conditioning, a common assumption in Gaussian process (GP) regression is independent and identically distributed Gaussian observation noise.
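The closed-form conditioning that this iid-Gaussian-noise assumption buys can be sketched as the textbook GP posterior (illustrative only, using a hypothetical `rbf` kernel helper; the paper's contribution is what to do when this noise assumption fails):

```python
import numpy as np

def rbf(A, B, lengthscale=1.0):
    """Squared-exponential kernel on 1-D inputs (illustrative helper)."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, kernel, noise_var):
    """Closed-form GP posterior at test points Xs. Only available in
    closed form because the noise is assumed iid Gaussian with
    variance noise_var."""
    K = kernel(X, X) + noise_var * np.eye(len(X))
    Ks = kernel(X, Xs)
    L = np.linalg.cholesky(K)                # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha                      # posterior mean
    v = np.linalg.solve(L, Ks)
    cov = kernel(Xs, Xs) - v.T @ v           # posterior covariance
    return mean, cov
```

With near-zero noise the posterior interpolates the training data, which is exactly why a single corrupted observation is so damaging under this model.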
no code implementations • NeurIPS 2023 • Veit David Wild, Sahra Ghalebikesabi, Dino Sejdinovic, Jeremias Knoblauch
We establish the first mathematically rigorous link between Bayesian, variational Bayesian, and ensemble methods.
1 code implementation • 9 Feb 2023 • Matias Altamirano, François-Xavier Briol, Jeremias Knoblauch
This paper proposes an online, provably robust, and scalable Bayesian approach for changepoint detection.
1 code implementation • 16 Jun 2022 • Takuo Matsubara, Jeremias Knoblauch, François-Xavier Briol, Chris J. Oates
Discrete state spaces represent a major computational challenge to statistical inference, since the computation of normalisation constants requires summation over large or possibly infinite sets, which can be impractical.
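The scale of the problem is easy to state: over n binary variables the normalising constant is a sum with 2**n terms. A brute-force sketch (illustrative of the bottleneck, not of the paper's Stein-discrepancy workaround):

```python
import itertools
import math

def log_normaliser(unnorm_logp, n):
    """Brute-force log normalising constant of an unnormalised
    log-density over {0,1}^n. The sum has 2**n terms, so this is
    impractical beyond small n."""
    logs = [unnorm_logp(x) for x in itertools.product([0, 1], repeat=n)]
    m = max(logs)                            # log-sum-exp for stability
    return m + math.log(sum(math.exp(l - m) for l in logs))
```

For a uniform unnormalised density this returns n·log 2; the point is that the loop length, not the arithmetic, is the obstacle.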
1 code implementation • 9 Feb 2022 • Charita Dellaporta, Jeremias Knoblauch, Theodoros Damoulas, François-Xavier Briol
Simulator-based models are models for which the likelihood is intractable but simulation of synthetic data is possible.
no code implementations • 22 Jan 2022 • Joel Jaskari, Jaakko Sahlsten, Theodoros Damoulas, Jeremias Knoblauch, Simo Särkkä, Leo Kärkkäinen, Kustaa Hietala, Kimmo Kaski
Automatic classification of diabetic retinopathy from retinal images has been widely studied using deep neural networks with impressive results.
1 code implementation • 15 Apr 2021 • Takuo Matsubara, Jeremias Knoblauch, François-Xavier Briol, Chris J. Oates
Generalised Bayesian inference updates prior beliefs using a loss function, rather than a likelihood, and can therefore be used to confer robustness against possible mis-specification of the likelihood.
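The generalised update can be sketched on a parameter grid: the negative log-likelihood is replaced by an arbitrary loss, giving a Gibbs-style posterior proportional to prior(θ)·exp(−β Σᵢ loss(θ, xᵢ)). This is a generic illustration of the loss-based update, not the paper's kernel-Stein-discrepancy posterior:

```python
import numpy as np

def gibbs_posterior(prior, loss, data, thetas, beta=1.0):
    """Generalised Bayesian update evaluated on a grid of parameter
    values: posterior ∝ prior * exp(-beta * total loss)."""
    total_loss = np.array([sum(loss(t, x) for x in data) for t in thetas])
    log_post = np.log(prior(thetas)) - beta * total_loss
    log_post -= log_post.max()               # normalise stably
    w = np.exp(log_post)
    return w / w.sum()
```

With a flat prior and squared-error loss, the posterior mode sits at the data mean, recovering the familiar least-squares answer as a special case.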
1 code implementation • AABI Symposium 2021 • Sebastian M Schmon, Patrick W Cannon, Jeremias Knoblauch
Approximate Bayesian computation (ABC) has emerged as a key method in simulation-based inference, wherein the true model likelihood and posterior are approximated using samples from the simulator.
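The baseline ABC scheme being approximated can be sketched as rejection sampling: keep parameter draws whose simulated summary statistic lands within a tolerance of the observed one. This illustrates standard rejection ABC, not the paper's calibration procedure:

```python
import random

def rejection_abc(prior_sample, simulate, summary, observed, eps, n_draws):
    """Minimal rejection ABC: accept theta when the simulated summary
    is within eps of the observed summary. Seeded RNG for
    reproducibility."""
    rng = random.Random(0)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if abs(summary(simulate(theta, rng)) - summary(observed)) <= eps:
            accepted.append(theta)
    return accepted
```

For a Bernoulli simulator with a uniform prior on the success probability and the sample mean as summary, accepted draws concentrate around the observed frequency as eps shrinks.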
1 code implementation • 3 Nov 2020 • Juan Maroñas, Oliver Hamelijnck, Jeremias Knoblauch, Theodoros Damoulas
Gaussian Processes (GPs) can be used as flexible, non-parametric function priors.
no code implementations • 26 Oct 2020 • Jeremias Knoblauch, Lara Vomfell
Models of discrete-valued outcomes are easily misspecified if the data exhibit zero-inflation, overdispersion or contamination.
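Zero-inflation, the first failure mode named above, can be made concrete with a zero-inflated Poisson: a structural zero with probability π, otherwise a Poisson draw. This sketches the misspecification problem itself, not the paper's divergence-based remedy:

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson pmf: with probability pi emit a
    structural zero, otherwise draw from Poisson(lam). A plain
    Poisson fit to such data underestimates the zero mass."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi + (1 - pi) * poisson
    return (1 - pi) * poisson
```

At λ = 2 and π = 0.3 the mass at zero is roughly 0.39, nearly three times the Poisson value of e⁻² ≈ 0.135, which is why a maximum-likelihood Poisson fit is pulled badly off target.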
no code implementations • ICML 2020 • Jeremias Knoblauch, Hisham Husain, Tom Diethe
Continual Learning (CL) algorithms incrementally learn a predictor or representation across multiple sequentially observed tasks.
no code implementations • 10 Dec 2019 • Jeremias Knoblauch
This paper investigates Frequentist consistency properties of the posterior distributions constructed via Generalized Variational Inference (GVI).
no code implementations • 4 Apr 2019 • Jeremias Knoblauch
This report provides an in-depth overview of the implications and novelty that Generalized Variational Inference (GVI) (Knoblauch et al., 2019) brings to Deep Gaussian Processes (DGPs) (Damianou & Lawrence, 2013).
1 code implementation • 3 Apr 2019 • Jeremias Knoblauch, Jack Jewson, Theodoros Damoulas
We advocate an optimization-centric view of Bayesian inference and introduce a novel generalization of it.
1 code implementation • NeurIPS 2018 • Jeremias Knoblauch, Jack Jewson, Theodoros Damoulas
The resulting inference procedure is doubly robust for both the parameter and the changepoint (CP) posterior, with linear time and constant space complexity.
1 code implementation • ICML 2018 • Jeremias Knoblauch, Theodoros Damoulas
Bayesian On-line Changepoint Detection is extended to on-line model selection and non-stationary spatio-temporal processes.
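The core recursion being extended is the run-length filter of Bayesian online changepoint detection (Adams & MacKay, 2007): at each observation, every run either grows by one or resets to zero with the hazard probability. A sketch with a caller-supplied predictive (the paper's extensions to model selection and spatio-temporal processes are not shown):

```python
import numpy as np

def bocd_run_lengths(pred_prob, data, hazard=0.1):
    """Run-length posterior recursion. pred_prob(r, x) is the
    predictive probability of observation x given run length r."""
    R = np.array([1.0])                      # p(r_0 = 0) = 1
    for x in data:
        pred = np.array([pred_prob(r, x) for r in range(len(R))])
        growth = R * pred * (1 - hazard)     # run continues: r -> r + 1
        cp = (R * pred * hazard).sum()       # changepoint: r resets to 0
        R = np.concatenate([[cp], growth])
        R /= R.sum()                         # normalise
    return R
```

With a constant predictive the changepoint mass settles at exactly the hazard rate each step, a useful sanity check before plugging in a real predictive model.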