no code implementations • 23 Apr 2024 • Thomas A. Archbold, Ieva Kazlauskaite, Fehmi Cirak
The assumed prior probability density of the surrogate is a Gaussian process.
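A Gaussian-process prior over a surrogate can be sketched in a few lines of NumPy; the squared-exponential kernel and unit hyperparameters below are illustrative assumptions, not the paper's actual modelling choices.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    sqdist = (x[:, None] - y[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)

# Prior over surrogate values: zero mean, RBF covariance (jitter for stability).
K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))
prior_samples = rng.multivariate_normal(np.zeros(len(x)), K, size=3)
```

Each row of `prior_samples` is one plausible surrogate function under the prior, before any data are observed.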
no code implementations • 26 Jan 2023 • Arnaud Vadeboncoeur, Ieva Kazlauskaite, Yanni Papandreou, Fehmi Cirak, Mark Girolami, Ömer Deniz Akyildiz
We introduce a new class of spatially stochastic physics and data informed deep latent models for parametric partial differential equations (PDEs) which operate through scalable variational neural processes.
no code implementations • 26 Nov 2022 • Mala Virdee, Markus Kaiser, Emily Shuckburgh, Carl Henrik Ek, Ieva Kazlauskaite
Adaptation-relevant predictions of climate change are often derived by combining climate model simulations in a multi-model ensemble.
1 code implementation • 29 Oct 2022 • Aditya Ravuri, Tom R. Andersson, Ieva Kazlauskaite, Will Tebbutt, Richard E. Turner, J. Scott Hosking, Neil D. Lawrence, Markus Kaiser
Ice cores record crucial information about past climate.
no code implementations • 9 Aug 2022 • Arnaud Vadeboncoeur, Ömer Deniz Akyildiz, Ieva Kazlauskaite, Mark Girolami, Fehmi Cirak
In the posited probabilistic model, both the forward and inverse maps are approximated as Gaussian distributions with a mean and covariance parameterized by deep neural networks.
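Parameterising a Gaussian's mean and covariance with a neural network can be sketched as follows; the tiny one-hidden-layer network and diagonal covariance are simplifying assumptions for illustration, not the architecture used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-hidden-layer network with two heads: a mean and a
# log-variance head (so the covariance stays positive by construction).
W1, b1 = rng.normal(size=(8, 2)), np.zeros(8)
W_mu, W_logvar = rng.normal(size=(3, 8)), rng.normal(size=(3, 8))

def gaussian_map(z):
    """Map an input z to the mean and diagonal covariance of a Gaussian."""
    h = np.tanh(W1 @ z + b1)
    mu = W_mu @ h
    var = np.exp(W_logvar @ h)   # diagonal covariance entries
    return mu, var

mu, var = gaussian_map(np.array([0.3, -0.7]))
sample = mu + np.sqrt(var) * rng.normal(size=mu.shape)  # reparameterised draw
```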
no code implementations • 29 Oct 2021 • Olga Mikheeva, Ieva Kazlauskaite, Adam Hartshorne, Hedvig Kjellström, Carl Henrik Ek, Neill D. F. Campbell
Building on previous work by Kazlauskaite et al. [2019], we include a separate monotonic warp of the input data to model temporal misalignment.
no code implementations • 27 Jan 2020 • Olga Mikheeva, Ieva Kazlauskaite, Hedvig Kjellström, Carl Henrik Ek
In this paper, we introduce a method for segmenting time series data using tools from Bayesian nonparametrics.
1 code implementation • 17 Sep 2019 • Ivan Ustyuzhaninov, Ieva Kazlauskaite, Markus Kaiser, Erik Bodin, Neill D. F. Campbell, Carl Henrik Ek
Similarly, deep Gaussian processes (DGPs) should allow us to compute a posterior distribution of compositions of multiple functions giving rise to the observations.
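The composition of function draws that defines a deep GP can be sketched by feeding one GP sample through another; the two-layer depth and shared RBF kernel below are illustrative assumptions.

```python
import numpy as np

def rbf(x, lengthscale=0.5):
    """RBF covariance of a 1-D input set, with jitter for stability."""
    sqdist = (x[:, None] - x[None, :]) ** 2
    return np.exp(-0.5 * sqdist / lengthscale ** 2) + 1e-8 * np.eye(len(x))

rng = np.random.default_rng(2)
x = np.linspace(-1.0, 1.0, 40)

# Layer 1: draw f1 ~ GP(0, k) at the inputs x.
f1 = rng.multivariate_normal(np.zeros(len(x)), rbf(x))
# Layer 2: draw f2 ~ GP(0, k) evaluated at f1's outputs, giving f2(f1(x)).
f2 = rng.multivariate_normal(np.zeros(len(x)), rbf(f1))
```

A deep GP posterior is then a distribution over such compositions, conditioned on observing `f2` values at `x`.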
no code implementations • ICML 2020 • Erik Bodin, Markus Kaiser, Ieva Kazlauskaite, Zhenwen Dai, Neill D. F. Campbell, Carl Henrik Ek
Bayesian optimization (BO) methods often rely on the assumption that the objective function is well-behaved, but in practice, this is seldom true for real-world objectives even if noise-free observations can be collected.
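A standard BO acquisition built on this well-behaved-objective assumption is expected improvement; the closed form below is the usual textbook expression for a Gaussian posterior, not an acquisition specific to this paper.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best, xi=0.01):
    """Closed-form EI for minimisation under a Gaussian posterior N(mu, sigma^2)."""
    sigma = np.maximum(sigma, 1e-12)
    z = (best - mu - xi) / sigma
    return (best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Posterior mean/std at three candidate points; current best observation 0.5.
ei = expected_improvement(np.array([0.4, 0.6, 1.0]),
                          np.array([0.2, 0.2, 0.2]), best=0.5)
```

Candidates with lower posterior mean (at equal uncertainty) receive higher EI, which is exactly where a badly behaved objective can mislead the search.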
1 code implementation • 30 May 2019 • Ivan Ustyuzhaninov, Ieva Kazlauskaite, Carl Henrik Ek, Neill D. F. Campbell
We propose a new framework for imposing monotonicity constraints in a Bayesian nonparametric setting based on numerical solutions of stochastic differential equations.
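One reason SDE solutions can encode monotonicity: in one dimension, two trajectories driven by the same noise cannot cross, so the flow map from initial condition to terminal state preserves ordering. A minimal Euler discretisation shows the effect; the drift, step size, and shared-noise choice here are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(3)

x = np.sort(rng.uniform(-2.0, 2.0, size=20))  # ordered initial conditions
dt, n_steps, sigma = 0.01, 200, 0.3

for _ in range(n_steps):
    drift = -x + np.sin(x)           # any Lipschitz drift works here
    dW = np.sqrt(dt) * rng.normal()  # one shared Brownian increment per step
    x = x + drift * dt + sigma * dW

# The flow map x0 -> x(T) is monotone: the initial ordering is preserved.
```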
no code implementations • 26 Nov 2018 • Ieva Kazlauskaite, Ivan Ustyuzhaninov, Carl Henrik Ek, Neill D. F. Campbell
We present a probabilistic model for unsupervised alignment of high-dimensional time-warped sequences based on the Dirichlet Process Mixture Model (DPMM).
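The DPMM's mixture weights can be sketched with the standard stick-breaking construction of a Dirichlet process; the concentration parameter and truncation level below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)

def stick_breaking(alpha, n_sticks):
    """Truncated stick-breaking weights of a Dirichlet process.

    Each weight is the Beta(1, alpha) fraction broken off whatever
    remains of a unit-length stick after the previous breaks.
    """
    betas = rng.beta(1.0, alpha, size=n_sticks)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

weights = stick_breaking(alpha=2.0, n_sticks=50)
```

In a DPMM each weight governs one mixture component, so the number of effectively active clusters is inferred rather than fixed in advance.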
1 code implementation • 7 Mar 2018 • Ieva Kazlauskaite, Carl Henrik Ek, Neill D. F. Campbell
We present a model that can automatically learn alignments between high-dimensional data in an unsupervised manner.