no code implementations • 26 Mar 2024 • Jodie A. Cochrane, Adrian Wills, Sarah J. Johnson
A challenge for existing MCMC approaches is proposing joint changes in both the tree structure and the decision parameters that result in efficient sampling.
no code implementations • 22 Feb 2021 • Filip de Roos, Carl Jidling, Adrian Wills, Thomas Schön, Philipp Hennig
Machine learning practitioners invest significant manual and computational resources in finding suitable learning rates for optimization algorithms.
no code implementations • 14 Dec 2020 • Jarrad Courts, Johannes Hendriks, Adrian Wills, Thomas Schön, Brett Ninness
In this work, a variational approach is used to provide an assumed density that approximates the desired but intractable distribution.
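A minimal sketch of the assumed-density idea (not the authors' smoother; the target density, sample size, and step size here are arbitrary choices): fit a Gaussian N(m, s²) to an intractable unnormalised target by stochastic gradient ascent on the evidence lower bound, using the reparameterisation x = m + s·ε:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target_grad(x):
    # unnormalised target: log p(x) = -x**4 / 4 (non-Gaussian), gradient -x**3
    return -x**3

m, log_s = 1.0, 0.0   # variational parameters (mean, log std)
lr = 0.05
for t in range(2000):
    s = np.exp(log_s)
    eps = rng.standard_normal(64)
    x = m + s * eps                           # reparameterised samples
    g = log_target_grad(x)
    grad_m = g.mean()                         # d ELBO / d m
    grad_log_s = (g * eps).mean() * s + 1.0   # d ELBO / d log s (entropy term = +1)
    m += lr * grad_m
    log_s += lr * grad_log_s
```

For this symmetric target the fitted mean drifts to zero and the standard deviation settles near (1/3)^(1/4) ≈ 0.76, the maximiser of the exact ELBO.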
no code implementations • 8 Dec 2020 • Jarrad Courts, Adrian Wills, Thomas Schön, Brett Ninness
This paper considers parameter estimation for nonlinear state-space models, which is an important but challenging problem.
no code implementations • 7 Feb 2020 • Jarrad Courts, Adrian Wills, Thomas B. Schön
This paper considers state estimation for nonlinear state-space models, in the context of both filtering and smoothing.
1 code implementation • 5 Feb 2020 • Johannes Hendriks, Carl Jidling, Adrian Wills, Thomas Schön
We present a novel approach to modelling and learning vector fields from physical systems using neural networks that explicitly satisfy known linear operator constraints.
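One standard way to build such constrained fields (a sketch of the general idea, not the paper's architecture; the network size and points are arbitrary) is to parameterise the field as a differential operator applied to a scalar network: F = (∂g/∂y, −∂g/∂x) is divergence-free by construction, since the mixed partials of g cancel:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((16, 2))   # hidden weights of a small random MLP
b = rng.standard_normal(16)
v = rng.standard_normal(16)

def grad_g(p):
    """Exact gradient of the scalar network g(p) = v . tanh(W p + b)."""
    h = np.tanh(W @ p + b)
    return W.T @ (v * (1.0 - h**2))

def field(p):
    """Divergence-free field F = (dg/dy, -dg/dx) by construction."""
    gx, gy = grad_g(p)
    return np.array([gy, -gx])

def divergence(p, h=1e-4):
    """Central-difference check of div F = dF1/dx + dF2/dy."""
    dF1dx = (field(p + [h, 0])[0] - field(p - [h, 0])[0]) / (2 * h)
    dF2dy = (field(p + [0, h])[1] - field(p - [0, h])[1]) / (2 * h)
    return dF1dx + dF2dy
```

Analytically div F = ∂²g/∂x∂y − ∂²g/∂y∂x = 0 everywhere, so the numerical check returns only O(h²) truncation error regardless of the network weights.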
no code implementations • 18 Oct 2019 • Nicholas O'Dell, Christopher Renton, Adrian Wills
The method is attractive as it requires only a small number of hyperparameters to be trained, and is computationally efficient.
no code implementations • 4 Sep 2019 • Carl Jidling, Johannes Hendriks, Thomas B. Schön, Adrian Wills
Deep kernel learning refers to a Gaussian process that incorporates neural networks to improve the modelling of complex functions.
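The core construction can be sketched in a few lines (assumptions: a fixed random feature map stands in for the trained network, and the data, lengthscale, and noise level are arbitrary; in deep kernel learning the network weights would be optimised jointly with the GP):

```python
import numpy as np

rng = np.random.default_rng(2)
W1 = rng.standard_normal((8, 1))
b1 = rng.standard_normal(8)

def features(x):
    # stand-in for a trained network g(x)
    return np.tanh(x @ W1.T + b1)              # (n, 8)

def deep_rbf(X, Z, lengthscale=1.0):
    """RBF kernel on network features: k(x,z) = exp(-||g(x)-g(z)||^2 / 2 l^2)."""
    F, G = features(X), features(Z)
    d2 = ((F[:, None, :] - G[None, :, :])**2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# GP regression with the composed kernel on toy 1-D data
X = np.linspace(-3, 3, 20)[:, None]
y = np.sin(X[:, 0])
K = deep_rbf(X, X) + 1e-6 * np.eye(20)
alpha = np.linalg.solve(K + 0.01 * np.eye(20), y)
Xs = np.array([[0.5]])
mean = deep_rbf(Xs, X) @ alpha
```

Because the RBF kernel is applied to the feature outputs, the composed function is still a valid positive semi-definite kernel for any choice of network weights.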
no code implementations • 3 Sep 2019 • Adrian Wills, Thomas Schön
In this paper we present a novel quasi-Newton algorithm for use in stochastic optimisation.
no code implementations • ICLR 2019 • Adrian Wills, Carl Jidling, Thomas Schön
In recent years there has been increased interest in stochastic adaptations of limited-memory quasi-Newton methods, which can improve convergence over pure gradient-based routines by incorporating second-order information.
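The basic machinery these methods adapt is the L-BFGS two-loop recursion applied to mini-batch gradients. A minimal sketch (not any of the algorithms proposed in these entries; the least-squares problem, batch size, memory, and step size are all illustrative, and the curvature pair is formed from the same batch so it reflects genuine curvature):

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 500, 10
A = rng.standard_normal((n, d))
xstar = rng.standard_normal(d)
b = A @ xstar                               # consistent least-squares problem

def batch_grad(x, idx):
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / len(idx)

def two_loop(grad, S, Y):
    """Standard L-BFGS two-loop recursion: approximate H^{-1} grad."""
    q = grad.copy()
    alphas = []
    for s, y in reversed(list(zip(S, Y))):  # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((a, rho))
    if S:
        s, y = S[-1], Y[-1]
        q *= (s @ y) / (y @ y)              # initial inverse-Hessian scaling
    for (a, rho), (s, y) in zip(reversed(alphas), zip(S, Y)):
        beta = rho * (y @ q)
        q += (a - beta) * s
    return q

x = np.zeros(d)
S, Y, mem, lr = [], [], 5, 0.2
for t in range(200):
    idx = rng.choice(n, 64, replace=False)
    g = batch_grad(x, idx)
    x_new = x - lr * two_loop(g, S, Y)
    # curvature pair from the SAME mini-batch
    s_vec, y_vec = x_new - x, batch_grad(x_new, idx) - g
    if s_vec @ y_vec > 1e-10:               # keep only positive curvature
        S.append(s_vec); Y.append(y_vec)
        if len(S) > mem:
            S.pop(0); Y.pop(0)
    x = x_new
```

Using the same batch for both gradients in the curvature pair, and discarding pairs with non-positive s·y, are the two standard safeguards against gradient noise corrupting the Hessian approximation.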
1 code implementation • 26 Jun 2018 • Johan Dahlin, Adrian Wills, Brett Ninness
Pseudo-marginal Metropolis-Hastings (pmMH) is a versatile algorithm for sampling from target distributions whose density cannot be evaluated point-wise but can be estimated unbiasedly.
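A toy sketch of the mechanism (not the paper's implementation; the latent-variable model, proposal width, and estimator size are arbitrary): the intractable marginal likelihood is replaced by an unbiased Monte Carlo estimate, and crucially the estimate for the current state is stored and reused rather than recomputed:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy latent-variable model: x ~ N(theta, 1), y ~ N(x, 0.5^2).
# The marginal likelihood p(y | theta) is estimated unbiasedly by Monte Carlo.
y_obs, sigma = 1.2, 0.5

def lik_estimate(theta, n_samples=50):
    x = theta + rng.standard_normal(n_samples)
    return np.mean(np.exp(-0.5 * (y_obs - x)**2 / sigma**2)
                   / np.sqrt(2 * np.pi * sigma**2))

theta, z = 0.0, lik_estimate(0.0)   # store the estimate with the state
chain = []
for t in range(5000):
    theta_prop = theta + 0.5 * rng.standard_normal()
    z_prop = lik_estimate(theta_prop)
    if rng.random() < z_prop / z:   # flat prior; accept on ratio of estimates
        theta, z = theta_prop, z_prop
    chain.append(theta)
```

Despite the noisy likelihood estimates, the chain targets the exact posterior, here N(1.2, 1.25) under the flat prior, because the estimator is unbiased.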
no code implementations • 12 Feb 2018 • Adrian Wills, Thomas Schön
We provide a numerically robust and fast method capable of exploiting the local geometry when solving large-scale stochastic optimisation problems.
1 code implementation • 4 Jan 2018 • Johan Dahlin, Adrian Wills, Brett Ninness
Computing Bayesian estimates of system parameters, and of functions of them, from observed system performance data is a common problem in system identification.
no code implementations • NeurIPS 2017 • Carl Jidling, Niklas Wahlström, Adrian Wills, Thomas B. Schön
We consider a modification of the covariance function in Gaussian processes to correctly account for known linear constraints.
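As a concrete instance of this construction (an illustrative sketch, not the paper's general derivation; the lengthscale and points are arbitrary): if a 2-D field is the gradient of a scalar GP with an RBF kernel, its matrix-valued covariance follows by differentiating the scalar kernel, and every sample from the resulting GP is curl-free by construction:

```python
import numpy as np

def rbf(r2, ell=1.0):
    return np.exp(-0.5 * r2 / ell**2)

def curl_free_kernel(X, Z, ell=1.0):
    """Matrix-valued kernel for F = grad(phi), phi ~ GP with RBF kernel k:
    Cov(F_i(x), F_j(z)) = d^2 k / dx_i dz_j
                        = (delta_ij / ell^2 - r_i r_j / ell^4) k(x, z),
    with r = x - z. Samples from this GP are curl-free fields."""
    n, m = len(X), len(Z)
    K = np.zeros((2 * n, 2 * m))
    for a in range(n):
        for c in range(m):
            r = X[a] - Z[c]
            k = rbf(r @ r, ell)
            block = (np.eye(2) / ell**2 - np.outer(r, r) / ell**4) * k
            K[2*a:2*a+2, 2*c:2*c+2] = block
    return K

X = np.random.default_rng(5).uniform(-1, 1, (15, 2))
K = curl_free_kernel(X, X)
```

Because the constraint is encoded in the covariance itself, it holds for every sample and every posterior mean, not just approximately at the training points.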
1 code implementation • 12 Feb 2015 • Manon Kok, Johan Dahlin, Thomas B. Schön, Adrian Wills
Maximum likelihood (ML) estimation using Newton's method in nonlinear state space models (SSMs) is a challenging problem due to the analytical intractability of the log-likelihood and its gradient and Hessian.
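To see the pieces in a setting where nothing is intractable (the paper's contribution is handling the nonlinear case, where these quantities must be estimated; the scalar linear-Gaussian model, noise levels, and starting point below are illustrative), the Kalman filter gives the exact log-likelihood and a safeguarded Newton iteration maximises it:

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulate y_t from x_{t+1} = a x_t + w_t, y_t = x_t + v_t (a_true = 0.8)
T, a_true, q, r = 300, 0.8, 0.1, 0.1
x = 0.0
ys = []
for _ in range(T):
    ys.append(x + np.sqrt(r) * rng.standard_normal())
    x = a_true * x + np.sqrt(q) * rng.standard_normal()
ys = np.array(ys)

def loglik(a):
    """Exact log-likelihood via the Kalman filter."""
    m, P, ll = 0.0, 1.0, 0.0
    for y in ys:
        S = P + r                            # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (y - m)**2 / S)
        K = P / S                            # Kalman gain
        m, P = m + K * (y - m), (1 - K) * P  # measurement update
        m, P = a * m, a**2 * P + q           # time update
    return ll

# Safeguarded Newton iteration with finite-difference derivatives
a, h = 0.5, 1e-4
for _ in range(20):
    g = (loglik(a + h) - loglik(a - h)) / (2 * h)
    H = (loglik(a + h) - 2 * loglik(a) + loglik(a - h)) / h**2
    step = -g / H if H < 0 else 0.1 * np.sign(g)
    while loglik(a + step) < loglik(a):      # halve until improvement
        step *= 0.5
        if abs(step) < 1e-12:
            break
    a += step
```

In the nonlinear case the filter and the derivatives are no longer available in closed form, which is precisely where particle-based estimates of the gradient and Hessian enter.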