Search Results for author: Mark van der Wilk

Found 55 papers, 25 papers with code

Recommendations for Baselines and Benchmarking Approximate Gaussian Processes

no code implementations15 Feb 2024 Sebastian W. Ober, Artem Artemev, Marcel Wagenländer, Rudolfs Grobins, Mark van der Wilk

To address this, we make recommendations for comparing GP approximations based on a specification of what a user should expect from a method.

Benchmarking Gaussian Processes

Turbulence: Systematically and Automatically Testing Instruction-Tuned Large Language Models for Code

1 code implementation22 Dec 2023 Shahin Honarvar, Mark van der Wilk, Alastair Donaldson

Thus, from a single question template, it is possible to ask an LLM a $\textit{neighbourhood}$ of very similar programming questions, and assess the correctness of the result returned for each question.

Code Generation
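The snippet above describes expanding a single question template into a neighbourhood of similar programming questions. A minimal sketch of the idea, using a hypothetical template of my own (not one from the Turbulence benchmark):

```python
# Hypothetical illustration: one parameterised question template expands
# into a "neighbourhood" of very similar programming questions, each of
# which could then be posed to an LLM and checked for correctness.
from string import Template

template = Template(
    "Write a Python function that returns the $stat of a list of integers."
)

# Instantiating the template with different parameters yields the neighbourhood.
neighbourhood = [template.substitute(stat=s) for s in ("sum", "maximum", "minimum")]

for question in neighbourhood:
    print(question)
```

Assessing the answers across the whole neighbourhood, rather than a single question, is what lets this style of benchmark probe robustness instead of memorisation.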

Practical Path-based Bayesian Optimization

no code implementations1 Dec 2023 Jose Pablo Folch, James Odgers, Shiqiang Zhang, Robert M Lee, Behrang Shafei, David Walz, Calvin Tsay, Mark van der Wilk, Ruth Misener

There has been a surge in interest in data-driven experimental design with applications to chemical engineering and drug manufacturing.

Bayesian Optimization Experimental Design

Current Methods for Drug Property Prediction in the Real World

no code implementations25 Jul 2023 Jacob Green, Cecilia Cabrera Diaz, Maximilian A. H. Jakobs, Andrea Dimitracopoulos, Mark van der Wilk, Ryan D. Greenhalgh

However, it remains unclear for practitioners which method or approach is most suitable, as different papers benchmark on different datasets and methods, leading to varying conclusions that are not easily compared.

Decision Making Drug Discovery +3

Stochastic Marginal Likelihood Gradients using Neural Tangent Kernels

1 code implementation6 Jun 2023 Alexander Immer, Tycho F. A. van der Ouderaa, Mark van der Wilk, Gunnar Rätsch, Bernhard Schölkopf

Recent works show that Bayesian model selection with Laplace approximations can allow such hyperparameters to be optimized just like standard neural network parameters, using gradients and only the training data.

Hyperparameter Optimization Model Selection

Causal Discovery using Bayesian Model Selection

no code implementations5 Jun 2023 Anish Dhir, Mark van der Wilk

With only observational data on two variables, and without other assumptions, it is not possible to infer which one causes the other.

Causal Discovery Model Selection

Actually Sparse Variational Gaussian Processes

1 code implementation11 Apr 2023 Harry Jake Cunningham, Daniel Augusto de Souza, So Takao, Mark van der Wilk, Marc Peter Deisenroth

For large datasets, sparse GPs reduce these demands by conditioning on a small set of inducing variables designed to summarise the data.

Gaussian Processes
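The snippet above describes conditioning on a small set of inducing variables to summarise the data. A minimal NumPy sketch of the underlying low-rank (Nyström) approximation $Q_{ff} = K_{fu} K_{uu}^{-1} K_{uf}$, with illustrative inputs of my own choosing (this is the generic construction, not the paper's specific sparse variational scheme):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))      # N = 200 training inputs
Z = np.linspace(-3, 3, 10).reshape(-1, 1)  # M = 10 inducing inputs

def rbf(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between two sets of points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

Kff = rbf(X, X)  # N x N: exact methods need O(N^3) operations on this
Kuf = rbf(Z, X)  # M x N
Kuu = rbf(Z, Z)  # M x M: sparse methods only factorise this small matrix
Qff = Kuf.T @ np.linalg.solve(Kuu + 1e-8 * np.eye(len(Z)), Kuf)  # Nystrom

print("max approximation error:", np.abs(Kff - Qff).max())
```

With inducing inputs spread over the data region, the M x M matrix stands in for the full N x N kernel matrix at a small approximation error.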

Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees

1 code implementation14 Oct 2022 Alexander Terenin, David R. Burt, Artem Artemev, Seth Flaxman, Mark van der Wilk, Carl Edward Rasmussen, Hong Ge

For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions.

Bayesian Optimization Decision Making +1

Memory Safe Computations with XLA Compiler

1 code implementation28 Jun 2022 Artem Artemev, Tilman Roeder, Mark van der Wilk

We believe that further focus on removing memory constraints at a compiler level will widen the range of machine learning methods that can be developed in the future.

Relaxing Equivariance Constraints with Non-stationary Continuous Filters

no code implementations14 Apr 2022 Tycho F. A. van der Ouderaa, David W. Romero, Mark van der Wilk

Equivariances provide useful inductive biases in neural network modeling, with the translation equivariance of convolutional neural networks being a canonical example.

Image Classification

Learning Invariant Weights in Neural Networks

no code implementations25 Feb 2022 Tycho F. A. van der Ouderaa, Mark van der Wilk

Assumptions about invariances or symmetries in data can significantly increase the predictive power of statistical models.

Gaussian Processes Translation

Invariance Learning in Deep Neural Networks with Differentiable Laplace Approximations

1 code implementation22 Feb 2022 Alexander Immer, Tycho F. A. van der Ouderaa, Gunnar Rätsch, Vincent Fortuin, Mark van der Wilk

We develop a convenient gradient-based method for selecting the data augmentation without validation data during training of a deep neural network.

Data Augmentation Gaussian Processes +1

Improved Inverse-Free Variational Bounds for Sparse Gaussian Processes

no code implementations AABI Symposium 2022 Mark van der Wilk, Artem Artemev, James Hensman

The need for matrix decompositions (inverses) is often named as a major impediment to scaling Gaussian process (GP) models, even in efficient approximations.

Gaussian Processes

Matrix Inversion free variational inference in Conditional Student's T Processes

no code implementations AABI Symposium 2022 Sebastian Popescu, Ben Glocker, Mark van der Wilk

We propose a new variational lower bound for performing inference in sparse Student's T Processes that does not require computationally intensive operations such as matrix inversions or log determinants of matrices.

Variational Inference

Barely Biased Learning for Gaussian Process Regression

no code implementations NeurIPS Workshop ICBINB 2021 David R. Burt, Artem Artemev, Mark van der Wilk

We suggest a method that adaptively selects the amount of computation to use when estimating the log marginal likelihood so that the bias of the objective function is guaranteed to be small.

regression

A Bayesian Approach to Invariant Deep Neural Networks

no code implementations20 Jul 2021 Nikolaos Mourdoukoutas, Marco Federici, Georges Pantalos, Mark van der Wilk, Vincent Fortuin

We propose a novel Bayesian neural network architecture that can learn invariances from data alone by inferring a posterior distribution over different weight-sharing schemes.

Data Augmentation

Last Layer Marginal Likelihood for Invariance Learning

1 code implementation14 Jun 2021 Pola Schwöbel, Martin Jørgensen, Sebastian W. Ober, Mark van der Wilk

Computing the marginal likelihood is hard for neural networks, but the success of tractable approaches that compute the marginal likelihood for only the last layer raises the question of whether this convenient approach might be employed for learning invariances.

Data Augmentation Gaussian Processes +1

BNNpriors: A library for Bayesian neural network inference with different prior distributions

1 code implementation14 May 2021 Vincent Fortuin, Adrià Garriga-Alonso, Mark van der Wilk, Laurence Aitchison

Bayesian neural networks have shown great promise in many applications where calibrated uncertainty estimates are crucial and can often also lead to a higher predictive performance.

Deep Neural Networks as Point Estimates for Deep Gaussian Processes

no code implementations NeurIPS 2021 Vincent Dutordoir, James Hensman, Mark van der Wilk, Carl Henrik Ek, Zoubin Ghahramani, Nicolas Durrande

This results in models that can either be seen as neural networks with improved uncertainty prediction or deep Gaussian processes with increased prediction accuracy.

Bayesian Inference Gaussian Processes +1

The Promises and Pitfalls of Deep Kernel Learning

no code implementations24 Feb 2021 Sebastian W. Ober, Carl E. Rasmussen, Mark van der Wilk

Through careful experimentation on the UCI, CIFAR-10, and the UTKFace datasets, we find that the overfitting from overparameterized maximum marginal likelihood, in which the model is "somewhat Bayesian", can in certain scenarios be worse than that from not being Bayesian at all.

Gaussian Processes

Tighter Bounds on the Log Marginal Likelihood of Gaussian Process Regression Using Conjugate Gradients

no code implementations16 Feb 2021 Artem Artemev, David R. Burt, Mark van der Wilk

We propose a lower bound on the log marginal likelihood of Gaussian process regression models that can be computed without matrix factorisation of the full kernel matrix.

Gaussian Processes regression
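The bound in the entry above avoids factorising the full N x N kernel matrix by relying on conjugate gradients, which need only matrix-vector products. A minimal sketch of plain conjugate gradients for a symmetric positive-definite system (the generic algorithm, not the paper's lower bound itself; the example system is illustrative):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive-definite A using only
    matrix-vector products, i.e. without factorising A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Illustrative SPD system standing in for (K + sigma^2 I) in GP regression.
rng = np.random.default_rng(0)
B = rng.normal(size=(20, 20))
A = B @ B.T + 20 * np.eye(20)
b = rng.normal(size=20)
x = conjugate_gradient(A, b)
```

In GP regression the matrix-vector product `A @ p` can itself be computed in blocks, so the solve never requires storing or decomposing the full kernel matrix.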

Design of Experiments for Verifying Biomolecular Networks

no code implementations20 Nov 2020 Ruby Sedgwick, John Goertz, Molly Stevens, Ruth Misener, Mark van der Wilk

There is a growing trend in molecular and synthetic biology of using mechanistic (non-machine-learning) models to design biomolecular networks.

Bayesian Optimization Gaussian Processes

Understanding Variational Inference in Function-Space

2 code implementations AABI Symposium 2021 David R. Burt, Sebastian W. Ober, Adrià Garriga-Alonso, Mark van der Wilk

Then, we propose (featurized) Bayesian linear regression as a benchmark for `function-space' inference methods that directly measures approximation quality.

Bayesian Inference Variational Inference

A Bayesian Perspective on Training Speed and Model Selection

no code implementations NeurIPS 2020 Clare Lyle, Lisa Schut, Binxin Ru, Yarin Gal, Mark van der Wilk

This provides two major insights: first, that a measure of a model's training speed can be used to estimate its marginal likelihood.

Model Selection

Revisiting the Train Loss: an Efficient Performance Estimator for Neural Architecture Search

no code implementations28 Sep 2020 Binxin Ru, Clare Lyle, Lisa Schut, Mark van der Wilk, Yarin Gal

Reliable yet efficient evaluation of generalisation performance of a proposed architecture is crucial to the success of neural architecture search (NAS).

Model Selection Neural Architecture Search

Convergence of Sparse Variational Inference in Gaussian Processes Regression

1 code implementation1 Aug 2020 David R. Burt, Carl Edward Rasmussen, Mark van der Wilk

Gaussian processes are distributions over functions that are versatile and mathematically convenient priors in Bayesian modelling.

Gaussian Processes regression +1

Variational Orthogonal Features

no code implementations23 Jun 2020 David R. Burt, Carl Edward Rasmussen, Mark van der Wilk

We present a construction of features for any stationary prior kernel that allow for computation of an unbiased estimator to the ELBO using $T$ Monte Carlo samples in $\mathcal{O}(\tilde{N}T+M^2T)$ and in $\mathcal{O}(\tilde{N}T+MT)$ with an additional approximation.

Variational Inference

Speedy Performance Estimation for Neural Architecture Search

2 code implementations NeurIPS 2021 Binxin Ru, Clare Lyle, Lisa Schut, Miroslav Fil, Mark van der Wilk, Yarin Gal

Reliable yet efficient evaluation of generalisation performance of a proposed architecture is crucial to the success of neural architecture search (NAS).

Model Selection Neural Architecture Search

On the Benefits of Invariance in Neural Networks

no code implementations1 May 2020 Clare Lyle, Mark van der Wilk, Marta Kwiatkowska, Yarin Gal, Benjamin Bloem-Reddy

Many real world data analysis problems exhibit invariant structure, and models that take advantage of this structure have shown impressive empirical performance, particularly in deep learning.

Data Augmentation

Capsule Networks -- A Probabilistic Perspective

no code implementations7 Apr 2020 Lewis Smith, Lisa Schut, Yarin Gal, Mark van der Wilk

'Capsule' models try to explicitly represent the poses of objects, enforcing a linear relationship between an object's pose and that of its constituent parts.

Object

A Framework for Interdomain and Multioutput Gaussian Processes

1 code implementation2 Mar 2020 Mark van der Wilk, Vincent Dutordoir, ST John, Artem Artemev, Vincent Adam, James Hensman

One obstacle to the use of Gaussian processes (GPs) in large-scale problems, and as components in deep learning systems, is the need for bespoke derivations and implementations for small variations in the model or inference.

Gaussian Processes

Variational Gaussian Process Models without Matrix Inverses

no code implementations AABI Symposium 2019 Mark van der Wilk, ST John, Artem Artemev, James Hensman

We present a variational approximation for a wide range of GP models that does not require a matrix inverse to be performed at each optimisation step.

Scalable Bayesian dynamic covariance modeling with variational Wishart and inverse Wishart processes

1 code implementation NeurIPS 2019 Creighton Heaukulani, Mark van der Wilk

We implement gradient-based variational inference routines for Wishart and inverse Wishart processes, which we apply as Bayesian models for the dynamic, heteroskedastic covariance matrix of a multivariate time series.

Gaussian Processes Time Series +2

Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

1 code implementation13 Jun 2019 Alessandro Davide Ialongo, Mark van der Wilk, James Hensman, Carl Edward Rasmussen

As we demonstrate in our experiments, the factorisation between latent system states and transition function can lead to a miscalibrated posterior and to learning unnecessarily large noise terms.

Variational Inference

Rates of Convergence for Sparse Variational Gaussian Process Regression

1 code implementation8 Mar 2019 David R. Burt, Carl E. Rasmussen, Mark van der Wilk

Excellent variational approximations to Gaussian process posteriors have been developed which avoid the $\mathcal{O}\left(N^3\right)$ scaling with dataset size $N$.

Continual Learning regression

Bayesian Image Classification with Deep Convolutional Gaussian Processes

no code implementations15 Feb 2019 Vincent Dutordoir, Mark van der Wilk, Artem Artemev, James Hensman

We also demonstrate that our fully Bayesian approach improves on dropout-based Bayesian deep learning methods in terms of uncertainty and marginal likelihood estimates.

Classification Decision Making +5

Non-Factorised Variational Inference in Dynamical Systems

no code implementations14 Dec 2018 Alessandro Davide Ialongo, Mark van der Wilk, James Hensman, Carl Edward Rasmussen

We focus on variational inference in dynamical systems where the discrete time transition function (or evolution rule) is modelled by a Gaussian process.

Variational Inference

Closed-form Inference and Prediction in Gaussian Process State-Space Models

no code implementations10 Dec 2018 Alessandro Davide Ialongo, Mark van der Wilk, Carl Edward Rasmussen

We examine an analytic variational inference scheme for the Gaussian Process State Space Model (GPSSM) - a probabilistic model for system identification and time-series modelling.

Time Series Time Series Analysis +1

Learning Invariances using the Marginal Likelihood

no code implementations NeurIPS 2018 Mark van der Wilk, Matthias Bauer, ST John, James Hensman

Generalising well in supervised learning tasks relies on correctly extrapolating the training data to a large region of the input space.

Data Augmentation Gaussian Processes +2

Convolutional Gaussian Processes

4 code implementations NeurIPS 2017 Mark van der Wilk, Carl Edward Rasmussen, James Hensman

We present a practical way of introducing convolutional structure into Gaussian processes, making them more suited to high-dimensional inputs like images.

Gaussian Processes

Understanding Probabilistic Sparse Gaussian Process Approximations

no code implementations NeurIPS 2016 Matthias Bauer, Mark van der Wilk, Carl Edward Rasmussen

Good sparse approximations are essential for practical inference in Gaussian Processes as the computational cost of exact methods is prohibitive for large datasets.

Gaussian Processes regression

Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models

1 code implementation NeurIPS 2014 Yarin Gal, Mark van der Wilk, Carl E. Rasmussen

We show that GP performance improves with increasing amounts of data in regression (on flight data with 2 million records) and latent variable modelling (on MNIST).

Dimensionality Reduction Gaussian Processes +2

Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models - a Gentle Tutorial

no code implementations6 Feb 2014 Yarin Gal, Mark van der Wilk

In this tutorial we explain the inference procedures developed for the sparse Gaussian process (GP) regression and Gaussian process latent variable model (GPLVM).

Gaussian Processes regression +1
