Search Results for author: Alexander G. de G. Matthews

Found 10 papers, 4 papers with code

Ab-Initio Solution of the Many-Electron Schrödinger Equation with Deep Neural Networks

1 code implementation 5 Sep 2019 David Pfau, James S. Spencer, Alexander G. de G. Matthews, W. M. C. Foulkes

Here we introduce a novel deep learning architecture, the Fermionic Neural Network, as a powerful wavefunction Ansatz for many-electron systems.
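As a hedged illustration of the idea named here (not the published FermiNet architecture, whose layers are permutation-equivariant and use multiple determinants), the numpy sketch below shows why a determinant of neural "orbitals" acts as a fermionic wavefunction Ansatz: swapping two electrons swaps two rows of the orbital matrix, so the determinant changes sign. All shapes and parameters are hypothetical toys.

import numpy as np

rng = np.random.default_rng(0)
n_electrons, dim, hidden = 4, 3, 16

# Hypothetical toy parameters; a single dense layer stands in for the network.
W1 = rng.normal(size=(dim, hidden))
W2 = rng.normal(size=(hidden, n_electrons))   # one column per "orbital"

def orbitals(r):
    """Map each electron position (row of r) to a row of neural orbitals."""
    return np.tanh(r @ W1) @ W2               # (n_electrons, n_electrons)

def psi(r):
    """Antisymmetric Ansatz: exchanging two electrons flips the sign."""
    return np.linalg.det(orbitals(r))

r = rng.normal(size=(n_electrons, dim))
print(psi(r), psi(r[[1, 0, 2, 3]]))           # equal magnitude, opposite sign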

Functional Regularisation for Continual Learning with Gaussian Processes

1 code implementation ICLR 2020 Michalis K. Titsias, Jonathan Schwarz, Alexander G. de G. Matthews, Razvan Pascanu, Yee Whye Teh

We introduce a framework for Continual Learning (CL) based on Bayesian inference over the function space rather than the parameters of a deep neural network.

Bayesian Inference, Continual Learning, +2
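A minimal sketch of the function-space intuition, under the loud assumption that it can be caricatured as penalising the new model's outputs at a small memory of inducing inputs for drifting from the previous task's Gaussian beliefs; the paper's actual regulariser is a KL divergence between Gaussian process posteriors, and names such as Z_memory are hypothetical.

import numpy as np

def function_space_penalty(f_new, mu_old, var_old):
    """Toy function-space regulariser for continual learning: keep the current
    model's outputs at remembered inducing inputs close to the old posterior
    mean mu_old, weighted by the old posterior variance var_old."""
    return 0.5 * np.sum((f_new - mu_old) ** 2 / var_old)

# Hypothetical use when training on a new task:
# loss = task_loss(params, new_data) + function_space_penalty(
#     predict(params, Z_memory), mu_old, var_old)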

Variational Bayesian dropout: pitfalls and fixes

no code implementations ICML 2018 Jiri Hron, Alexander G. de G. Matthews, Zoubin Ghahramani

Dropout, a stochastic regularisation technique for training of neural networks, has recently been reinterpreted as a specific type of approximate inference algorithm for Bayesian neural networks.
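For context, the sketch below shows that reinterpretation in its simplest Monte Carlo dropout form: dropout is kept active at prediction time and each stochastic forward pass is read as an approximate posterior sample, so averaging passes gives a predictive mean and a rough uncertainty. This is a generic sketch, not code from the paper, which analyses pitfalls of exactly this kind of reading.

import numpy as np

def mc_dropout_predict(x, W1, W2, p=0.5, n_samples=100, seed=0):
    """Monte Carlo dropout for a one-hidden-layer ReLU network: sample a new
    dropout mask per forward pass and aggregate the stochastic predictions."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_samples):
        mask = rng.binomial(1, 1 - p, size=W1.shape[1]) / (1 - p)
        h = np.maximum((x @ W1) * mask, 0.0)   # dropout applied to hidden units
        preds.append(h @ W2)
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.var(axis=0)  # predictive mean and spread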

Gaussian Process Behaviour in Wide Deep Neural Networks

2 code implementations ICLR 2018 Alexander G. de G. Matthews, Mark Rowland, Jiri Hron, Richard E. Turner, Zoubin Ghahramani

Whilst deep neural networks have shown great empirical success, there is still much work to be done to understand their theoretical properties.

Gaussian Processes
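As a hedged companion to this correspondence, the sketch below evaluates the kernel of the Gaussian process that a wide, fully connected ReLU network converges to, via the standard arc-cosine recursion over depth; the weight and bias variances are illustrative choices, not the paper's exact setup.

import numpy as np

def relu_nngp_kernel(x1, x2, depth=3, sigma_w2=2.0, sigma_b2=0.0):
    """Kernel of the limiting GP for a deep, infinitely wide ReLU network:
    propagate the input covariances through `depth` layers using the
    closed-form ReLU (arc-cosine) expectation."""
    d = x1.shape[0]
    k11 = sigma_b2 + sigma_w2 * np.dot(x1, x1) / d
    k22 = sigma_b2 + sigma_w2 * np.dot(x2, x2) / d
    k12 = sigma_b2 + sigma_w2 * np.dot(x1, x2) / d
    for _ in range(depth):
        theta = np.arccos(np.clip(k12 / np.sqrt(k11 * k22), -1.0, 1.0))
        # E[relu(u) relu(v)] for (u, v) jointly Gaussian with the current covariances
        k12 = sigma_b2 + sigma_w2 / (2 * np.pi) * np.sqrt(k11 * k22) * (
            np.sin(theta) + (np.pi - theta) * np.cos(theta))
        k11 = sigma_b2 + sigma_w2 * k11 / 2
        k22 = sigma_b2 + sigma_w2 * k22 / 2
    return k12

print(relu_nngp_kernel(np.array([1.0, 0.5, -0.3]), np.array([0.2, -1.0, 0.7])))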

Variational Gaussian Dropout is not Bayesian

no code implementations 8 Nov 2017 Jiri Hron, Alexander G. de G. Matthews, Zoubin Ghahramani

Gaussian multiplicative noise is commonly used as a stochastic regularisation technique in training of deterministic neural networks.

Bayesian Inference
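For reference, the regulariser under discussion is sketched below: each activation is scaled by Gaussian noise with mean one and variance alpha. The paper's point concerns whether the "variational" reading of this procedure is genuinely Bayesian, not the mechanics of the noise itself.

import numpy as np

def gaussian_dropout(h, alpha=0.5, seed=0):
    """Gaussian multiplicative noise: scale every activation by a draw from
    N(1, alpha). With alpha = p / (1 - p) the noise variance matches that of
    standard rescaled Bernoulli dropout with drop probability p."""
    rng = np.random.default_rng(seed)
    return h * rng.normal(loc=1.0, scale=np.sqrt(alpha), size=h.shape)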

Adversarial Examples, Uncertainty, and Transfer Testing Robustness in Gaussian Process Hybrid Deep Networks

no code implementations 8 Jul 2017 John Bradshaw, Alexander G. de G. Matthews, Zoubin Ghahramani

However, deep neural networks often do not capture their own uncertainties well, making them less robust in the real world: they extrapolate overconfidently and do not notice domain shift.

Gaussian Processes
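A minimal sketch of the hybrid construction as read from the title: a neural network maps inputs to features and a GP on those features provides predictive variances, which is where the robustness and uncertainty claims are probed. The RBF kernel, the unit prior variance, and feat_fn are assumptions for illustration, not the paper's exact model.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_on_features_predict(feat_fn, X_train, y_train, X_test, noise=1e-2):
    """GP regression on learned features: feat_fn is any (hypothetical) trained
    feature extractor; the GP layer supplies a mean and a variance per test point."""
    F_tr, F_te = feat_fn(X_train), feat_fn(X_test)
    K = rbf_kernel(F_tr, F_tr) + noise * np.eye(len(F_tr))
    K_s = rbf_kernel(F_te, F_tr)
    mean = K_s @ np.linalg.solve(K, y_train)
    var = 1.0 - np.einsum('ij,ji->i', K_s, np.linalg.solve(K, K_s.T))
    return mean, var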

MCMC for Variationally Sparse Gaussian Processes

no code implementations NeurIPS 2015 James Hensman, Alexander G. de G. Matthews, Maurizio Filippone, Zoubin Ghahramani

This paper simultaneously addresses these challenges, using a variational approximation to the posterior which is sparse in support of the function but otherwise free-form.

Gaussian Processes
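One way to make the abstract concrete, as a hedged reading rather than a restatement of the paper's derivation: the approximation keeps the prior conditional p(f | u) at a small set of inducing values u and leaves q(u) free-form, and the optimal free-form q(u), which is then sampled with MCMC, takes the form

\[
q(f, u) = p(f \mid u)\, q(u), \qquad
\hat{q}(u) \;\propto\; p(u)\, \exp\!\Big( \mathbb{E}_{p(f \mid u)}\big[ \log p(y \mid f) \big] \Big).
\]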

On Sparse variational methods and the Kullback-Leibler divergence between stochastic processes

no code implementations 27 Apr 2015 Alexander G. de G. Matthews, James Hensman, Richard E. Turner, Zoubin Ghahramani

We then discuss augmented index sets and show that, contrary to previous works, marginal consistency of augmentation is not enough to guarantee consistency of variational inference with the original model.

Variational Inference
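As a hedged summary of the quantity at stake: writing the approximation as q(f) = p(f | u) q(u), the KL divergence between the approximating process and the posterior process decomposes, under the conditions studied in the paper, as

\[
\mathrm{KL}\big[\, q(f) \,\|\, p(f \mid y) \,\big]
= \log p(y) \;-\; \mathbb{E}_{q(f)}\big[ \log p(y \mid f) \big] \;+\; \mathrm{KL}\big[\, q(u) \,\|\, p(u) \,\big],
\]

so maximising the familiar sparse variational bound can be read as minimising a KL divergence defined directly between stochastic processes.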

Classification using log Gaussian Cox processes

no code implementations 16 May 2014 Alexander G. de G. Matthews, Zoubin Ghahramani

McCullagh and Yang (2006) suggest a family of classification algorithms based on Cox processes.

Classification, General Classification
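A hedged sketch of one natural reading of that construction: each class k carries a log Gaussian Cox process intensity, and the probability of class k at an input x is the normalised intensity,

\[
\lambda_k(x) = \exp\big( f_k(x) \big), \quad f_k \sim \mathcal{GP}(m_k, K_k), \qquad
p(y = k \mid x) = \frac{\lambda_k(x)}{\sum_j \lambda_j(x)},
\]

which reduces to a softmax over GP-distributed functions; whether this matches the exact variant analysed in the paper is an assumption here.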
