Search Results for author: David Pfau

Found 17 papers, 10 papers with code

Natural Quantum Monte Carlo Computation of Excited States

2 code implementations • 31 Aug 2023 • David Pfau, Simon Axelrod, Halvard Sutterud, Ingrid von Glehn, James S. Spencer

We present a variational Monte Carlo algorithm for estimating the lowest excited states of a quantum system, which is a natural generalization of the estimation of ground states.

Variational Monte Carlo
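
As a rough illustration of the variational Monte Carlo idea referred to above (not the paper's excited-state method), the sketch below estimates the ground-state energy of a 1-D harmonic oscillator with a Gaussian trial wavefunction; the variational parameter and sampling setup are illustrative assumptions.

```python
import numpy as np

# Illustrative variational Monte Carlo estimate of the ground-state energy of a
# 1-D harmonic oscillator (H = -1/2 d^2/dx^2 + 1/2 x^2) with a Gaussian trial
# wavefunction psi(x) = exp(-alpha * x^2).  alpha = 0.5 would be exact here.
rng = np.random.default_rng(0)
alpha = 0.4          # variational parameter (assumed for illustration)
n_samples = 100_000

# |psi|^2 is a Gaussian with standard deviation 1 / (2 * sqrt(alpha)),
# so we can sample it directly instead of running Metropolis steps.
x = rng.normal(scale=1.0 / (2.0 * np.sqrt(alpha)), size=n_samples)

# Local energy E_loc(x) = (H psi)(x) / psi(x), derived analytically for this ansatz.
e_loc = alpha - 2.0 * alpha**2 * x**2 + 0.5 * x**2

print("VMC energy estimate:", e_loc.mean())   # upper bound on the true E0 = 0.5
```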

A Self-Attention Ansatz for Ab-initio Quantum Chemistry

3 code implementations • 24 Nov 2022 • Ingrid von Glehn, James S. Spencer, David Pfau

In recent years, deep neural networks like the FermiNet and PauliNet have been used to significantly improve the accuracy of these first-principles calculations, but they lack an attention-like mechanism for gating interactions between electrons.
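
The snippet above mentions an attention-like mechanism for gating electron-electron interactions. The sketch below is a generic single-head self-attention layer applied to per-electron feature vectors; the sizes and random inputs are placeholders, and it is not the paper's architecture.

```python
import numpy as np

# Generic single-head self-attention over per-electron features -- a sketch of the
# kind of interaction gating referred to above, not the paper's actual architecture.
rng = np.random.default_rng(0)
n_electrons, d = 10, 16                  # illustrative sizes
h = rng.normal(size=(n_electrons, d))    # one feature vector per electron

# Learned projections (random placeholders here).
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))
q, k, v = h @ w_q, h @ w_k, h @ w_v

# Scaled dot-product attention: every electron attends to every other electron.
scores = q @ k.T / np.sqrt(d)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)   # softmax over electrons

h_out = weights @ v    # updated per-electron features
print(h_out.shape)     # (10, 16)
```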

Ab-initio quantum chemistry with neural-network wavefunctions

no code implementations • 26 Aug 2022 • Jan Hermann, James Spencer, Kenny Choo, Antonio Mezzacapo, W. M. C. Foulkes, David Pfau, Giuseppe Carleo, Frank Noé

Machine learning, and specifically deep-learning methods, have outperformed human capabilities in many pattern-recognition and data-processing problems as well as in game playing, and now also play an increasingly important role in scientific discovery.

Quantization

Integrable Nonparametric Flows

no code implementations • 3 Dec 2020 • David Pfau, Danilo Rezende

This reverses the conventional task of normalizing flows: rather than being given samples from an unknown target distribution and learning a flow that approximates the distribution, we are given a perturbation to an initial distribution and aim to reconstruct a flow that would generate samples from the known perturbed distribution.
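
For context on the "conventional task" being reversed here, the sketch below shows the standard normalizing-flow recipe: push base samples through an invertible map and track log-densities with the change-of-variables formula. The affine map is an arbitrary illustrative choice, not anything from the paper.

```python
import numpy as np

# The conventional normalizing-flow direction: sample from a simple base density,
# push samples through an invertible map, and track log-densities via the
# change-of-variables formula.  The affine map below is an arbitrary illustration.
rng = np.random.default_rng(0)
z = rng.normal(size=10_000)                      # base samples, z ~ N(0, 1)
log_p_z = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)  # base log-density

a, b = 2.0, 1.0                                  # parameters of x = f(z) = a*z + b
x = a * z + b
log_p_x = log_p_z - np.log(np.abs(a))            # log p(x) = log p(z) - log |df/dz|

# Cross-check against the analytic N(1, 2^2) log-density.
analytic = -0.5 * ((x - b) / a) ** 2 - np.log(a) - 0.5 * np.log(2 * np.pi)
print(np.allclose(log_p_x, analytic))            # True
```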

Better, Faster Fermionic Neural Networks

2 code implementations • 13 Nov 2020 • James S. Spencer, David Pfau, Aleksandar Botev, W. M. C. Foulkes

The Fermionic Neural Network (FermiNet) is a recently developed neural network architecture that can be used as a wavefunction Ansatz for many-electron systems, and has already demonstrated high accuracy on small systems.

Disentangling by Subspace Diffusion

1 code implementation • NeurIPS 2020 • David Pfau, Irina Higgins, Aleksandar Botev, Sébastien Racanière

We present a novel nonparametric algorithm for symmetry-based disentangling of data manifolds, the Geometric Manifold Component Estimator (GEOMANCER).

Metric Learning • Representation Learning

Ab-Initio Solution of the Many-Electron Schrödinger Equation with Deep Neural Networks

1 code implementation • 5 Sep 2019 • David Pfau, James S. Spencer, Alexander G. de G. Matthews, W. M. C. Foulkes

Here we introduce a novel deep learning architecture, the Fermionic Neural Network, as a powerful wavefunction Ansatz for many-electron systems.
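
One way to see what "wavefunction Ansatz for many-electron systems" entails: a fermionic wavefunction must change sign when two electrons are exchanged, which determinant-based Ansätze satisfy by construction. The toy check below uses a plain Slater determinant with arbitrary made-up orbitals; it is not the FermiNet itself.

```python
import numpy as np

# A fermionic wavefunction must be antisymmetric: swapping two electrons flips its
# sign.  Determinant-based Ansätze get this by construction.  Toy check with a plain
# Slater determinant of arbitrary single-particle orbitals (not the FermiNet).
def orbitals(x):
    # Three illustrative 1-D orbitals evaluated at position x.
    return np.array([np.exp(-x**2), x * np.exp(-x**2), (x**2 - 0.5) * np.exp(-x**2)])

def slater_psi(positions):
    # Matrix element [i, j] = orbital i evaluated at the position of electron j.
    mat = np.stack([orbitals(x) for x in positions], axis=1)
    return np.linalg.det(mat)

r = np.array([0.3, -1.1, 0.7])
r_swapped = r[[1, 0, 2]]                     # exchange electrons 0 and 1
print(slater_psi(r), slater_psi(r_swapped))  # equal magnitude, opposite sign
```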

Towards a Definition of Disentangled Representations

1 code implementation • 5 Dec 2018 • Irina Higgins, David Amos, David Pfau, Sebastien Racaniere, Loic Matthey, Danilo Rezende, Alexander Lerchner

Here we propose that a principled solution to characterising disentangled representations can be found by focusing on the transformation properties of the world.

Representation Learning

Unrolled Generative Adversarial Networks

9 code implementations • 7 Nov 2016 • Luke Metz, Ben Poole, David Pfau, Jascha Sohl-Dickstein

We introduce a method to stabilize Generative Adversarial Networks (GANs) by defining the generator objective with respect to an unrolled optimization of the discriminator.
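
To make "unrolled optimization of the discriminator" concrete, the sketch below applies the idea to a bilinear toy game, a common stand-in for GAN training dynamics rather than the paper's actual setup: the generator gradient is taken after imagining k extra gradient-ascent steps of the discriminator, with all gradients derived by hand for this toy loss.

```python
import numpy as np

# Unrolling illustrated on the bilinear toy game f(theta, phi) = theta * phi
# (generator minimises over theta, discriminator maximises over phi).  Gradients are
# hand-derived for this toy; it is not the paper's GAN setup.
theta, phi = 1.0, 1.0
lr, k = 0.1, 5          # learning rate and number of unrolled discriminator steps

for _ in range(200):
    # Unrolled discriminator: k imagined ascent steps give phi_k = phi + k*lr*theta,
    # because df/dphi = theta for this loss.
    phi_unrolled = phi + k * lr * theta
    # Generator gradient through the unrolled steps:
    # d/dtheta [theta * (phi + k*lr*theta)] = phi + 2*k*lr*theta.
    grad_theta = phi_unrolled + k * lr * theta
    # The real discriminator takes a single ascent step on the current loss.
    grad_phi = theta
    theta -= lr * grad_theta
    phi += lr * grad_phi

# With k > 0 the iterates converge toward the equilibrium (0, 0);
# with k = 0 the same simultaneous updates spiral outward.
print(theta, phi)
```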

Connecting Generative Adversarial Networks and Actor-Critic Methods

no code implementations • 6 Oct 2016 • David Pfau, Oriol Vinyals

Both generative adversarial networks (GAN) in unsupervised learning and actor-critic methods in reinforcement learning (RL) have gained a reputation for being difficult to optimize.

Reinforcement Learning (RL)

A structured matrix factorization framework for large scale calcium imaging data analysis

11 code implementations • 9 Sep 2014 • Eftychios A. Pnevmatikakis, Yuanjun Gao, Daniel Soudry, David Pfau, Clay Lacefield, Kira Poskanzer, Randy Bruno, Rafael Yuste, Liam Paninski

We present a structured matrix factorization approach to analyzing calcium imaging recordings of large neuronal ensembles.

Neurons and Cognition • Quantitative Methods • Applications
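
As a rough picture of what matrix factorization means in this setting: approximate the movie matrix Y (pixels by frames) as spatial footprints A times temporal traces C. The sketch below uses plain multiplicative-update NMF on synthetic data; the paper's structured model adds sparsity and calcium-dynamics constraints that this sketch omits.

```python
import numpy as np

# Bare-bones nonnegative matrix factorization of a synthetic "movie" Y (pixels x frames)
# into spatial footprints A (pixels x neurons) and temporal traces C (neurons x frames).
# Standard Lee-Seung multiplicative updates; no sparsity or calcium dynamics.
rng = np.random.default_rng(0)
n_pixels, n_frames, n_neurons = 200, 500, 5

A_true = rng.uniform(size=(n_pixels, n_neurons))
C_true = rng.uniform(size=(n_neurons, n_frames))
Y = A_true @ C_true + 0.01 * rng.uniform(size=(n_pixels, n_frames))   # noisy data

A = rng.uniform(size=(n_pixels, n_neurons))
C = rng.uniform(size=(n_neurons, n_frames))
eps = 1e-9
for _ in range(200):
    # Multiplicative updates for the Frobenius-norm objective ||Y - A C||^2.
    C *= (A.T @ Y) / (A.T @ A @ C + eps)
    A *= (Y @ C.T) / (A @ C @ C.T + eps)

print(np.linalg.norm(Y - A @ C) / np.linalg.norm(Y))   # small relative residual
```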

Robust learning of low-dimensional dynamics from large neural ensembles

no code implementations • NeurIPS 2013 • David Pfau, Eftychios A. Pnevmatikakis, Liam Paninski

We show on model data that the parameters of latent linear dynamical systems can be recovered, and that even if the dynamics are not stationary we can still recover the true latent subspace.

Dimensionality Reduction
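
A minimal picture of the setup described above, under simplifying assumptions (noiseless linear observations, no spiking model): simulate a low-dimensional linear dynamical system, embed it in a high-dimensional "neural" recording, and check that the leading principal subspace of the recording matches the true latent subspace. This is an illustration, not the paper's estimator.

```python
import numpy as np

# Simulate a 2-D latent linear dynamical system observed through many "neurons",
# then recover the latent subspace with plain SVD/PCA.
rng = np.random.default_rng(0)
d_latent, n_neurons, T = 2, 100, 2000

# Slowly rotating, slightly contracting latent dynamics.
angle = 0.1
A = 0.99 * np.array([[np.cos(angle), -np.sin(angle)],
                     [np.sin(angle),  np.cos(angle)]])
C = rng.normal(size=(n_neurons, d_latent))       # observation (loading) matrix

x = rng.normal(size=d_latent)
X = np.empty((T, d_latent))
for t in range(T):
    x = A @ x + 0.1 * rng.normal(size=d_latent)  # latent state update
    X[t] = x
Y = X @ C.T                                      # observed activity (T x n_neurons)

# Leading right singular vectors of the data span the column space of C.
_, _, vt = np.linalg.svd(Y - Y.mean(axis=0), full_matrices=False)
est = vt[:d_latent].T                            # estimated subspace basis

# Cosines of the principal angles between estimated and true subspaces should be ~1.
q_true, _ = np.linalg.qr(C)
q_est, _ = np.linalg.qr(est)
print(np.linalg.svd(q_true.T @ q_est, compute_uv=False))
```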

Probabilistic Deterministic Infinite Automata

no code implementations • NeurIPS 2010 • David Pfau, Nicholas Bartlett, Frank Wood

We suggest that our method for averaging over PDFAs is a novel approach to predictive distribution smoothing.
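
For readers unfamiliar with the model class: a probabilistic deterministic finite automaton (PDFA) pairs deterministic state transitions with a per-state distribution over symbols, so scoring a string is a single walk through states. The tiny two-state machine below is invented for illustration; the paper's contribution, a nonparametric prior and averaging over such machines, is not implemented here.

```python
# A tiny hand-made PDFA over the alphabet {'a', 'b'}: deterministic transitions plus a
# per-state emission distribution.  This machine is invented purely for illustration.
emit = {                       # P(symbol | state)
    0: {'a': 0.8, 'b': 0.2},
    1: {'a': 0.3, 'b': 0.7},
}
step = {                       # deterministic next state given (state, symbol)
    (0, 'a'): 0, (0, 'b'): 1,
    (1, 'a'): 0, (1, 'b'): 1,
}

def string_probability(s, state=0):
    p = 1.0
    for symbol in s:
        p *= emit[state][symbol]       # emit the symbol from the current state
        state = step[(state, symbol)]  # deterministic transition
    return p

print(string_probability("aab"))       # 0.8 * 0.8 * 0.2 = 0.128
```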
