Search Results for author: Nicholas P Baskerville

Found 5 papers, 2 papers with code

Random matrix theory and the loss surfaces of neural networks

no code implementations • 3 Jun 2023 • Nicholas P Baskerville

Informed by the historical applications of random matrix theory in physics and elsewhere, we establish the presence of local random matrix universality in real neural networks and then utilise this as a modeling assumption to derive powerful and novel results about the Hessians of neural network loss surfaces and their spectra.
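Local random matrix universality of the kind described here is commonly checked numerically via the statistics of adjacent eigenvalue spacing ratios, which for the GOE universality class concentrate near 0.53. A minimal sketch of that diagnostic, with a sampled GOE matrix standing in for an actual network Hessian (this is an illustration, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Sample a GOE matrix: real symmetric with Gaussian entries.
a = rng.standard_normal((n, n))
goe = (a + a.T) / np.sqrt(2 * n)

# Consecutive spacing ratios r_i = min(s_i, s_{i+1}) / max(s_i, s_{i+1}).
# Ratios are scale-free, so no unfolding of the spectrum is needed.
eigs = np.sort(np.linalg.eigvalsh(goe))
s = np.diff(eigs)
r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])

# For GOE statistics the mean ratio is approximately 0.5307.
print(round(float(r.mean()), 2))
```

Running the same statistic on the Hessian of a trained network and comparing against the GOE value is one way the local-universality claim can be probed empirically.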

A novel sampler for Gauss-Hermite determinantal point processes with application to Monte Carlo integration

no code implementations • 15 Mar 2022 • Nicholas P Baskerville

Determinantal point processes are a promising but relatively under-developed tool in machine learning and statistical modelling, being the canonical statistical example of a distribution with repulsion.

BIG-bench Machine Learning • Point Processes
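The repulsion mentioned in the abstract comes from the determinant: in an L-ensemble DPP, the probability of jointly selecting a subset of points is proportional to the determinant of the corresponding kernel submatrix, which shrinks toward zero as points become similar. A small illustrative sketch using a generic RBF kernel (an assumption for illustration; the paper's sampler uses Gauss-Hermite kernels):

```python
import numpy as np

def rbf_kernel(xs, lengthscale=1.0):
    """Illustrative L-ensemble kernel: L_ij = exp(-(x_i - x_j)^2 / (2 l^2))."""
    xs = np.asarray(xs, dtype=float)
    d = xs[:, None] - xs[None, :]
    return np.exp(-d**2 / (2 * lengthscale**2))

def subset_weight(xs):
    """Unnormalised DPP probability of jointly selecting the points xs."""
    return float(np.linalg.det(rbf_kernel(xs)))

# Two nearby points are strongly repelled (determinant near 0) ...
print(subset_weight([0.0, 0.1]))
# ... while two well-separated points are nearly independent (determinant near 1).
print(subset_weight([0.0, 5.0]))
```

For two points the determinant is L11*L22 - L12^2 = 1 - L12^2, so the joint weight vanishes as the points coincide, which is exactly the repulsion property.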

Appearance of Random Matrix Theory in Deep Learning

1 code implementation • 12 Feb 2021 • Nicholas P Baskerville, Diego Granziol, Jonathan P Keating

We further investigate the importance of the true loss surface in neural networks and find, in contrast to previous work, that the exponential hardness of locating the global minimum has practical consequences for achieving state of the art performance.

A spin-glass model for the loss surfaces of generative adversarial networks

1 code implementation • 7 Jan 2021 • Nicholas P Baskerville, Jonathan P Keating, Francesco Mezzadri, Joseph Najnudel

Our model consists of two interacting spin glasses, and we conduct an extensive theoretical analysis of the complexity of the model's critical points using techniques from Random Matrix Theory.
