no code implementations • 16 May 2022 • Feynman Liang, Liam Hodgkinson, Michael W. Mahoney
While fat-tailed densities commonly arise as posterior and marginal distributions in robust models and scale mixtures, they present a challenge for Gaussian-based variational inference, which fails to capture their tail decay accurately.
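The tail mismatch is easy to see numerically. A minimal sketch (not the paper's method): compare a Student-t target with a moment-matched Gaussian, the standard variational family; the choice of 3 degrees of freedom and the evaluation points are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Fat-tailed target: Student-t with nu = 3 degrees of freedom
# (polynomial tail decay, ~|x|^{-(nu+1)}).
nu = 3.0
target = stats.t(df=nu)

# Moment-matched Gaussian approximation: Var(t_nu) = nu / (nu - 2) for nu > 2,
# so match the standard deviation (exponential tail decay, ~exp(-x^2/2)).
approx = stats.norm(scale=np.sqrt(nu / (nu - 2.0)))

for x in [2.0, 5.0, 10.0]:
    print(f"x={x:5.1f}  t log-density={target.logpdf(x):8.2f}  "
          f"Gaussian log-density={approx.logpdf(x):8.2f}")
```

Far in the tail the Gaussian log-density falls like $-x^2/2$ while the t log-density falls like $-(\nu+1)\log|x|$, so the gap grows without bound and the Gaussian approximation severely underweights tail events.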
1 code implementation • 23 Oct 2020 • Feynman Liang, Nimar Arora, Nazanin Tehrani, Yucen Li, Michael Tingley, Erik Meijer
In order to construct accurate proposers for Metropolis-Hastings Markov Chain Monte Carlo, we integrate ideas from probabilistic graphical models and neural networks in an open-source framework we call Lightweight Inference Compilation (LIC).
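For context, a generic Metropolis-Hastings loop with an independence proposer is sketched below. This is a stand-in for the learned proposers the abstract describes, not LIC's actual implementation; the toy target, proposal width, and step count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_hastings(logp, propose, logq, x0, n_steps=5000):
    """Metropolis-Hastings with an independence proposal.

    Accept x' with probability min(1, [p(x') q(x)] / [p(x) q(x')]).
    In a system like LIC the proposal q would be a neural network
    conditioned on the graphical-model structure; here it is fixed.
    """
    x, samples = x0, []
    for _ in range(n_steps):
        x_new = propose()
        log_alpha = (logp(x_new) - logp(x)) + (logq(x) - logq(x_new))
        if np.log(rng.uniform()) < log_alpha:
            x = x_new
        samples.append(x)
    return np.array(samples)

# Toy target: standard normal. Proposer: a wider Gaussian, so the
# proposal dominates the target's tails (needed for good mixing).
logp = lambda x: -0.5 * x * x
sigma_q = 2.0
propose = lambda: sigma_q * rng.standard_normal()
logq = lambda x: -0.5 * (x / sigma_q) ** 2

samples = metropolis_hastings(logp, propose, logq, x0=0.0)
print(samples.mean(), samples.std())  # should be near 0 and 1
```

The closer the proposal density is to the posterior, the higher the acceptance rate, which is why learning the proposer from the model structure pays off.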
no code implementations • NeurIPS 2020 • Michał Dereziński, Feynman Liang, Zhenyu Liao, Michael W. Mahoney
It is often desirable to reduce the dimensionality of a large dataset by projecting it onto a low-dimensional subspace.
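A minimal sketch of the setting, assuming a plain Gaussian sketching matrix (the paper analyzes such projections precisely; the dimensions below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# n points in d dimensions, projected onto a random k-dimensional subspace.
n, d, k = 200, 100, 20
X = rng.standard_normal((n, d))

# Gaussian sketching matrix, scaled so squared norms are preserved
# in expectation: E[||x S||^2] = ||x||^2.
S = rng.standard_normal((d, k)) / np.sqrt(k)

Y = X @ S  # low-dimensional representation, shape (n, k)
print(Y.shape)
```

Johnson-Lindenstrauss-type results guarantee that pairwise distances are approximately preserved with high probability once $k$ is logarithmic in $n$.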
no code implementations • NeurIPS 2020 • Michał Dereziński, Feynman Liang, Michael W. Mahoney
We provide the first exact non-asymptotic expressions for double descent of the minimum norm linear estimator.
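The estimator in question is easy to state. A minimal sketch (the overparameterized regime only; the dimensions and noise level are illustrative assumptions, and the paper's contribution is the exact risk expressions, not the estimator itself):

```python
import numpy as np

rng = np.random.default_rng(2)

def min_norm_fit(X, y):
    # Minimum-norm least squares: among all w minimizing ||Xw - y||,
    # return the one with smallest ||w||, via the pseudoinverse.
    return np.linalg.pinv(X) @ y

n, d = 20, 50  # overparameterized: more features than samples
w_true = rng.standard_normal(d) / np.sqrt(d)
X = rng.standard_normal((n, d))
y = X @ w_true + 0.1 * rng.standard_normal(n)

w_hat = min_norm_fit(X, y)
train_mse = np.mean((X @ w_hat - y) ** 2)
print(train_mse)  # essentially zero: the estimator interpolates
```

Double descent refers to the test risk of this interpolating estimator first rising as $d$ approaches $n$ and then falling again as $d$ grows past it.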
1 code implementation • 10 Jun 2019 • Michał Dereziński, Feynman Liang, Michael W. Mahoney
In experimental design, we are given $n$ vectors in $d$ dimensions, and our goal is to select $k\ll n$ of them to perform expensive measurements, e.g., to obtain labels/responses, for a linear regression task.
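A classical baseline for this selection problem is greedy D-optimal design, sketched below. This is a standard heuristic, not the sampling-based method the paper develops; the ridge term and problem sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def greedy_d_optimal(X, k, ridge=1e-6):
    """Greedily pick k rows of X maximizing log det(X_S^T X_S + ridge*I).

    Uses the matrix determinant lemma:
    det(A + x x^T) = det(A) * (1 + x^T A^{-1} x),
    so each candidate's marginal gain is log(1 + x^T A^{-1} x).
    """
    n, d = X.shape
    selected = []
    A = ridge * np.eye(d)
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            gain = np.log1p(X[i] @ np.linalg.solve(A, X[i]))
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
        A += np.outer(X[best], X[best])
    return selected

X = rng.standard_normal((100, 5))  # n = 100 candidate vectors in d = 5 dims
S = greedy_d_optimal(X, k=10)      # choose k = 10 to measure
print(S)
```

D-optimality maximizes the information determinant of the selected design, which minimizes the volume of the confidence ellipsoid for the regression coefficients.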