1 code implementation • 8 Jun 2021 • Dar Gilboa, Ari Pakman, Thibault Vatter
Probability density models based on deep networks have achieved remarkable success in modeling complex high-dimensional datasets.
Ranked #1 on Density Estimation on UCI POWER
no code implementations • AABI Symposium 2021 • Chen Zeno, Itay Golan, Ari Pakman, Daniel Soudry
Recent works have shown that the predictive accuracy of Bayesian deep learning models exhibits substantial improvements when the posterior is raised to a 1/T power with T < 1.
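Raising a posterior to a 1/T power ("cold" posterior for T < 1) concentrates probability mass on its modes. A toy illustration on a discrete posterior (function name is ours, not from the paper):

```python
import numpy as np

def temper(posterior, T):
    """Raise a discrete posterior to the power 1/T and renormalize.

    For T < 1 (a "cold" posterior) mass concentrates on the modes;
    for T > 1 the distribution flattens. Toy illustration only.
    """
    p = posterior ** (1.0 / T)
    return p / p.sum()
```

For example, tempering `[0.7, 0.3]` with T = 0.5 sharpens the distribution toward its mode, while T = 2 flattens it.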
2 code implementations • 29 Oct 2020 • Yueqi Wang, Yoonho Lee, Pallab Basu, Juho Lee, Yee Whye Teh, Liam Paninski, Ari Pakman
While graph neural networks (GNNs) have been successful in encoding graph structures, existing GNN-based methods for community detection are limited: they require the number of communities to be known in advance, and they lack a proper probabilistic formulation to handle uncertainty.
no code implementations • AABI Symposium 2019 • Ari Pakman, Yueqi Wang, Liam Paninski
We introduce a neural architecture to perform amortized approximate Bayesian inference over latent random permutations of two sets of objects.
1 code implementation • NeurIPS Workshop Neuro_AI 2019 • Yueqi Wang, Ari Pakman, Catalin Mitelut, JinHyung Lee, Liam Paninski
We present a novel approach to spike sorting for high-density multielectrode probes using the Neural Clustering Process (NCP), a recently introduced neural architecture that performs scalable amortized approximate Bayesian inference for efficient probabilistic clustering.
5 code implementations • ICML 2020 • Ari Pakman, Yueqi Wang, Catalin Mitelut, JinHyung Lee, Liam Paninski
Probabilistic clustering models (or equivalently, mixture models) are basic building blocks in countless statistical models and involve latent random variables over discrete spaces.
1 code implementation • 24 Nov 2018 • Ari Pakman, Liam Paninski
We develop methods for efficient amortized approximate Bayesian inference over posterior distributions of probabilistic clustering models, such as Dirichlet process mixture models.
1 code implementation • 2 Nov 2017 • Ari Pakman
The Bouncy Particle Sampler is a novel rejection-free non-reversible sampler for differentiable probability distributions over continuous variables.
1 code implementation • ICML 2017 • Ari Pakman, Dar Gilboa, David Carlson, Liam Paninski
We introduce a novel stochastic version of the non-reversible, rejection-free Bouncy Particle Sampler (BPS), a Markov process whose sample trajectories are piecewise linear.
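In the BPS, the particle moves along straight lines and "bounces" off the energy gradient at random event times; for a Gaussian target the event rate is linear in time, so event times can be inverted analytically. A minimal sketch for a standard Gaussian target, with occasional velocity refreshments for ergodicity (parameter names are ours; this is the basic BPS, not the stochastic variant of the paper):

```python
import numpy as np

def bps_gaussian(d=2, T=2000.0, lam_ref=1.0, seed=0):
    """Bouncy Particle Sampler targeting a standard Gaussian N(0, I_d).

    Trajectories are piecewise linear. With U(x) = |x|^2 / 2 the bounce
    rate max(0, v . x + t |v|^2) is linear in t, so the first event time
    of the inhomogeneous Poisson process has a closed form. Positions
    are recorded at unit time intervals. Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    t, next_record, samples = 0.0, 1.0, []
    while t < T:
        a, b = x @ v, v @ v
        E = rng.exponential()
        if a >= 0:  # invert the integrated rate a*tau + b*tau^2/2 = E
            tau_bounce = (-a + np.sqrt(a * a + 2.0 * b * E)) / b
        else:       # rate is zero until t* = -a/b, then grows linearly
            tau_bounce = -a / b + np.sqrt(2.0 * E / b)
        tau_ref = rng.exponential(1.0 / lam_ref)
        tau = min(tau_bounce, tau_ref)
        while next_record < t + tau and next_record < T:
            samples.append(x + (next_record - t) * v)
            next_record += 1.0
        x = x + tau * v
        t += tau
        if tau_ref < tau_bounce:       # refresh velocity
            v = rng.standard_normal(d)
            v /= np.linalg.norm(v)
        else:                          # bounce: reflect v off grad U(x) = x
            v = v - 2.0 * (v @ x) / (x @ x) * x
    return np.array(samples)
```

The recorded samples should have approximately zero mean and unit variance per coordinate.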
no code implementations • 7 Mar 2016 • David Carlson, Patrick Stinson, Ari Pakman, Liam Paninski
Partition functions of probability distributions are important quantities for model evaluation and comparisons.
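The partition function is the normalizing constant Z = ∫ f(x) dx of an unnormalized density f. A generic importance-sampling estimator, written in log space for stability (this is a baseline sketch, not the method of the paper):

```python
import numpy as np

def importance_sampling_logZ(log_f, sample_q, log_q, n, rng):
    """Estimate log Z for an unnormalized density f via importance sampling.

    Uses Z = E_q[f(x) / q(x)] with proposal q, computed with the
    log-sum-exp trick to avoid overflow. Generic illustration only.
    """
    xs = sample_q(n, rng)
    log_w = log_f(xs) - log_q(xs)   # log importance weights
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))
```

For the unnormalized Gaussian f(x) = exp(-x²/2), the true value is log Z = ½ log(2π) ≈ 0.919; with a wide Gaussian proposal the estimator recovers this closely.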
3 code implementations • 28 Dec 2015 • Roy Fox, Ari Pakman, Naftali Tishby
We propose G-learning, a new off-policy learning algorithm that regularizes the value estimates by penalizing deterministic policies in the beginning of the learning process.
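The key change relative to Q-learning is replacing the hard max in the backup with a soft (log-sum-exp) value computed against a prior policy; annealing the inverse temperature beta upward recovers Q-learning in the limit. A tabular sketch of one backup, under our own naming (not the paper's reference code):

```python
import numpy as np

def g_learning_update(G, s, a, r, s_next, prior, beta, alpha=0.1, gamma=0.9):
    """One tabular G-learning backup (illustrative sketch).

    The target uses a soft log-sum-exp backup against a prior policy,
    penalizing early commitment to deterministic policies; as beta
    grows, soft_value approaches max_a' G[s', a'] (Q-learning).
    """
    soft_value = (1.0 / beta) * np.log(np.sum(prior * np.exp(beta * G[s_next])))
    target = r + gamma * soft_value
    G[s, a] += alpha * (target - G[s, a])
    return G
```

In practice beta is scheduled from small (strongly regularized, near the prior) to large (greedy) over the course of learning.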
no code implementations • NeurIPS 2013 • Ben Shababo, Brooks Paige, Ari Pakman, Liam Paninski
We develop an inference and optimal design procedure for recovering synaptic weights in neural microcircuits.
5 code implementations • 27 Nov 2013 • Eftychios A. Pnevmatikakis, Josh Merel, Ari Pakman, Liam Paninski
We present efficient Bayesian methods for extracting neuronal spiking information from calcium imaging data.
Neurons and Cognition · Quantitative Methods · Applications
1 code implementation • 20 Aug 2012 • Ari Pakman, Liam Paninski
We present a Hamiltonian Monte Carlo algorithm to sample from multivariate Gaussian distributions in which the target space is constrained by linear and quadratic inequalities or products thereof.
Computation · Applications
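For a Gaussian target the Hamiltonian flow is known exactly (x(t) = x cos t + p sin t), and the constraints act as walls off which the momentum is reflected. A simplified, discretized sketch of this wall-bouncing idea for linear constraints F x + g ≥ 0 (the paper computes wall-hit times analytically; here we step the exact flow in small increments and reflect on violation, with our own parameter names):

```python
import numpy as np

def constrained_gaussian_hmc(F, g, n_samples, x0, T=np.pi / 2, dt=1e-2, seed=0):
    """Approximate sampler for a standard Gaussian truncated to {x : F x + g >= 0}.

    Uses the exact Gaussian Hamiltonian flow x(t) = x cos t + p sin t,
    discretized in steps of dt; when a step would cross a constraint,
    the momentum is reflected off that wall instead. Illustrative
    sketch of the wall-bouncing mechanism only.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)
        t = 0.0
        while t < T:
            x_new = x * np.cos(dt) + p * np.sin(dt)
            p_new = -x * np.sin(dt) + p * np.cos(dt)
            slack = F @ x_new + g
            if (slack < 0).any():
                f = F[np.argmin(slack)]               # most-violated wall
                p = p - 2.0 * (f @ p) / (f @ f) * f   # reflect momentum
            else:
                x, p = x_new, p_new
                t += dt
        samples.append(x.copy())
    return np.array(samples)
```

On the half-line x ≥ 0 in one dimension, this yields samples from a half-normal distribution (mean ≈ √(2/π) ≈ 0.80), with no sample ever leaving the feasible region.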