Push: Concurrent Probabilistic Programming for Bayesian Deep Learning

10 Jun 2023 · Daniel Huang, Chris Camaño, Jonathan Tsegaye, Jonathan Austin Gale

We introduce Push, a library that takes a probabilistic programming approach to Bayesian deep learning (BDL). The library enables concurrent execution of BDL inference algorithms for neural network (NN) models on multi-GPU hardware. To accomplish this, Push introduces an abstraction that represents an input NN as a particle. Particles are cheap to create, so an input NN can easily be replicated, and particles communicate asynchronously, so a variety of parameter updates can be expressed, including common BDL algorithms. Our hope is that Push lowers the barrier to experimenting with BDL by streamlining the scaling of particles across GPUs. We evaluate the scaling behavior of particles on single-node multi-GPU devices on vision and scientific machine learning (SciML) tasks.
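
The particle abstraction can be pictured with a short, self-contained sketch. The code below is not Push's API; it is an illustrative Python/PyTorch analogue in which each "particle" is an independent replica of a base NN, assigned to its own GPU (or the CPU if none is available) and trained concurrently in its own process. Averaging the particles' predictions gives a deep-ensemble-style approximation to the posterior predictive, one of the simplest BDL schemes a particle abstraction can express. All function and variable names here are hypothetical.

# Conceptual sketch only (not Push's actual API): each particle is an independent
# replica of a base NN, trained concurrently on its own device; predictions from
# all particles are aggregated into a deep-ensemble-style posterior predictive.
import torch
import torch.nn as nn
import torch.multiprocessing as mp


def make_base_net() -> nn.Module:
    # Stand-in for the user-supplied NN that would be replicated into particles.
    return nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))


def train_particle(rank: int, num_devices: int, return_dict) -> None:
    # Each particle lives on its own device and is updated independently
    # (the simplest particle update: ordinary SGD per replica, i.e. a deep ensemble).
    device = torch.device(f"cuda:{rank % num_devices}") if num_devices > 0 else torch.device("cpu")
    torch.manual_seed(rank)  # different initialization per particle

    net = make_base_net().to(device)
    opt = torch.optim.SGD(net.parameters(), lr=1e-2)

    # Toy regression data: y = sin(x) + noise.
    x = torch.linspace(-3, 3, 256, device=device).unsqueeze(-1)
    y = torch.sin(x) + 0.1 * torch.randn_like(x)

    for _ in range(500):
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(x), y)
        loss.backward()
        opt.step()

    # Ship parameters back to the host process for aggregation.
    return_dict[rank] = {k: v.cpu() for k, v in net.state_dict().items()}


if __name__ == "__main__":
    num_particles = 4
    num_devices = torch.cuda.device_count()

    mp.set_start_method("spawn", force=True)
    manager = mp.Manager()
    states = manager.dict()

    procs = [mp.Process(target=train_particle, args=(r, num_devices, states))
             for r in range(num_particles)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    # Posterior-predictive-style aggregation: mean and spread of particle predictions.
    x_test = torch.linspace(-3, 3, 10).unsqueeze(-1)
    preds = []
    for r in range(num_particles):
        net = make_base_net()
        net.load_state_dict(states[r])
        with torch.no_grad():
            preds.append(net(x_test))
    mean = torch.stack(preds).mean(dim=0)
    std = torch.stack(preds).std(dim=0)
    print("predictive mean:", mean.squeeze().tolist())
    print("predictive std:", std.squeeze().tolist())

In Push itself, particle communication is asynchronous, so richer update rules than this independent-training example can be expressed; the sketch only illustrates the replicate-and-aggregate structure on multiple devices.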
