Probabilistic Programming
87 papers with code • 0 benchmarks • 0 datasets
Probabilistic programming languages (PPLs) are designed to describe probabilistic models and then to perform inference in those models. PPLs are closely related to graphical models and Bayesian networks, but are more expressive and flexible.
(Image credit: Michael Betancourt)
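To make the definition concrete, here is a minimal, library-free sketch of what a probabilistic program does: describe a generative model (a prior over a coin's bias plus observed flips) and then infer the posterior by likelihood-weighted importance sampling. The function names (`run_model`, `posterior_mean`) and the 7-of-10-heads data are illustrative assumptions, not the API of any particular PPL.

```python
import random
import math

def run_model():
    """A tiny generative model: a latent coin bias, conditioned on data.

    Returns (draw of the latent bias, log-weight of the observed data).
    """
    # Prior: coin bias drawn uniformly from (0, 1).
    bias = random.random()
    # Condition on observed flips (7 heads out of 10) by accumulating
    # a log-likelihood weight instead of rejecting samples.
    heads, flips = 7, 10
    log_weight = heads * math.log(bias) + (flips - heads) * math.log(1.0 - bias)
    return bias, log_weight

def posterior_mean(num_samples=50_000):
    """Importance sampling: weight prior draws by the data likelihood."""
    draws = [run_model() for _ in range(num_samples)]
    max_lw = max(lw for _, lw in draws)            # stabilise exponentiation
    weights = [math.exp(lw - max_lw) for _, lw in draws]
    total = sum(weights)
    return sum(b * w for (b, _), w in zip(draws, weights)) / total

random.seed(0)
# With a uniform prior and 7/10 heads, the true posterior mean is 8/12 ≈ 0.667.
print(posterior_mean())
```

Real PPLs automate exactly this separation: the user writes only the model (`run_model`), and the language supplies generic inference algorithms such as importance sampling, MCMC, or variational inference.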
Libraries
Use these libraries to find Probabilistic Programming models and implementations.

Latest papers with no code
String Diagrams with Factorized Densities
A growing body of research on probabilistic programs and causal models has highlighted the need to reason compositionally about model classes that extend directed graphical models.
Dimensionality Reduction as Probabilistic Inference
Dimensionality reduction (DR) algorithms compress high-dimensional data into a lower dimensional representation while preserving important features of the data.
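As a baseline for the compression step that all DR algorithms share, here is a short sketch of classical PCA via an eigen-decomposition of the covariance matrix; the paper's contribution is to reinterpret such methods as probabilistic inference, which this snippet does not attempt. The function name `pca_project` is illustrative.

```python
import numpy as np

def pca_project(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                  # centre the data
    # np.linalg.eigh returns eigenvalues in ascending order,
    # so the top-k directions are the last k eigenvector columns.
    _, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    components = vecs[:, -k:]
    return Xc @ components                   # lower-dimensional representation

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                # 100 points in 5 dimensions
Z = pca_project(X, 2)                        # compressed to 2 dimensions
print(Z.shape)
```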
Probabilistic relations for modelling epistemic and aleatoric uncertainty: semantics and automated reasoning with theorem proving
We demonstrate our work with six examples, including problems in robot localisation, classification in machine learning, and the termination of probabilistic loops.
Neural Probabilistic Logic Programming in Discrete-Continuous Domains
Probabilistic NeSy focuses on integrating neural networks with both logic and probability theory, which additionally allows learning under uncertainty.
Declarative Probabilistic Logic Programming in Discrete-Continuous Domains
The resulting paradigm of probabilistic logic programming (PLP) and its programming languages owes much of its success to a declarative semantics, the so-called distribution semantics.
$ω$PAP Spaces: Reasoning Denotationally About Higher-Order, Recursive Probabilistic and Differentiable Programs
We introduce a new setting, the category of $\omega$PAP spaces, for reasoning denotationally about expressive differentiable and probabilistic programming languages.
Incorporating Expert Opinion on Observable Quantities into Statistical Models -- A General Framework
This article describes an approach to incorporate expert opinion on observable quantities through the use of a loss function which updates a prior belief as opposed to specifying parameters on the priors.
Fast and Correct Gradient-Based Optimisation for Probabilistic Programming via Smoothing
Thus we can prove stochastic gradient descent with the reparameterisation gradient estimator to be correct when applied to the smoothed problem.
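The reparameterisation gradient estimator mentioned above can be sketched in a few lines: to differentiate an expectation under N(mu, 1) with respect to mu, rewrite the sample as z = mu + eps with eps ~ N(0, 1), so the gradient moves inside the expectation. This is a generic illustration of the estimator, not the paper's smoothing construction; `reparam_grad` is a hypothetical helper name.

```python
import random

def reparam_grad(mu, f_grad, num_samples=100_000):
    """Monte Carlo estimate of d/dmu E_{z ~ N(mu, 1)}[f(z)].

    With z = mu + eps, eps ~ N(0, 1), the chain rule gives dz/dmu = 1,
    so the gradient is E[f'(mu + eps)], estimated by averaging samples.
    """
    total = 0.0
    for _ in range(num_samples):
        eps = random.gauss(0.0, 1.0)
        total += f_grad(mu + eps)
    return total / num_samples

random.seed(0)
# For f(z) = z^2 the true gradient is 2*mu; at mu = 1.5 the estimate
# should land close to 3.0.
print(reparam_grad(1.5, lambda z: 2.0 * z))
```

Because the estimate is a plain average of differentiable terms, it is exactly the quantity stochastic gradient descent consumes; the paper's smoothing is what licenses applying it to programs with discontinuities.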
When Bioprocess Engineering Meets Machine Learning: A Survey from the Perspective of Automated Bioprocess Development
ML can be seen as a set of tools that contribute to the automation of the whole experimental cycle, including model building and practical planning, thus allowing human experts to focus on the more demanding and overarching cognitive tasks.
Learning and Compositionality: a Unification Attempt via Connectionist Probabilistic Programming
We consider learning and compositionality as the key mechanisms towards simulating human-like intelligence.