1 code implementation • 22 Aug 2022 • Wonyeol Lee, Xavier Rival, Hongseok Yang
We present a static analysis for discovering differentiable or, more generally, smooth parts of a given probabilistic program, and show how the analysis can be used to improve the pathwise gradient estimator, one of the most popular methods for posterior inference and model learning.
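The pathwise (reparameterization) gradient estimator mentioned above rewrites an expectation over a parameterized distribution as an expectation over a fixed noise distribution, so the gradient can be pushed inside. A minimal numerical sketch, not taken from the paper: the objective E_{z~N(mu, sigma^2)}[z^2] and all variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.8

# Objective: E_{z ~ N(mu, sigma^2)}[z^2]. Its true gradient w.r.t. mu is 2*mu.
# Reparameterize z = mu + sigma * eps with eps ~ N(0, 1), so z depends
# smoothly on mu and the derivative can be taken inside the expectation.
eps = rng.standard_normal(100_000)
z = mu + sigma * eps
grad_mu = np.mean(2 * z)  # pathwise Monte Carlo estimate of d/dmu E[z^2]
```

The estimate `grad_mu` concentrates near the analytic value `2 * mu = 3.0`; the pathwise estimator is applicable exactly because `z**2` is a smooth function of `mu` along every sample path, which is the property the paper's static analysis checks for.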
no code implementations • NeurIPS 2020 • Wonyeol Lee, Hangyeol Yu, Xavier Rival, Hongseok Yang
For these PAP functions, we propose a new notion of derivative, called intensional derivatives, and prove that these derivatives always exist and coincide with standard derivatives for almost all inputs.
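A toy illustration of the idea, under the assumption that ReLU serves as the PAP function (the function and its piece decomposition here are my own example, not the paper's): a PAP function is covered by analytic pieces, and an intensional derivative differentiates the piece that covers each input, so it is defined even at boundary points where the standard derivative is not.

```python
def relu(x):
    # PAP function with two analytic pieces: x <= 0 maps to 0, x > 0 maps to x.
    return x if x > 0.0 else 0.0

def relu_intensional_deriv(x):
    # Intensional derivative: differentiate the analytic piece covering x.
    # At the boundary x == 0, the piece for x <= 0 is (arbitrarily) chosen,
    # giving derivative 0 there even though the standard derivative
    # does not exist at 0.
    return 1.0 if x > 0.0 else 0.0
```

Away from the boundary (i.e., for all x != 0, a full-measure set) this coincides with the standard derivative, matching the "almost all inputs" guarantee stated above.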
1 code implementation • 20 Jul 2019 • Wonyeol Lee, Hangyeol Yu, Xavier Rival, Hongseok Yang
In this paper, we analyse one of the most fundamental and versatile variational inference algorithms, called the score estimator, using tools from denotational semantics and program analysis.
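The score estimator (also known as REINFORCE) uses the identity d/d&theta; E_{z~p_&theta;}[f(z)] = E_{z~p_&theta;}[f(z) · d/d&theta; log p_&theta;(z)], requiring no smoothness of f. A minimal sketch with an illustrative objective of my choosing (f(z) = z^2 under N(mu, 1), whose true gradient in mu is 2·mu); none of this setup comes from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 1.5

# Score estimator for d/dmu E_{z ~ N(mu, 1)}[f(z)]:
#   E[f(z) * d/dmu log N(z; mu, 1)] = E[f(z) * (z - mu)]
# Unlike the pathwise estimator, this only needs the score of the density,
# not differentiability of f along sample paths.
z = rng.normal(mu, 1.0, size=1_000_000)
grad_mu = np.mean(z**2 * (z - mu))  # estimates 2*mu for f(z) = z**2
```

The many samples are needed because the score estimator typically has higher variance than the pathwise one; its unbiasedness for arbitrary (even discontinuous) f is what makes it so versatile.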