no code implementations • 26 Jan 2024 • Yeachan Park, Geonho Hwang, Wonyeol Lee, Sejun Park
In this work, we analyze the expressive power of neural networks under a more realistic setup: when we use floating-point numbers and operations.
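The gap between real-valued and floating-point networks can be seen in a minimal sketch (hypothetical one-neuron example, not from the paper): a function that is the identity over the reals fails to be the identity over floats because of rounding.

```python
# Over the reals, f(x) = (x + c) - c is the identity for every c.
# Over IEEE double-precision floats, rounding destroys this identity:
# adding a small x to a huge c loses x entirely.

def f(x, c=1e17):
    # spacing between adjacent doubles near 1e17 is 16, so any |x| < 8
    # is absorbed by the addition and the subtraction returns 0.0
    return (x + c) - c

print(f(1.0))  # 0.0, although the real-valued f(1.0) = 1.0
```

This is the kind of discrepancy that makes expressivity results over the reals fail to transfer directly to floating-point networks.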
no code implementations • 31 Jan 2023 • Wonyeol Lee, Rahul Sharma, Alex Aiken
Hence, it is important to use a precision assignment -- a mapping from all tensors (arising in training) to precision levels (high or low) -- that keeps most of the tensors in low precision and leads to sufficiently accurate models.
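A minimal sketch of what such a precision assignment could look like (hypothetical tensor names; low precision simulated by rounding through IEEE half precision with the stdlib `struct` module):

```python
import struct

def to_low(x):
    # round a Python float (fp64) to IEEE fp16 and back: simulated low precision
    return struct.unpack('<e', struct.pack('<e', x))[0]

# hypothetical precision assignment: tensor name -> precision level
assignment = {"weights": "low", "activations": "low", "loss": "high"}

def cast(name, x):
    return to_low(x) if assignment[name] == "low" else x

print(cast("weights", 0.1))  # 0.0999755859375: rounded to fp16
print(cast("loss", 0.1))     # 0.1: kept in high precision
```

Keeping most tensors in the "low" bucket saves memory and time; the assignment's job is to choose which few tensors must stay "high" for training to remain accurate.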
no code implementations • 31 Jan 2023 • Wonyeol Lee, Sejun Park, Alex Aiken
For a neural network with bias parameters, we first prove that the incorrect set is always empty.
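The phenomenon behind the "incorrect set" can be illustrated with a classic toy example (not from the paper): a ReLU expression that computes the identity function, on which automatic differentiation returns a wrong derivative at the kink.

```python
def relu(x):
    return x if x > 0 else 0.0

def drelu(x):
    # AD's conventional derivative of relu, which uses 0 at the kink x = 0
    return 1.0 if x > 0 else 0.0

# f(x) = relu(x) - relu(-x) equals x everywhere, so its true derivative is 1.
def f(x):
    return relu(x) - relu(-x)

def ad_df(x):
    # what AD computes by the chain rule
    return drelu(x) - drelu(-x) * (-1.0)

print(ad_df(1.0))  # 1.0: correct away from the kink
print(ad_df(0.0))  # 0.0: incorrect at x = 0, where the true derivative is 1
```

The inputs where AD disagrees with the true derivative form the incorrect set; the result above says that, with bias parameters present, this set is empty over parameter space.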
1 code implementation • 22 Aug 2022 • Wonyeol Lee, Xavier Rival, Hongseok Yang
We present a static analysis for discovering differentiable or, more generally, smooth parts of a given probabilistic program, and show how the analysis can be used to improve the pathwise gradient estimator, one of the most popular methods for posterior inference and model learning.
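A minimal sketch of the pathwise gradient estimator on a toy objective (illustrative only, not the paper's analysis): to estimate d/dθ E[z²] for z ~ N(θ, 1), reparameterize z = θ + ε with ε ~ N(0, 1) and differentiate inside the expectation.

```python
import random

# Pathwise estimator for d/dtheta E_{z ~ N(theta,1)}[z^2].
# With z = theta + eps, d/dtheta (theta + eps)^2 = 2*(theta + eps),
# whose expectation is the true gradient 2*theta.

def pathwise_grad(theta, n=100_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        eps = rng.gauss(0.0, 1.0)
        total += 2.0 * (theta + eps)
    return total / n

print(pathwise_grad(1.5))  # close to the true gradient 2 * 1.5 = 3.0
```

The estimator requires the integrand to be differentiable along the sampled path, which is why identifying the smooth parts of a program matters.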
no code implementations • NeurIPS 2020 • Wonyeol Lee, Hangyeol Yu, Xavier Rival, Hongseok Yang
For these PAP functions, we propose a new notion of derivative, called the intensional derivative, and prove that intensional derivatives always exist and coincide with standard derivatives for almost all inputs.
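The "almost all inputs" claim can be illustrated on the PAP function |x| (a toy sketch, not the paper's formal construction): picking the derivative of one analytic piece at every point yields a total function that agrees with the true derivative away from a measure-zero set.

```python
import random

# abs is piecewise analytic: -x on (-inf, 0] and x on [0, inf).
# An intensional-style derivative commits to one piece at each point,
# e.g. taking the left piece's derivative -1 at the kink x = 0.

def dabs(x):
    return 1.0 if x > 0 else -1.0  # total: defined even at x = 0

def true_dabs(x):
    return 1.0 if x > 0 else (-1.0 if x < 0 else None)  # undefined at 0

rng = random.Random(0)
samples = [rng.uniform(-1, 1) for _ in range(1000)]
# with probability 1, random samples avoid {0}, where the two agree
print(all(dabs(x) == true_dabs(x) for x in samples))  # True
```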
no code implementations • 22 Nov 2019 • Hyoungjin Lim, Gwonsoo Che, Wonyeol Lee, Hongseok Yang
We present an algorithm for marginalising changepoints in time-series models that assume a fixed number of unknown changepoints.
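The idea of marginalising a changepoint can be sketched on a toy model (hypothetical Gaussian-mean model with a single changepoint, not the paper's algorithm): sum the likelihood over every possible changepoint position, weighted by a uniform prior.

```python
import math

# Toy model: x_1..x_k ~ N(mu1, 1) and x_{k+1}..x_T ~ N(mu2, 1),
# with a uniform prior over the unknown changepoint k in {1, ..., T-1}.
# The marginal likelihood sums over all k (in log space for stability).

def log_norm(x, mu):
    return -0.5 * math.log(2 * math.pi) - 0.5 * (x - mu) ** 2

def log_marginal(xs, mu1, mu2):
    T = len(xs)
    log_terms = []
    for k in range(1, T):  # changepoint between positions k and k+1
        ll = sum(log_norm(x, mu1) for x in xs[:k])
        ll += sum(log_norm(x, mu2) for x in xs[k:])
        log_terms.append(ll - math.log(T - 1))  # uniform prior 1/(T-1)
    m = max(log_terms)  # log-sum-exp
    return m + math.log(sum(math.exp(t - m) for t in log_terms))

xs = [0.1, -0.2, 0.0, 4.9, 5.2, 5.1]
print(log_marginal(xs, 0.0, 5.0))
```

Marginalisation removes the discrete changepoint variable from the model, which is what makes downstream gradient-based inference applicable.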
1 code implementation • 20 Jul 2019 • Wonyeol Lee, Hangyeol Yu, Xavier Rival, Hongseok Yang
In this paper, we analyse one of the most fundamental and versatile variational inference algorithms, called the score estimator, using tools from denotational semantics and program analysis.
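A minimal sketch of the score estimator on a toy objective (illustrative only, not the paper's semantic analysis): it estimates d/dθ E[f(z)] as E[f(z) · d/dθ log p(z; θ)], requiring no derivative of f itself.

```python
import random

# Score-function (REINFORCE) estimator for d/dtheta E_{z ~ N(theta,1)}[z^2].
# For the Gaussian, the score is d/dtheta log N(z; theta, 1) = z - theta,
# so each sample contributes z^2 * (z - theta); the true gradient is 2*theta.

def score_grad(theta, n=200_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(theta, 1.0)
        total += (z ** 2) * (z - theta)  # f(z) * score
    return total / n

print(score_grad(1.5))  # roughly 2 * 1.5 = 3.0, but with high variance
```

Because only log p is differentiated, the estimator applies even when f is non-differentiable, which is part of what makes it so versatile.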
1 code implementation • NeurIPS 2018 • Wonyeol Lee, Hangyeol Yu, Hongseok Yang
We tackle the challenge by generalizing the reparameterization trick, one of the most effective techniques for addressing the variance issue for differentiable models, so that the trick works for non-differentiable models as well.
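The variance issue the reparameterization trick addresses can be seen in a toy comparison (illustrative sketch for a differentiable model, not the paper's generalization): per-sample gradient terms of the score estimator versus the reparameterized estimator for the same objective.

```python
import random

# For z ~ N(theta, 1) and f(z) = z^2, compare per-sample gradient terms:
# score estimator:      z^2 * (z - theta)
# reparameterized:      2 * (theta + eps), using z = theta + eps
# Both are unbiased for the true gradient 2*theta; their variances differ.

def per_sample_grads(theta, n=50_000, seed=0):
    rng = random.Random(seed)
    score, reparam = [], []
    for _ in range(n):
        eps = rng.gauss(0.0, 1.0)
        z = theta + eps
        score.append((z ** 2) * (z - theta))
        reparam.append(2.0 * z)
    return score, reparam

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

s, r = per_sample_grads(1.5)
print(var(s) > var(r))  # True: reparameterization has much lower variance
```

The trick as stated needs a differentiable f; the contribution here is extending it so non-differentiable models also benefit from the variance reduction.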