no code implementations • 14 Feb 2024 • Oliver Broadrick, Honghua Zhang, Guy Van Den Broeck
Probabilistic circuits compute multilinear polynomials that represent multivariate probability distributions.
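The claim above can be illustrated with a minimal sketch: a tiny probabilistic circuit built from Bernoulli leaves, product nodes, and a weighted sum node. Evaluated bottom-up, such a circuit is a multilinear polynomial in the leaf probabilities, and with normalized weights it defines a valid distribution. All node names here are illustrative, not from the paper.

```python
import math

def bernoulli(var, p):
    """Leaf node: returns P(var = value) for a binary variable."""
    return lambda xs: p if xs[var] == 1 else 1.0 - p

def product(*kids):
    """Product node over disjoint variable scopes (decomposability)."""
    return lambda xs: math.prod(k(xs) for k in kids)

def mixture(weights, kids):
    """Sum node: convex combination of children over the same scope."""
    return lambda xs: sum(w * k(xs) for w, k in zip(weights, kids))

# A two-component circuit over binary X1, X2:
# 0.3 * P(X1; 0.9) P(X2; 0.2) + 0.7 * P(X1; 0.4) P(X2; 0.8)
pc = mixture([0.3, 0.7], [
    product(bernoulli("X1", 0.9), bernoulli("X2", 0.2)),
    product(bernoulli("X1", 0.4), bernoulli("X2", 0.8)),
])

# Because every node is smooth and decomposable, the circuit's
# probabilities sum to 1 over all assignments.
total = sum(pc({"X1": a, "X2": b}) for a in (0, 1) for b in (0, 1))
print(round(total, 10))  # 1.0
```

Each evaluation multiplies and adds leaf values but never squares them, which is exactly what makes the computed polynomial multilinear.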
1 code implementation • 15 Apr 2023 • Honghua Zhang, Meihua Dang, Nanyun Peng, Guy Van Den Broeck
To overcome this challenge, we propose to use tractable probabilistic models (TPMs) to impose lexical constraints in autoregressive text generation models, which we refer to as GeLaTo (Generating Language with Tractable Constraints).
1 code implementation • 27 Feb 2023 • Nikil Roashan Selvam, Honghua Zhang, Guy Van Den Broeck
We show that it is possible to parameterize this Mixture of All Trees (MoAT) model compactly (using a polynomial-size representation) in a way that allows for tractable likelihood computation and optimization via stochastic gradient descent.
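A determinant identity of the matrix-tree (Kirchhoff) flavor is what makes summing over a super-exponential number of spanning trees tractable: the sum over all spanning trees of the product of their edge weights equals a cofactor of the weighted graph Laplacian. The sketch below checks this identity on a 3-node complete graph; it illustrates the underlying mathematics, not the paper's actual implementation.

```python
import numpy as np

# Symmetric edge weights on K3 (nodes 0, 1, 2); diagonal is unused.
W = np.array([
    [0.0, 2.0, 3.0],
    [2.0, 0.0, 5.0],
    [3.0, 5.0, 0.0],
])

# Weighted Laplacian: degree matrix minus weight matrix.
L = np.diag(W.sum(axis=1)) - W

# Matrix-tree theorem: any cofactor (here, delete row/col 0) equals the
# weighted count of spanning trees -- a polynomial-size computation.
cofactor = float(np.linalg.det(L[1:, 1:]))

# Brute force for K3: any two edges form a spanning tree.
brute = 2 * 3 + 2 * 5 + 3 * 5
print(round(cofactor, 6), brute)  # 31.0 31
```

For K3 the brute-force sum is cheap, but the determinant scales polynomially while the number of spanning trees of a complete graph grows as n^(n-2).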
no code implementations • 10 Oct 2022 • Anji Liu, Honghua Zhang, Guy Van Den Broeck
We propose to overcome this bottleneck by latent variable distillation: we leverage the less tractable but more expressive deep generative models to provide extra supervision over the latent variables of PCs.

1 code implementation • 23 May 2022 • Honghua Zhang, Liunian Harold Li, Tao Meng, Kai-Wei Chang, Guy Van Den Broeck
Logical reasoning is needed in a wide range of NLP tasks.
1 code implementation • 19 Feb 2021 • Honghua Zhang, Brendan Juba, Guy Van Den Broeck
Generating functions, which are widely used in combinatorics and probability theory, encode function values into the coefficients of a polynomial.
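A small worked example of the encoding described above, assuming nothing beyond standard combinatorics: representing a polynomial as a coefficient list (index = exponent), multiplication of generating functions performs a convolution, so the product's coefficients count combinations.

```python
def poly_mul(a, b):
    """Multiply two polynomials given as coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# Generating function of one fair die: x + x^2 + ... + x^6
die = [0, 1, 1, 1, 1, 1, 1]

# Squaring it encodes, in coefficient k, the number of ways
# two dice sum to k.
two_dice = poly_mul(die, die)
print(two_dice[7])  # 6 ways to roll a total of 7
```

Here the "function values" (counts of outcomes) are read directly off the coefficients, which is the encoding the sentence above refers to.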
no code implementations • 26 Jun 2020 • Honghua Zhang, Steven Holtzen, Guy Van Den Broeck
Central to this effort is the development of tractable probabilistic models (TPMs): models whose structure guarantees efficient probabilistic inference algorithms.