Search Results for author: Guy Van Den Broeck

Found 75 papers, 35 papers with code

Prepacking: A Simple Method for Fast Prefilling and Increased Throughput in Large Language Models

1 code implementation 15 Apr 2024 Siyan Zhao, Daniel Israel, Guy Van Den Broeck, Aditya Grover

In this work, we highlight the following pitfall of prefilling: for batches containing prompts of widely varying lengths, significant computation is wasted by the standard practice of padding all sequences to the maximum length.
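A back-of-the-envelope illustration of this waste (a hedged sketch with made-up prompt lengths, not the paper's Prepacking implementation):

    # Hypothetical batch of prompt lengths; padding processes every prompt at the
    # longest length, so most of the computed tokens carry no content.
    lengths = [12, 87, 640, 5]
    padded_tokens = len(lengths) * max(lengths)   # tokens computed with naive padding
    useful_tokens = sum(lengths)                  # tokens belonging to real prompts
    print(f"wasted compute: {1 - useful_tokens / padded_tokens:.1%}")  # ~70.9% here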

Polynomial Semantics of Tractable Probabilistic Circuits

no code implementations 14 Feb 2024 Oliver Broadrick, Honghua Zhang, Guy Van Den Broeck

Probabilistic circuits compute multilinear polynomials that represent multivariate probability distributions.
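As standard background (not a result specific to this paper): for two binary variables $X_1, X_2$ with indicator variables $\lambda_i, \bar{\lambda}_i$, the network polynomial $\sum_{x_1, x_2 \in \{0,1\}} p(x_1, x_2)\, \lambda_1^{x_1} \bar{\lambda}_1^{1-x_1} \lambda_2^{x_2} \bar{\lambda}_2^{1-x_2}$ is multilinear in the indicators, and a probabilistic circuit is a compact computation graph for such a polynomial.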

Image Inpainting via Tractable Steering of Diffusion Models

no code implementations 28 Nov 2023 Anji Liu, Mathias Niepert, Guy Van Den Broeck

In addition to proposing a new framework for constrained image generation, this paper highlights the benefit of more tractable models and motivates the development of expressive TPMs.

Denoising Image Inpainting

A Unified Approach to Count-Based Weakly-Supervised Learning

1 code implementation 22 Nov 2023 Vinay Shukla, Zhe Zeng, Kareem Ahmed, Guy Van Den Broeck

In many cases, these weak labels dictate the frequency of each respective class over a set of instances.

Weakly-supervised Learning

Expressive Modeling Is Insufficient for Offline RL: A Tractable Inference Perspective

no code implementations 31 Oct 2023 Xuejie Liu, Anji Liu, Guy Van Den Broeck, Yitao Liang

A popular paradigm for offline Reinforcement Learning (RL) tasks is to first fit a sequence model to the offline trajectories, and then prompt the model for actions that lead to high expected return.

Offline RL Reinforcement Learning (RL)

High Dimensional Causal Inference with Variational Backdoor Adjustment

1 code implementation 9 Oct 2023 Daniel Israel, Aditya Grover, Guy Van Den Broeck

For example, in medical settings, backdoor adjustment can be used to control for confounding and estimate the effectiveness of a treatment.

Causal Inference Variational Inference
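For reference, the classical backdoor adjustment that this work scales to high-dimensional settings is $p(y \mid \mathrm{do}(x)) = \sum_{z} p(y \mid x, z)\, p(z)$, with the sum replaced by an integral for continuous confounders $Z$; the paper's variational approach targets this quantity when $Z$ is high dimensional.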

Probabilistically Rewired Message-Passing Neural Networks

1 code implementation 3 Oct 2023 Chendi Qian, Andrei Manolache, Kareem Ahmed, Zhe Zeng, Guy Van Den Broeck, Mathias Niepert, Christopher Morris

Message-passing graph neural networks (MPNNs) have emerged as powerful tools for processing graph-structured input.

Scaling Integer Arithmetic in Probabilistic Programs

no code implementations 25 Jul 2023 William X. Cao, Poorva Garg, Ryan Tjoa, Steven Holtzen, Todd Millstein, Guy Van Den Broeck

Distributions on integers are ubiquitous in probabilistic modeling but remain challenging for many of today's probabilistic programming languages (PPLs).

Probabilistic Programming

Collapsed Inference for Bayesian Deep Learning

1 code implementation NeurIPS 2023 Zhe Zeng, Guy Van Den Broeck

We tackle this challenge by revealing a previously unseen connection between inference on BNNs and volume computation problems.

Tractable Control for Autoregressive Language Generation

1 code implementation 15 Apr 2023 Honghua Zhang, Meihua Dang, Nanyun Peng, Guy Van Den Broeck

To overcome this challenge, we propose to use tractable probabilistic models (TPMs) to impose lexical constraints in autoregressive text generation models, which we refer to as GeLaTo (Generating Language with Tractable Constraints).

Text Generation
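One generic way to read this (a hedged sketch, not necessarily GeLaTo's exact formulation): at each decoding step, the base language model's next-token distribution is reweighted by the TPM's estimate that the lexical constraint $\alpha$ can still be satisfied, i.e., $p(x_{t+1} \mid x_{1:t}, \alpha) \propto p_{\mathrm{LM}}(x_{t+1} \mid x_{1:t})\; p_{\mathrm{TPM}}(\alpha \mid x_{1:t+1})$, which is feasible precisely because the TPM can compute $p(\alpha \mid \cdot)$ efficiently.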

Semantic Strengthening of Neuro-Symbolic Learning

no code implementations 28 Feb 2023 Kareem Ahmed, Kai-Wei Chang, Guy Van Den Broeck

Numerous neuro-symbolic approaches have recently been proposed, typically with the goal of adding symbolic knowledge to the output layer of a neural network.

Mixtures of All Trees

1 code implementation 27 Feb 2023 Nikil Roashan Selvam, Honghua Zhang, Guy Van Den Broeck

We show that it is possible to parameterize this Mixture of All Trees (MoAT) model compactly (using a polynomial-size representation) in a way that allows for tractable likelihood computation and optimization via stochastic gradient descent.

Density Estimation

Understanding the Distillation Process from Deep Generative Models to Tractable Probabilistic Circuits

no code implementations 16 Feb 2023 Xuejie Liu, Anji Liu, Guy Van Den Broeck, Yitao Liang

In this paper, we theoretically and empirically discover that the performance of a PC can exceed that of its teacher model.

Certifying Fairness of Probabilistic Circuits

1 code implementation 5 Dec 2022 Nikil Roashan Selvam, Guy Van Den Broeck, YooJung Choi

In this paper, we propose an algorithm to search for discrimination patterns in a general class of probabilistic models, namely probabilistic circuits.

Decision Making Fairness

Sparse Probabilistic Circuits via Pruning and Growing

1 code implementation 22 Nov 2022 Meihua Dang, Anji Liu, Guy Van Den Broeck

The growing operation increases model capacity by increasing the size of the latent space.

Model Compression

Scaling Up Probabilistic Circuits by Latent Variable Distillation

no code implementations 10 Oct 2022 Anji Liu, Honghua Zhang, Guy Van Den Broeck

We propose to overcome this bottleneck by latent variable distillation: we leverage the less tractable but more expressive deep generative models to provide extra supervision over the latent variables of PCs.

Language Modelling

SIMPLE: A Gradient Estimator for $k$-Subset Sampling

1 code implementation 4 Oct 2022 Kareem Ahmed, Zhe Zeng, Mathias Niepert, Guy Van Den Broeck

$k$-subset sampling is ubiquitous in machine learning, enabling regularization and interpretability through sparsity.
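The distribution usually meant by $k$-subset sampling (generic background; the paper's exact parameterization may differ) puts mass on subsets $S \subseteq \{1, \dots, n\}$ of fixed size $k$ in proportion to per-item weights, $p(S \mid \theta) \propto \prod_{i \in S} \theta_i$ subject to $|S| = k$; SIMPLE addresses estimating gradients through samples from such a distribution.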

Semantic Probabilistic Layers for Neuro-Symbolic Learning

1 code implementation 1 Jun 2022 Kareem Ahmed, Stefano Teso, Kai-Wei Chang, Guy Van Den Broeck, Antonio Vergari

We design a predictive layer for structured-output prediction (SOP) that can be plugged into any neural network, guaranteeing that its predictions are consistent with a set of predefined symbolic constraints.

Hierarchical Multi-label Classification Logical Reasoning

Neuro-Symbolic Entropy Regularization

no code implementations 25 Jan 2022 Kareem Ahmed, Eric Wang, Kai-Wei Chang, Guy Van Den Broeck

We propose a loss, neuro-symbolic entropy regularization, that encourages the model to confidently predict a valid object.

Structured Prediction valid

Lossless Compression with Probabilistic Circuits

1 code implementation ICLR 2022 Anji Liu, Stephan Mandt, Guy Van Den Broeck

To overcome such problems, we establish a new class of tractable lossless compression models that permit efficient encoding and decoding: Probabilistic Circuits (PCs).

Data Compression Image Generation

Solving Marginal MAP Exactly by Probabilistic Circuit Transformations

no code implementations 8 Nov 2021 YooJung Choi, Tal Friedman, Guy Van Den Broeck

Probabilistic circuits (PCs) are a class of tractable probabilistic models that allow efficient, often linear-time, inference of queries such as marginals and most probable explanations (MPE).

Decision Making
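As background, the marginal MAP (MMAP) query the title refers to maximizes over a subset of query variables while summing out the rest: $\mathrm{MMAP}(\mathbf{e}) = \arg\max_{\mathbf{q}} \sum_{\mathbf{h}} p(\mathbf{q}, \mathbf{h}, \mathbf{e})$, which is NP-hard in general even for models where marginals and MPE are tractable.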

flip-hoisting: Exploiting Repeated Parameters in Discrete Probabilistic Programs

no code implementations 19 Oct 2021 Ellie Y. Cheng, Todd Millstein, Guy Van Den Broeck, Steven Holtzen

Many of today's probabilistic programming languages (PPLs) have brittle inference performance: the performance of the underlying inference algorithm is very sensitive to the precise way in which the probabilistic program is written.

Probabilistic Programming

Towards an Interpretable Latent Space in Structured Models for Video Prediction

no code implementations 16 Jul 2021 Rushil Gupta, Vishal Sharma, Yash Jain, Yitao Liang, Guy Van Den Broeck, Parag Singla

We work with models which are object-centric, i.e., that explicitly work with object representations, and propagate a loss in the latent space.

Contrastive Learning Inductive Bias +2

Tractable Regularization of Probabilistic Circuits

no code implementations NeurIPS 2021 Anji Liu, Guy Van Den Broeck

Instead, we re-think regularization for PCs and propose two intuitive techniques, data softening and entropy regularization, that both take advantage of PCs' tractability and still have an efficient implementation as a computation graph.

Density Estimation

A Compositional Atlas of Tractable Circuit Operations for Probabilistic Inference

1 code implementation NeurIPS 2021 Antonio Vergari, YooJung Choi, Anji Liu, Stefano Teso, Guy Van Den Broeck

Circuit representations are becoming the lingua franca to express and reason about tractable generative and discriminative models.

Probabilistic Sufficient Explanations

1 code implementation 21 May 2021 Eric Wang, Pasha Khosravi, Guy Van Den Broeck

Understanding the behavior of learned classifiers is an important task, and various black-box explanations, logical reasoning approaches, and model-specific methods have been proposed.

Logical Reasoning

Tractable Computation of Expected Kernels

1 code implementation 21 Feb 2021 Wenzhe Li, Zhe Zeng, Antonio Vergari, Guy Van Den Broeck

Computing the expectation of kernel functions is a ubiquitous task in machine learning, with applications from classical support vector machines to exploiting kernel embeddings of distributions in probabilistic modeling, statistical inference, causal discovery, and deep learning.

Causal Discovery

Probabilistic Generating Circuits

1 code implementation 19 Feb 2021 Honghua Zhang, Brendan Juba, Guy Van Den Broeck

Generating functions, which are widely used in combinatorics and probability theory, encode function values into the coefficients of a polynomial.

Density Estimation Point Processes
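As standard background (not specific to this paper's circuits): the probability generating polynomial of a distribution over binary variables $X_1, \dots, X_n$ is $g(z_1, \dots, z_n) = \sum_{\mathbf{x}} p(\mathbf{x}) \prod_{i : x_i = 1} z_i$, so the probabilities appear exactly as the coefficients of the monomials; probabilistic generating circuits represent such polynomials as circuits.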

Probabilistic Inference with Algebraic Constraints: Theoretical Limits and Practical Approximations

no code implementations NeurIPS 2020 Zhe Zeng, Paolo Morettin, Fanqi Yan, Antonio Vergari, Guy Van Den Broeck

Weighted model integration (WMI) is a framework to perform advanced probabilistic inference on hybrid domains, i.e., on distributions over mixed continuous-discrete random variables in the presence of complex logical and arithmetic constraints.

On the Tractability of SHAP Explanations

no code implementations 18 Sep 2020 Guy Van den Broeck, Anton Lykov, Maximilian Schleich, Dan Suciu

First, we consider fully-factorized data distributions, and show that the complexity of computing the SHAP explanation is the same as the complexity of computing the expected value of the model.

BIG-bench Machine Learning
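For context, the SHAP explanation of a feature $i$ at an instance is the Shapley value $\sum_{S \subseteq F \setminus \{i\}} \frac{|S|!\,(|F|-|S|-1)!}{|F|!} \bigl(v(S \cup \{i\}) - v(S)\bigr)$, where $v(S)$ is the expected model output with the features in $S$ fixed to their observed values; the paper's results concern the complexity of computing this quantity under different assumptions on the data distribution.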

Strudel: Learning Structured-Decomposable Probabilistic Circuits

1 code implementation 18 Jul 2020 Meihua Dang, Antonio Vergari, Guy Van Den Broeck

Probabilistic circuits (PCs) represent a probability distribution as a computational graph.

Density Estimation

On the Relationship Between Probabilistic Circuits and Determinantal Point Processes

no code implementations 26 Jun 2020 Honghua Zhang, Steven Holtzen, Guy Van Den Broeck

Central to this effort is the development of tractable probabilistic models (TPMs): models whose structure guarantees efficient probabilistic inference algorithms.

Point Processes

On Effective Parallelization of Monte Carlo Tree Search

no code implementations 15 Jun 2020 Anji Liu, Yitao Liang, Ji Liu, Guy Van Den Broeck, Jianshu Chen

Second, and more importantly, we demonstrate how the proposed necessary conditions can be adopted to design more effective parallel MCTS algorithms.

Atari Games

Dice: Compiling Discrete Probabilistic Programs for Scalable Inference

1 code implementation 18 May 2020 Steven Holtzen, Guy Van Den Broeck, Todd Millstein

This reduction separates the structure of the distribution from its parameters, enabling logical reasoning tools to exploit that structure for probabilistic inference.

Programming Languages

Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits

1 code implementation ICML 2020 Robert Peharz, Steven Lang, Antonio Vergari, Karl Stelzner, Alejandro Molina, Martin Trapp, Guy Van Den Broeck, Kristian Kersting, Zoubin Ghahramani

Probabilistic circuits (PCs) are a promising avenue for probabilistic modeling, as they permit a wide range of exact and efficient inference routines.

Scaling up Hybrid Probabilistic Inference with Logical and Arithmetic Constraints via Message Passing

1 code implementation ICML 2020 Zhe Zeng, Paolo Morettin, Fanqi Yan, Antonio Vergari, Guy Van Den Broeck

Weighted model integration (WMI) is a very appealing framework for probabilistic inference: it allows one to express the complex dependencies of real-world problems where variables are both continuous and discrete, via the language of Satisfiability Modulo Theories (SMT), as well as to compute probabilistic queries with complex logical and arithmetic constraints.

Off-Policy Deep Reinforcement Learning with Analogous Disentangled Exploration

1 code implementation 25 Feb 2020 Anji Liu, Yitao Liang, Guy Van Den Broeck

Off-policy reinforcement learning (RL) is concerned with learning a rewarding policy by executing another policy that gathers samples of experience.

Continuous Control reinforcement-learning +1

Symbolic Querying of Vector Spaces: Probabilistic Databases Meets Relational Embeddings

no code implementations 24 Feb 2020 Tal Friedman, Guy Van Den Broeck

We propose unifying techniques from probabilistic databases and relational embedding models with the goal of performing complex queries on incomplete and uncertain data.

SAM: Squeeze-and-Mimic Networks for Conditional Visual Driving Policy Learning

1 code implementation 6 Dec 2019 Albert Zhao, Tong He, Yitao Liang, Haibin Huang, Guy Van Den Broeck, Stefano Soatto

To learn this representation, we train a squeeze network to drive using annotations for the side task as input.

Semantic Segmentation

Towards Hardware-Aware Tractable Learning of Probabilistic Models

1 code implementation NeurIPS 2019 Laura I. Galindez Olascoaga, Wannes Meert, Nimish Shah, Marian Verhelst, Guy Van Den Broeck

We showcase our framework on a mobile activity recognition scenario, and on a variety of benchmark datasets representative of the field of tractable learning and of the applications of interest.

Activity Recognition Edge-computing +1

On Tractable Computation of Expected Predictions

1 code implementation NeurIPS 2019 Pasha Khosravi, YooJung Choi, Yitao Liang, Antonio Vergari, Guy Van Den Broeck

In this paper, we identify a pair of generative and discriminative models that enables tractable computation of expectations, as well as moments of any order, of the latter with respect to the former in case of regression.

Fairness Imputation +1

Hybrid Probabilistic Inference with Logical Constraints: Tractability and Message Passing

no code implementations 20 Sep 2019 Zhe Zeng, Fanqi Yan, Paolo Morettin, Antonio Vergari, Guy Van Den Broeck

Weighted model integration (WMI) is a very appealing framework for probabilistic inference: it allows one to express the complex dependencies of real-world hybrid scenarios, where variables are heterogeneous in nature (both continuous and discrete), via the language of Satisfiability Modulo Theories (SMT), as well as to compute probabilistic queries with arbitrarily complex logical constraints.

Learning Fair Naive Bayes Classifiers by Discovering and Eliminating Discrimination Patterns

1 code implementation 10 Jun 2019 YooJung Choi, Golnoosh Farnadi, Behrouz Babaki, Guy Van Den Broeck

As machine learning is increasingly used to make real-world decisions, recent research efforts aim to define and ensure fairness in algorithmic decision making.

Decision Making Fairness

Smoothing Structured Decomposable Circuits

1 code implementation NeurIPS 2019 Andy Shih, Guy Van Den Broeck, Paul Beame, Antoine Amarilli

Further, for the important case of All-Marginals, we show a more efficient linear-time algorithm.

Density Estimation

Efficient Search-Based Weighted Model Integration

no code implementations 13 Mar 2019 Zhe Zeng, Guy Van Den Broeck

Weighted model integration (WMI) extends Weighted model counting (WMC) to the integration of functions over mixed discrete-continuous domains.

Computational Efficiency Probabilistic Programming
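In generic notation (not necessarily the paper's), the WMI task is $\mathrm{WMI}(\Delta, w) = \sum_{\boldsymbol{\mu}} \int_{\{\mathbf{x} \,:\, (\mathbf{x}, \boldsymbol{\mu}) \models \Delta\}} w(\mathbf{x}, \boldsymbol{\mu})\, \mathrm{d}\mathbf{x}$: a sum over assignments to the Boolean variables and an integral of the weight function over the continuous region where the SMT formula $\Delta$ is satisfied.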

Generating and Sampling Orbits for Lifted Probabilistic Inference

1 code implementation 12 Mar 2019 Steven Holtzen, Todd Millstein, Guy Van Den Broeck

A key goal in the design of probabilistic inference algorithms is identifying and exploiting properties of the distribution that make inference tractable.

What to Expect of Classifiers? Reasoning about Logistic Regression with Missing Features

1 code implementation 5 Mar 2019 Pasha Khosravi, Yitao Liang, YooJung Choi, Guy Van Den Broeck

While discriminative classifiers often yield strong predictive performance, missing feature values at prediction time can still be a challenge.

Imputation regression

On Constrained Open-World Probabilistic Databases

no code implementations 27 Feb 2019 Tal Friedman, Guy Van Den Broeck

Increasing amounts of available data have led to a heightened need for representing large-scale probabilistic knowledge bases.

Learning Logistic Circuits

1 code implementation 27 Feb 2019 Yitao Liang, Guy Van Den Broeck

This paper proposes a new classification model called logistic circuits.

General Classification

Scalable Rule Learning in Probabilistic Knowledge Bases

1 code implementation AKBC 2019 Arcchit Jain, Tal Friedman, Ondrej Kuzelka, Guy Van Den Broeck, Luc De Raedt

In this paper, we present SafeLearner -- a scalable solution to probabilistic KB completion that performs probabilistic rule learning using lifted probabilistic inference -- as a faster alternative to grounding.

Approximate Knowledge Compilation by Online Collapsed Importance Sampling

1 code implementation NeurIPS 2018 Tal Friedman, Guy Van Den Broeck

In particular, when the amount of exact inference is equally limited, collapsed compilation is competitive with the state of the art, and outperforms it on several benchmarks.

On Robust Trimming of Bayesian Network Classifiers

1 code implementation 29 May 2018 YooJung Choi, Guy Van Den Broeck

To this end, we propose a closeness metric between Bayesian classifiers, called the expected classification agreement (ECA).

Classification General Classification

Domain Recursion for Lifted Inference with Existential Quantifiers

no code implementations 24 Jul 2017 Seyed Mehran Kazemi, Angelika Kimmig, Guy Van Den Broeck, David Poole

In this paper, we show that domain recursion can also be applied to models with existential quantifiers.

Probabilistic Program Abstractions

no code implementations 28 May 2017 Steven Holtzen, Todd Millstein, Guy Van Den Broeck

Abstraction is a fundamental tool for reasoning about complex systems.

New Liftable Classes for First-Order Probabilistic Inference

no code implementations NeurIPS 2016 Seyed Mehran Kazemi, Angelika Kimmig, Guy Van Den Broeck, David Poole

Statistical relational models provide compact encodings of probabilistic dependencies in relational domains, but result in highly intractable graphical models.

Tractable Learning for Complex Probability Queries

no code implementations NeurIPS 2015 Jessa Bekker, Jesse Davis, Arthur Choi, Adnan Darwiche, Guy Van Den Broeck

We propose a tractable learner that guarantees efficient inference for a broader class of queries.

Symmetric Weighted First-Order Model Counting

no code implementations 3 Dec 2014 Paul Beame, Guy Van Den Broeck, Eric Gribkoff, Dan Suciu

We prove that, for every fragment FO$^{k}$, $k\geq 2$, the combined complexity of FOMC (or WFOMC) is #P-complete.

Sentence
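As background (generic notation, not the paper's): the symmetric weighted first-order model counting task is $\mathrm{WFOMC}(\Delta, n, w, \bar{w}) = \sum_{\omega \models \Delta} \prod_{P} w(P)^{t_P(\omega)}\, \bar{w}(P)^{f_P(\omega)}$, summing over models $\omega$ of the sentence $\Delta$ over a domain of size $n$, where $t_P(\omega)$ and $f_P(\omega)$ count the true and false ground atoms of predicate $P$; plain FOMC is the special case with all weights equal to 1.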

Lifted Probabilistic Inference for Asymmetric Graphical Models

no code implementations 1 Dec 2014 Guy Van den Broeck, Mathias Niepert

Lifted probabilistic inference algorithms have been successfully applied to a large number of symmetric graphical models.

Efficient Algorithms for Bayesian Network Parameter Learning from Incomplete Data

no code implementations 25 Nov 2014 Guy Van den Broeck, Karthika Mohan, Arthur Choi, Judea Pearl

In contrast to textbook approaches such as EM and the gradient method, our approach is non-iterative, yields closed-form parameter estimates, and eliminates the need for inference in a Bayesian network.

Understanding the Complexity of Lifted Inference and Asymmetric Weighted Model Counting

no code implementations 13 May 2014 Eric Gribkoff, Guy Van Den Broeck, Dan Suciu

In this paper we study lifted inference for the Weighted First-Order Model Counting problem (WFOMC), which counts the assignments that satisfy a given sentence in first-order logic (FOL); it has applications in Statistical Relational Learning (SRL) and Probabilistic Databases (PDB).

Relational Reasoning Sentence

On the Role of Canonicity in Bottom-up Knowledge Compilation

no code implementations 15 Apr 2014 Guy Van den Broeck, Adnan Darwiche

We consider the problem of bottom-up compilation of knowledge bases, which is usually predicated on the existence of a polytime function for combining compilations using Boolean operators (often called an Apply function).

Open-Ended Question Answering

Tractability through Exchangeability: A New Perspective on Efficient Probabilistic Inference

no code implementations 7 Jan 2014 Mathias Niepert, Guy Van Den Broeck

We develop a theory of finite exchangeability and its relation to tractable probabilistic inference.

Skolemization for Weighted First-Order Model Counting

no code implementations 19 Dec 2013 Guy Van den Broeck, Wannes Meert, Adnan Darwiche

First-order model counting emerged recently as a novel reasoning task, at the core of efficient algorithms for probabilistic logics.

On the Complexity and Approximation of Binary Evidence in Lifted Inference

no code implementations NeurIPS 2013 Guy Van den Broeck, Adnan Darwiche

Recent theoretical results show, for example, that conditioning on evidence which corresponds to binary relations is #P-hard, suggesting that no lifting is to be expected in the worst case.

Inference and learning in probabilistic logic programs using weighted Boolean formulas

no code implementations 25 Apr 2013 Daan Fierens, Guy Van Den Broeck, Joris Renkens, Dimitar Shterionov, Bernd Gutmann, Ingo Thon, Gerda Janssens, Luc De Raedt

This paper investigates how classical inference and learning tasks known from the graphical model community can be tackled for probabilistic logic programs.
