Search Results for author: John Winn

Found 9 papers, 0 papers with code

Learning Direct Optimization for Scene Understanding

no code implementations 18 Dec 2018 Lukasz Romaszko, Christopher K. I. Williams, John Winn

We develop a Learning Direct Optimization (LiDO) method for the refinement of a latent variable model that describes input image x.

Scene Understanding

Alexandria: Unsupervised High-Precision Knowledge Base Construction using a Probabilistic Program

no code implementations AKBC 2019 John Winn, John Guiver, Sam Webster, Yordan Zaykov, Martin Kukla, Dany Fabian

The use of a probabilistic program allows uncertainty in the text to be propagated through to the retrieved facts, which increases accuracy and helps merge facts from multiple sources.

Knowledge Base Construction

Just-In-Time Learning for Fast and Flexible Inference

no code implementations NeurIPS 2014 S. M. Ali Eslami, Daniel Tarlow, Pushmeet Kohli, John Winn

Much of the research in machine learning has centered around the search for inference algorithms that are both general-purpose and efficient.

Consensus Message Passing for Layered Graphical Models

no code implementations 27 Oct 2014 Varun Jampani, S. M. Ali Eslami, Daniel Tarlow, Pushmeet Kohli, John Winn

Generative models provide a powerful framework for probabilistic reasoning.

Decision Jungles: Compact and Rich Models for Classification

no code implementations NeurIPS 2013 Jamie Shotton, Toby Sharp, Pushmeet Kohli, Sebastian Nowozin, John Winn, Antonio Criminisi

Randomized decision trees and forests have a rich history in machine learning and have seen considerable success in application, perhaps particularly so for computer vision.

Classification, General Classification

Learning to Pass Expectation Propagation Messages

no code implementations NeurIPS 2013 Nicolas Heess, Daniel Tarlow, John Winn

Expectation Propagation (EP) is a popular approximate posterior inference algorithm that often provides a fast and accurate alternative to sampling-based methods.
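
As an illustration of the kind of message computation this paper proposes to learn to predict, the sketch below (not taken from the paper) performs a single EP site update by moment-matching a Gaussian against the tilted distribution formed by a Gaussian cavity and a probit factor; the choice of factor and all parameter names are assumptions made for the example.

# Minimal sketch (assumed example, not the paper's method): one EP site update
# for the tilted distribution Phi(x) * N(x; cavity_mean, cavity_var).
import numpy as np
from scipy.stats import norm

def ep_probit_update(cavity_mean, cavity_var):
    """Moment-match a Gaussian to Phi(x) * N(x; cavity_mean, cavity_var) and
    return the matched moments plus the site message in natural parameters."""
    z = cavity_mean / np.sqrt(1.0 + cavity_var)
    ratio = norm.pdf(z) / norm.cdf(z)                      # N(z) / Phi(z)
    new_mean = cavity_mean + cavity_var * ratio / np.sqrt(1.0 + cavity_var)
    new_var = cavity_var - cavity_var**2 * ratio * (z + ratio) / (1.0 + cavity_var)
    # Site message = matched Gaussian divided by the cavity, in natural parameters.
    site_precision = 1.0 / new_var - 1.0 / cavity_var
    site_precision_mean = new_mean / new_var - cavity_mean / cavity_var
    return new_mean, new_var, site_precision, site_precision_mean

print(ep_probit_update(0.0, 1.0))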

Gates

no code implementations NeurIPS 2008 Tom Minka, John Winn

We present general equations for expectation propagation and variational message passing in the presence of gates.
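
For context, a gate encloses a block of factors that is switched on or off by a selector variable. The sketch below is not the paper's EP or VMP equations; it illustrates the gate idea by exact enumeration, where a binary selector chooses which of two Gaussian likelihood factors applies to the observed data and its posterior is obtained by weighting each branch's evidence. The data and component parameters are made up for illustration.

# Minimal sketch (assumed example): a selector variable c gates two blocks of
# Gaussian likelihood factors; we compute the exact posterior over c by
# evaluating the evidence of each switched-on branch.
import numpy as np
from scipy.stats import norm

data = np.array([0.9, 1.2, 1.1])           # observations (assumed for illustration)
prior_c = np.array([0.5, 0.5])             # prior over the gate selector
components = [(0.0, 1.0), (1.0, 0.5)]      # (mean, std) of the factor inside each gate

# Evidence of each branch = product over the data of that branch's gated factor.
log_evidence = np.array([
    norm.logpdf(data, loc=mu, scale=sd).sum() for mu, sd in components
])

# Posterior over the selector: weight each branch by its evidence and normalise.
log_post = np.log(prior_c) + log_evidence
posterior_c = np.exp(log_post - log_post.max())
posterior_c /= posterior_c.sum()
print(posterior_c)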
