AKBC 2021 • John Winn, Matteo Venanzi, Tom Minka, Ivan Korostelev, John Guiver, Elena Pochernina, Pavel Mishkov, Alex Spengler, Denise Wilkins, Sian Lindley, Richard Banks, Sam Webster, Yordan Zaykov
The knowledge discovery process uses a probabilistic program that defines how each data item is generated from a set of unknown typed entities.
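As an illustration of the idea, here is a minimal sketch of such a generative program. The entity type, property values, templates, and noise model below are all invented for this example and are not from the paper:

```python
import random

random.seed(0)

# Hypothetical sketch: a typed entity is sampled, and observed data
# items (noisy text mentions) are generated from it. Knowledge
# discovery then amounts to inverting this program to infer the entity.
FIRST_NAMES = ["Ada", "Alan", "Grace"]

def sample_entity():
    """Sample an unknown typed entity (here: a Person with a name)."""
    return {"type": "Person", "name": random.choice(FIRST_NAMES)}

def render_item(entity):
    """Generate one observed data item (a noisy text mention)."""
    name = entity["name"]
    if random.random() < 0.1:          # typo noise in the observed text
        name = name.lower()
    template = random.choice(["{} wrote a paper.", "A talk by {}."])
    return template.format(name)

entity = sample_entity()
items = [render_item(entity) for _ in range(3)]
print(entity["name"], items)
```

Running the program forward produces data; conditioning on observed data and inferring the latent entities is the knowledge discovery step.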
18 Dec 2018 • Lukasz Romaszko, Christopher K. I. Williams, John Winn
We develop a Learning Direct Optimization (LiDO) method for refining a latent-variable model that describes an input image x.
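A toy sketch of the flavor of learned refinement, with everything (the 1-D "renderer", the scalar latent, the linear update predictor) invented for illustration: instead of gradient steps, a regressor is trained to map the residual between observed and rendered data directly to a latent update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (invented for this sketch): "images" are 1-D renders of a
# scalar latent z, and a linear regressor learns to map the render
# residual directly to a latent correction.
xs = np.linspace(-1, 1, 32)

def render(z):
    return np.exp(-(xs - z) ** 2 / 0.1)   # blob centered at z

# Training pairs: (residual image -> latent correction).
Z_true = rng.uniform(-0.5, 0.5, 200)
Z_init = Z_true + rng.normal(0, 0.1, 200)
residuals = np.stack([render(zt) - render(zi)
                      for zt, zi in zip(Z_true, Z_init)])
deltas = Z_true - Z_init
W, *_ = np.linalg.lstsq(residuals, deltas, rcond=None)

# Refinement: predict the correction from the observed residual.
z_true, z0 = 0.3, 0.1
z1 = z0 + (render(z_true) - render(z0)) @ W
print(abs(z_true - z0), abs(z_true - z1))
```

The learned predictor plays the role of an optimizer: one cheap forward pass replaces an iterative search over the latent.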
AKBC 2019 • John Winn, John Guiver, Sam Webster, Yordan Zaykov, Martin Kukla, Dany Fabian
The use of a probabilistic program allows uncertainty in the text to be propagated through to the retrieved facts, which increases accuracy and helps merge facts from multiple sources.
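To make the fact-merging step concrete, here is a small hedged sketch (candidate values and source distributions invented): each source reports a distribution over candidate values for the same fact, and, assuming the sources are conditionally independent given the true value, merging is a pointwise product of distributions, renormalized.

```python
import numpy as np

# Invented example: two sources extract a birth year with different
# confidence. Uncertainty from the text survives as a distribution
# rather than a single hard value.
candidates = ["1912", "1921", "1931"]
source_a = np.array([0.7, 0.2, 0.1])   # fairly confident extraction
source_b = np.array([0.5, 0.4, 0.1])   # noisier source, same leading value

posterior = source_a * source_b        # independence assumption
posterior /= posterior.sum()
best = candidates[int(np.argmax(posterior))]
print(best, posterior.round(3))
```

Note how agreement between two uncertain sources yields a merged belief more confident than either source alone.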
7 Nov 2016 • Liwen Zhang, John Winn, Ryota Tomioka
We propose the Gaussian attention model for content-based neural memory access.
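A simplified sketch of the scoring idea (slot keys, values, and the precision factor below are invented, and details of the actual model are omitted): each memory slot has a key mean, and a query scores slots with an unnormalized Gaussian, so nearby slots receive graded weight.

```python
import numpy as np

# Minimal sketch of Gaussian attention over a neural memory: slot i has
# key mean mu_i, and a query q scores it by exp(-0.5 * ||A (q - mu_i)||^2),
# where A is a learned factor controlling the spread of the attention.
keys = np.array([[0.0, 0.0], [1.0, 1.0], [4.0, 4.0]])   # mu_i per slot
values = np.array([[1.0], [2.0], [3.0]])
A = np.eye(2)                                            # placeholder "learned" factor

def gaussian_attention(query):
    diffs = (query - keys) @ A.T
    scores = np.exp(-0.5 * np.sum(diffs ** 2, axis=1))
    weights = scores / scores.sum()
    return weights, weights @ values

weights, read = gaussian_attention(np.array([0.5, 0.5]))
print(weights.round(3), read)
```

A query equidistant from the first two keys splits its weight between them, while the distant third slot is effectively ignored.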
NeurIPS 2014 • S. M. Ali Eslami, Daniel Tarlow, Pushmeet Kohli, John Winn
Much research in machine learning has centered on the search for inference algorithms that are both general-purpose and efficient.
27 Oct 2014 • Varun Jampani, S. M. Ali Eslami, Daniel Tarlow, Pushmeet Kohli, John Winn
Generative models provide a powerful framework for probabilistic reasoning.
NeurIPS 2013 • Jamie Shotton, Toby Sharp, Pushmeet Kohli, Sebastian Nowozin, John Winn, Antonio Criminisi
Randomized decision trees and forests have a rich history in machine learning and have seen considerable practical success, particularly in computer vision.
NeurIPS 2013 • Nicolas Heess, Daniel Tarlow, John Winn
Expectation Propagation (EP) is a popular approximate posterior inference algorithm that often provides a fast and accurate alternative to sampling-based methods.
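The core EP operation can be sketched numerically (the probit likelihood and unit-Gaussian cavity here are chosen purely for illustration): combine the Gaussian cavity with one non-Gaussian factor and project the resulting "tilted" distribution back to a Gaussian by matching its mean and variance.

```python
import numpy as np
from math import erf

# Sketch of one EP projection step: cavity N(0,1) times a probit factor
# Phi(x), with the tilted distribution's moments computed on a grid.
probit = np.vectorize(lambda t: 0.5 * (1.0 + erf(t / 2 ** 0.5)))

grid = np.linspace(-8, 8, 4001)
dx = grid[1] - grid[0]
cavity = np.exp(-0.5 * grid ** 2)   # N(0, 1) cavity, unnormalized
tilted = cavity * probit(grid)      # include the non-Gaussian factor

Z = tilted.sum() * dx
mean = (grid * tilted).sum() * dx / Z
var = ((grid - mean) ** 2 * tilted).sum() * dx / Z
print(mean, var)   # moments of the Gaussian projection
```

For this case the moments are known in closed form (mean ≈ 0.564, variance ≈ 0.682), so the numerical projection can be checked directly; a full EP loop repeats this projection for each factor until the site approximations converge.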
NeurIPS 2008 • Tom Minka, John Winn
We present general equations for expectation propagation and variational message passing in the presence of gates.
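One ingredient of gated message passing can be sketched simply (the branch messages and gate probability below are invented numbers): a gate selects between model branches, each branch sends a Gaussian message about a variable, and the combined outgoing message is the gate-probability-weighted mixture collapsed back to a single Gaussian by moment matching, as EP requires.

```python
import numpy as np

# Invented example: a gate over two branches, each sending a Gaussian
# message about x. The outgoing EP message is the mixture's moment-
# matched Gaussian.
p_gate = np.array([0.7, 0.3])          # P(selector = branch k)
means = np.array([0.0, 4.0])           # branch message means
variances = np.array([1.0, 0.5])       # branch message variances

mix_mean = p_gate @ means
mix_var = p_gate @ (variances + means ** 2) - mix_mean ** 2
print(mix_mean, mix_var)
```

The collapsed variance exceeds either branch's variance because it also carries the uncertainty about which branch the gate selected.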